AI expert: We’ve got to stop talking about Skynet and Terminator

12 Apr 2016

According to a leading AI expert, a future in which AI reaches its full potential could be scuppered if those predicting an apocalyptic outcome continue to create an aura of fear around the technology.

AI has found its way into many facets of our daily lives, from the personal assistants in our smartphones to the autonomous cars that will one day take to the road.

But there is no denying that fear of AI persists, stoked not just by the producers of Hollywood blockbusters set to profit from the idea of robots taking over the world, but also by some of the most respected figures in science and technology.

At the beginning of last year, Elon Musk and Stephen Hawking were two of the most famous figures to sign the Future of Life Institute's (FLI) open letter calling for greater regulation of AI technology over fears of it one day running amok.

Now, according to The Guardian, one of the signatories of that open letter, Chris Bishop, director of Microsoft Research in Cambridge, says this type of talk might actually be detrimental to the technology's prospects.

‘Throwing the baby out with the bathwater’

Bishop said he completely disagreed with the opinions put forward by Musk and Hawking, adding: “The danger I see is if we spend too much of our attention focusing on Terminators and Skynet and the end of humanity – or generally just painting a too negative, emotive and one-sided view of artificial intelligence – we may end up throwing the baby out with the bathwater.”

He went on to say that even the possibility of a form of AI based on our current technology becoming sentient and shedding its human masters is decades away at a minimum and that, for the moment, “we are in control”.

Given that he is a signatory of last year's open letter, however, Bishop did add that he believes the technology's development shouldn't go entirely unregulated, citing the familiar phrase that with great power comes great responsibility.

“It is a very powerful technology, potentially one of the most powerful humanity has ever created, with enormous potential to bring societal benefits.

“But any very powerful, very generic technology will carry with it some risks.”

Robot skull image via Shutterstock

Colm Gorey was a senior journalist with Silicon Republic

editorial@siliconrepublic.com