We’ve all seen the films where the machines take over: Terminator, The Matrix, 2001. There’s even the 1998 B-movie classic ‘Dream House’, where a smart home goes rogue with (unintentionally) funny consequences.

But of course we all know that AI is really here to help humanity. As our twenty-first century machine learning systems munch away on all that data from our IoT sensors, they will soon offer us the answers and the actions to our next thought before it has even occurred to us.

The machines taking over is all pure fantasy. Right?

Well, in the last few months, two distinguished and highly respected men have spoken out about the potential risk of AI to the human race.

Stephen Hawking told the BBC:

“The development of full artificial intelligence could spell the end of the human race… Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.”

While Elon Musk warned:

“I think we should be very careful about artificial intelligence. If I had to guess at what our biggest existential threat is, it’s probably that. So we need to be very careful.”

When you look around at the mildly terrifying military robots starting to emerge, the sudden proliferation of drones and the impending self-driving car revolution, it’s no longer such a stretch of the imagination to fast forward to a world where humans are no longer at the top of the food chain.

Both Hawking and Musk have signed this open letter to raise awareness of their belief that AI’s impact on society is likely to grow, and with it the need to ensure it serves man, not the machines.

So is it time to introduce regulatory oversight of AI systems to protect us all? Could Skynet really be a thing, or are you just pleased that your lights come on automatically when you get home? Let us know what you think in the comments below.

More Reading: Research priorities for robust and beneficial artificial intelligence
