Ban Killer Robots Before They Take Over, Stephen Hawking & Elon Musk Say

A global arms race to make artificial-intelligence-based autonomous weapons is almost sure to occur unless nations can ban the development of such weapons, several scientists warn.

Billionaire entrepreneur Elon Musk, physicist Stephen Hawking and other tech luminaries have signed an open letter warning that a global artificial-intelligence (AI) arms race will begin unless the United Nations supports a ban on weapons that humans "have no meaningful control over."

The letter, issued by the Future of Life Institute, is being presented today (July 27) at the International Joint Conference on Artificial Intelligence in Buenos Aires, Argentina.

"The key question for humanity today is whether to start a global AI arms race or to prevent it from starting. If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow," the letter reads, referring to the automatic weapons.

The risks, the signatories say, could be far greater than those posed by nuclear weapons.

Rise of the machines

From self-driving cars to sex bots, more and more of humanity's fundamental tasks are being taken over by robots. The inevitable march of the machines has spurred both utopian and apocalyptic visions of the future. Rogue AI that threatens humanity has featured prominently in science fiction movies such as "The Matrix" and "2001: A Space Odyssey."

But increasingly, these fears aren't playing out only on the silver screen. Artificial-intelligence researchers themselves have voiced concerns about how innovations in the field are being developed. Autonomous AI weapons, such as drones that could seek out and kill people using face-recognition algorithms, could be feasible within a matter of years, the letter's authors argue.

And while drone fighters could limit battlefield casualties, these autonomous bots could also lower the threshold for initiating conflicts in the first place, the letter states.

In addition, such autonomous weapons could conceivably end up in the hands of almost every military power on Earth, because AI-based killing machines wouldn't require costly or hard-to-obtain raw materials. It wouldn't be long before assassins, terrorists and other bad actors could purchase them on the black market and use them for nefarious purposes, the scientists wrote.

"Autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group. We therefore believe that a military AI arms race would not be beneficial for humanity," the letter states. 

This isn't the first time science and tech luminaries have warned against the dangers of AI. In 2014, Hawking said that the development of full artificial intelligence could spell the end of the human race. Both Hawking and Musk signed an open letter from the same organization in January warning that AI holds great dangers unless humanity can ensure that AI systems "will do what we want them to."

