Stephen Hawking, Noam Chomsky and thousands of others sign open letter calling for a ban on 'killer robots'

The letter claims that totally autonomous killing machines could become a reality within 'years, not decades'

Doug Bolton
Monday 27 July 2015 17:25 EDT
The letter warns that quadcopters such as this could be used to autonomously attack targets (Joern Haufe/Getty Images)


More than 1,000 robotics experts and artificial intelligence (AI) researchers - including physicist Stephen Hawking, technologist Elon Musk, and philosopher Noam Chomsky - have signed an open letter calling for a ban on "offensive autonomous weapons", or as they are better known, 'killer robots'.

Other signatories include Apple co-founder Steve Wozniak and hundreds of AI and robotics researchers from top-flight universities and laboratories worldwide.

The letter, put together by the Future of Life Institute, a group that works to mitigate "existential risks facing humanity", warns of the danger of starting a "military AI arms race".

These robotic weapons may include armed drones that can search for and kill particular people based on their programming, a step beyond the current generation of drones, which are flown by human operators often thousands of miles from the warzone.

The letter says: "AI technology has reached a point where the deployment of such systems is - practically if not legally - feasible within years, not decades."

It adds that autonomous weapons "have been described as the third revolution in warfare, after gunpowder and nuclear arms".

The Institute says it sees the "great potential [of AI] to benefit humanity in many ways", but argues that developing robotic weapons, which it says would prove useful to terrorists, brutal dictators and those wishing to perpetrate ethnic cleansing, is not among them.

Such weapons do not yet truly exist, but the technology that would enable them is not far away. Opponents, including the letter's signatories, believe that because robotic weapons would remove the risk of death to the attacker's own soldiers, and because the underlying technology will become cheap and ubiquitous in the coming years, they would lower the threshold for going to war, potentially making conflicts more common.

Sentry robots like these are currently in use by South Korea along the North Korean border - but cannot fire their weapons without human input (KIM DONG-JOO/AFP/Getty Images)

Last year, South Korea unveiled armed sentry robots, which are now installed along its border with North Korea. Their cameras and heat sensors allow them to detect and track humans automatically, but the machines still need a human operator to fire their weapons.

The letter also warns that building robotic weapons could damage the public image of AI, provoking a backlash that curtails the genuine benefits its peaceful uses could bring to humanity.

It sounds very futuristic, but this field of technology is advancing at a rapid rate, and opposition to the violent use of AI is already growing.

Physicist Stephen Hawking is one of the more famous signatories to the letter (NIKLAS HALLE'N/AFP/Getty Images)

The Campaign to Stop Killer Robots, a group formed in 2012 by a coalition of NGOs including Human Rights Watch, works to preemptively ban robotic weapons.

They are currently working to get robotic weapons onto the agenda of the Convention on Certain Conventional Weapons in Geneva, a UN-linked body that prohibits or restricts the use of certain conventional weapons such as landmines and blinding laser weapons. Blinding lasers were preemptively banned in 1995, as the Campaign hopes autonomous weapons will be.

The Campaign is trying to get the Convention to set up a group of governmental experts to look into the issue, with the aim of having such weapons banned.

Earlier this year, the UK opposed a ban on killer robots at a UN conference, with a Foreign Office official telling The Guardian that they "do not see the need for a prohibition" of autonomous weapons, adding that the UK is not developing any such weapons.
