UK government backs international development of 'killer robots' at UN
Autonomous weapons can destroy without human intervention.
The UK is opposing an international ban on so-called 'killer robots', lethal autonomous weapons systems (Laws) that can select and destroy targets without human input, at a United Nations conference this week.
Foreign Office and Ministry of Defence experts are in Geneva for a week-long discussion on the use of computing and AI in combat, as the Campaign to Stop Killer Robots, an alliance of scientists and human rights activists, calls for autonomous weapons to be banned.
"At present, we do not see the need for a prohibition on the use of Laws, as international humanitarian law already provides sufficient regulation for this area," a Foreign Office spokesperson told The Guardian.
"The United Kingdom is not developing lethal autonomous weapons systems, and the operation of weapons systems by the UK armed forces will always be under human oversight and control. As an indication of our commitment to this, we are focusing development efforts on remotely piloted systems rather than highly automated systems."
The same cannot be said of existing systems such as Israel's Iron Dome and the US's Phalanx, which respond to threats automatically.
The conference will also consider whether emotionless machines may be advantageous in combat, as they feel neither fear nor hate and have no sense of morality.
It comes at a time of increasing concern over artificial intelligence running away from its human creators, with Google recently patenting robots with personalities that can be imbued with, amongst other things, 'fear and derision'.
"Giving machines the power to decide who lives and dies on the battlefield is an unacceptable application of technology," said the Campaign to Stop Killer Robots. "Human control of any combat robot is essential to ensuring both humanitarian protection and effective legal control."