Google's decision to build AI for Pentagon drones divides company

Use of Silicon Valley search giant's TensorFlow technology in unmanned flights over warzones leaves employees questioning defence role

Joe Sommerlad
Thursday 08 March 2018 07:09 EST

The US military is using artificial intelligence software developed by Google in one of its drone programmes, prompting internal divisions at the company over its involvement in defence work.

The Department of Defense's Project Maven, launched last April, uses the Silicon Valley search giant's TensorFlow AI to analyse the hours of footage shot by unmanned aircraft.

TensorFlow scans the footage for objects of interest and flags them to human analysts for further investigation.
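
Project Maven's actual systems are not public, but the workflow described above, a model scoring each frame and surfacing only high-confidence hits for a human analyst to review, can be sketched in a few lines of TensorFlow. The model choice, confidence threshold and helper names below are illustrative assumptions, not details of the Pentagon programme.

```python
import numpy as np
import tensorflow as tf

# Illustrative only: a stock ImageNet classifier stands in for a real object
# detector. Each frame gets a label and confidence score, and only frames that
# clear the threshold are surfaced for human review.
model = tf.keras.applications.MobileNetV2(weights="imagenet")

def flag_frame(frame_rgb, threshold=0.6):
    """Return (label, score) if the frame's top prediction clears the threshold, else None."""
    img = tf.image.resize(frame_rgb, (224, 224))                     # model's expected input size
    img = tf.keras.applications.mobilenet_v2.preprocess_input(img)   # scale pixels to [-1, 1]
    preds = model.predict(img[tf.newaxis, ...], verbose=0)           # add a batch dimension
    _, label, score = tf.keras.applications.mobilenet_v2.decode_predictions(preds, top=1)[0][0]
    return (label, float(score)) if score >= threshold else None

# Stand-in for decoded drone video: five random 640x480 RGB frames.
frames = np.random.randint(0, 255, size=(5, 480, 640, 3), dtype=np.uint8)
flagged = [(i, hit) for i, frame in enumerate(frames) if (hit := flag_frame(frame)) is not None]
print(flagged)  # frame indices (if any) a human analyst would then inspect
```

A production system would use a detector trained for the relevant object classes and run over full-motion video rather than isolated frames; the point here is only the flag-for-a-human pattern the article describes.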

It has reportedly already been used in the field to survey areas held by Isis in the Middle East, but Google stresses the technology is being deployed for "non-offensive uses only."

Employees have nevertheless raised concerns about the company's role in defence contracting after the project was revealed on an internal mailing list last week, particularly in light of Google's famous founding principle: "Don't be evil."

Many have expressed disquiet internally that software they helped develop is being handed over for surveillance work, according to Gizmodo.

"Military use of machine learning naturally raises valid concerns," Google said in a statement.

"We're actively discussing this important topic internally and with others as we continue to develop policies and safeguards around the development and use of our machine learning technologies."

The company has worked with the US military in the past, and senior executives including Eric Schmidt and Milo Medin have advised the armed forces on cloud and data systems as part of the Defense Innovation Board.

Google also oversaw the development of the BigDog robotic packhorse, built by Boston Dynamics while the robotics firm was owned by Google's parent company, Alphabet, in 2015. Originally conceived in 2005, the stalking quadruped was repurposed to assist the Marine Corps before ultimately being abandoned on the grounds that it was too noisy for stealth combat manoeuvres.

The Pentagon spent $7.4bn (£5.3bn) on AI and data processing tech in 2017, according to The Wall Street Journal, as global warfare becomes ever more remote and tech-centric.
