
Cambridge University launches master’s degree in responsible use of AI

Course is designed for professionals who want to ensure their products do no harm

Sam Russell
Press Association
Sunday 06 December 2020 08:49 EST

Cambridge University is launching a master's degree course in the responsible use of artificial intelligence, which it says will be the UK's first.

The qualification is designed for professionals, such as technology makers, wanting to ensure their products do no harm.

While AI is popularised in science fiction by killer robots – such as The Terminator – it is now in everyday use in forms such as Amazon's Alexa virtual assistant, facial recognition, and Google Maps.

Potential downsides of AI include that it can embed sexism, as shown when a recruiting tool's algorithm automatically downgraded applications from women.

It can also be used for intrusive surveillance, using facial recognition algorithms that decide who is a potential criminal.

The course, led by Cambridge University's Leverhulme Centre for the Future of Intelligence (CFI), aims to help leaders steer society towards a future where AI is used for good, not ill.

Dr Stephen Cave, executive director of the CFI, said: "Everyone is familiar with the idea of AI rising up against us.

"It's been a staple of many celebrated films like Terminator in the 1980s, 2001: A Space Odyssey in the 1960s, and Westworld in the 1970s, and more recently in the popular TV adaptation.

"But there are lots of risks posed by AI that are much more immediate than a robot revolt.

"There have been several examples which have featured prominently in the news, showing how it can be used in ways that exacerbate bias and injustice.

"It's crucial that future leaders are trained to manage these risks so we can make the most of this amazing technology.

"This pioneering new course aims to do just that."

The two-year course, delivered in partnership with Cambridge University's Institute for Continuing Education, is offered on a part-time basis.

The curriculum will span a range of academic areas including philosophy, machine learning, policy, race theory, design, computer science, engineering, and law.

"People are using AI in different ways across every industry, and they are asking themselves, 'How can we do this in a way that broadly benefits society?'" said Dr Cave.

"We have brought together cutting-edge knowledge on the responsible and beneficial use of AI, and want to impart that to the developers, policymakers, businesspeople and others who are making decisions right now about how to use these technologies."

Applications for the master of studies in AI ethics and society open this month, with the first cohort commencing in October 2021.
