
Top universities pledge to incorporate AI use in teaching: ‘Opportunity rather than a threat’

‘The rapid rise of generative AI will mean we need to continually review and re-evaluate our assessment practices’

Eleanor Busby
Monday 03 July 2023 23:24 EDT
The Russell Group hopes to support ethical and responsible use of software like ChatGPT (PA Wire)


Universities will adapt teaching and assessment for students to incorporate the “ethical” use of generative artificial intelligence, leading institutions have said.

The Russell Group, which includes many of the most selective universities in the UK, has published a set of principles to help universities capitalise on the opportunities that artificial intelligence (AI) offers to education.

The statement, backed by the vice-chancellors of the 24 Russell Group universities, hopes to support the ethical and responsible use of software like ChatGPT while ensuring that academic integrity is upheld.

ChatGPT is a form of generative AI that can respond to questions in a human-like manner and understand the context of follow-up queries, much as in human conversation. It can also compose essays on request – sparking fears it could be used by students to complete assignments.

But the Russell Group statement suggests that incorporating generative AI tools into teaching and assessments “has the potential to enhance the student learning experience, improve critical-reasoning skills and prepare students for the real-world applications” of generative AI technologies.

It said: “All staff who support student learning should be empowered to design teaching sessions, materials and assessments that incorporate the creative use of generative AI tools where appropriate.”

The principles have been published after education secretary Gillian Keegan launched a call for evidence last month on how generative AI could be used “in a safe and secure way” in education settings.

It came after the UK’s major exam boards suggested in March that schools should make pupils do some of their coursework “in class under direct supervision” amid cheating fears in the context of AI use.

All of the Russell Group universities have reviewed their academic conduct policies to reflect the emergence of generative AI, and these policies make clear to students when its use is “inappropriate”, according to the statement.

“Ensuring academic integrity and the ethical use of generative AI can also be achieved by cultivating an environment where students can ask questions about specific cases of their use and discuss the associated challenges openly and without fear of penalisation,” it said.

Professor Andrew Brass, head of the School of Health Sciences at the University of Manchester, said: “We know that students are already utilising this technology, so the question for us as educators is how do you best prepare them for this, and what are the skills they need to have to know how to engage with generative AI sensibly?

“From our perspective, it’s clear that this can’t be imposed from the top down, but by working really closely with our students to co-create the guidance we provide.

“If there are restrictions, for example, it’s crucial that it’s clearly explained to students why they are in place, or we will find that people find a way around them.”

He added: “Assessment will also need to evolve – as it has always done in response to new technology and workforce skills needs – to assess problem-solving and critical-reasoning skills over knowledge recall.”

Professor Michael Grove, deputy pro-vice-chancellor (Education Policy & Standards) at the University of Birmingham, said: “Generative AI offers potential beyond a role as an intelligent information retrieval tool, and could be used to support the development of stylistic writing skills or to make learning materials more accessible and inclusive for students from different cultural or linguistic backgrounds.

“The rapid rise of generative AI will mean we need to continually review and re-evaluate our assessment practices, but we should view this as an opportunity rather than a threat.”

Dr Tim Bradshaw, chief executive of the Russell Group, said: “AI breakthroughs are already changing the way we work and it’s crucial students get the new skills they need to build a fulfilling career.

“The transformative opportunity provided by AI is huge and our universities are determined to grasp it. This statement of principles underlines our commitment to doing so in a way that benefits students and staff and protects the integrity of the high-quality education Russell Group universities provide.”
