
Disturbing research warns AI may be ‘Great Filter’ that wipes out human civilisation

Advanced AI could behave as a ‘second intelligent species’ with whom humans eventually share Earth

Vishwam Sankaran
Friday 12 May 2023 02:09 EDT


Advanced artificial intelligence could pose a catastrophic risk to humanity and wipe out entire civilisations, a new study warns.

AI could be the “Great Filter” answer to the Fermi paradox, capable of wiping out intelligent life in the universe before it can make contact with other civilisations, suggested the yet-to-be peer-reviewed study, posted on the arXiv preprint server.

The Fermi Paradox, captured popularly by the phrase “Where is everybody?”, has puzzled scientists for decades. It refers to the disquieting question: if extraterrestrial life is probable in the universe, why have humans not encountered it yet?

Many theories have been proposed, offering different explanations for our apparently solitary presence so far in the cosmos.

Even though probability calculations, such as the popular Drake Equation, suggest there could be a number of intelligent civilisations in the galaxy, there is still a puzzling cosmic silence.

One popular hypothesis – known as the Great Filter – suggests that some event required for the emergence of intelligent life is extremely unlikely, hence the cosmic silence.

A logical equivalent of this theory is that some catastrophic cosmic phenomenon is likely preventing life’s expansion throughout the universe.

“This could be a naturally occurring event, or more disconcertingly, something that intelligent beings do to themselves that leads to their own extinction,” wrote study author Mark Bailey from the National Intelligence University (NIU) in the US.

The new research theorised that AI advancement may be the exact kind of catastrophic risk event that could potentially wipe out entire civilisations.

In the study, Dr Bailey frames the context of the Great Filter within the potential long-term risk of technologies like AI that we don’t fully understand.

“Humans are terrible at intuitively estimating long-term risk,” the NIU scientist said, adding that we do not fully understand AI, yet “it is rapidly infiltrating our lives”.

“Future AI will likely tend toward more generalizable, goal-directed systems with more meaningful control, where the consequences of unintended outcomes will become significantly more severe,” he warned.

Dr Bailey posited what he calls the “second species argument”, which raises the possibility that advanced AI could effectively behave as a “second intelligent species” with whom we would eventually share this planet.

Considering what happened when modern humans and Neanderthals coexisted on Earth, the NIU researcher said the “potential outcomes are grim”.

“It stands to reason that an out-of-control technology, especially one that is goal-directed like AI, would be a good candidate for the Great Filter,” Dr Bailey wrote in the study.

“We must ask ourselves: how do we prepare for this possibility?”
