Wikipedia releases artificially intelligent 'X-ray specs' tool to detect fake and damaging edits

Wikipedia could be about to become a lot more accurate

Doug Bolton
Friday 04 December 2015 07:34 EST
Wikipedia's new AI tool could make it more accurate (Peter Macdiarmid/Getty Images)


Wikipedia is an incredibly useful tool, but since anyone can edit and change almost any article, it's often had to deal with accusations of inaccuracy.

These problems could soon be a thing of the past, thanks to a new 'artificially intelligent' tool that can detect bad or malicious edits and alert Wikipedia's human editors so they can take action.

The new tool is called the Objective Revision Evaluation Service (ORES), and Wikipedia says it functions like a pair of 'X-ray specs' - combing through the edits made on the site and detecting which ones are low quality, based on the language used and the context in which the edits are made.

Around 10 edits are made on Wikipedia every second, and even though the site has thousands of volunteer editors, there's no way they can properly keep up with the torrent.

The bot can flag up potentially damaging edits, allowing humans to make the decision on whether to keep them or not.

As Wikipedia says: "By combining open data and open source machine learning algorithms, our goal is to make quality control in Wikipedia more transparent, auditable, and easy to experiment with."
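The article doesn't go into how such a classifier works, but the basic idea of scoring an edit from language features can be sketched. Everything below - the word list, the features, the weighting, the function name - is invented for illustration; the real ORES models are machine-learned classifiers trained on edits labelled by Wikipedia editors, not hand-written rules like these.

```python
# Toy sketch of a 'damaging edit' scorer. The word list and the two
# features below are invented examples, not anything ORES actually uses.
BAD_WORDS = {"idiot", "stupid", "lol", "fake"}

def damage_score(added_text: str) -> float:
    """Return a rough 0-1 'probably damaging' score for the text an edit adds."""
    words = added_text.split()
    if not words:
        return 0.0
    # Feature 1: words from a crude blocklist (case-insensitive).
    hits = sum(1 for w in words if w.lower().strip(".,!?") in BAD_WORDS)
    # Feature 2: shouting - all-caps words longer than two letters.
    shouting = sum(1 for w in words if w.isupper() and len(w) > 2)
    # A real model would combine many features with learned weights;
    # here we just take the fraction of suspicious words, capped at 1.
    return min(1.0, (hits + shouting) / len(words))

# An edit scoring above some threshold would be flagged for human review.
print(damage_score("Added citation for the 1987 merger."))
print(damage_score("THIS ARTICLE IS FAKE lol"))
```

In practice the service returns a probability per edit, and it is still a human editor who decides whether to revert - the score only decides what gets looked at first.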

ORES has been in the testing stages for a few months, and it's already seen success - now, Wikipedia is opening it up to everyone.

Wikipedia already has automated tools to keep track of potentially damaging edits, but they haven't been that smart - they tend to reject all changes made by new editors, making it much harder for newcomers to get involved in Wikipedia's community.

By taking more factors into account, Wikipedia hopes ORES will be a more accurate watchdog and help the site flourish, as it tries to organise all the world's information in one place.

Wikipedia has more than 5 million English articles, an amazing achievement - but it is the inaccurate edits that often make the headlines.

Controversial articles about divisive figures and topics, such as George W Bush, the Catholic Church and global warming, are amongst the most-edited on the site, and are sometimes subject to vandalism and malicious editing.

More recently, one Australian music fan managed to blag his way backstage at a Peking Duk gig after editing the band's Wikipedia page to show that he was one of the bandmates' family members.

ORES could make Wikipedia more accurate, but it might make it harder to sneak into gigs.
