Wikipedia now has an AI enforcer to filter through joke entries

1 Dec 2015

With so many edits made to Wikipedia every day and so much time spent weeding out joke entries, the Wikimedia Foundation, which runs the site, has decided to build its own AI to filter out the junk.

As Siliconrepublic.com highlighted only recently, the number of edits on Wikipedia is somewhat staggering, with an average of 10 made every second and 20,000 articles created each month, so it’s no surprise that the non-profit organisation is struggling to keep up.

In fact, the problem is becoming increasingly difficult as, on the English-language version at least, the number of volunteer editors has fallen by 40pc over the last eight years to just 30,000, according to MIT Technology Review.

This severe decline has been attributed to the harsh treatment newcomers often receive from senior editors, which eventually discourages them from contributing at all.

To compensate for the rapid decline in human editors, Aaron Halfaker, a senior research scientist with the Wikimedia Foundation, is leading the development of an AI called the Objective Revision Evaluation Service, or ORES for short.

The eventual goal is to use machine learning to fine-tune ORES so it can determine whether a bad edit was made simply by accident, or whether the person deliberately entered false information, either as a joke or to damage someone’s reputation.
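As its name suggests, ORES is designed as a scoring service that editing tools can query for a judgement on a given revision. The Python sketch below shows roughly how such a lookup might work; the endpoint URL, the ‘damaging’ model name and the response structure are assumptions for illustration and may not match the live service exactly.

```python
import requests

# Illustrative sketch only: ask an ORES-style scoring service how likely it
# is that a single English Wikipedia revision is damaging. The endpoint and
# response layout below are assumptions, not confirmed details of ORES.
ORES_URL = "https://ores.wikimedia.org/v3/scores/enwiki/"


def score_revision(rev_id: int) -> dict:
    """Return the assumed 'damaging' score for one revision ID."""
    resp = requests.get(ORES_URL, params={"models": "damaging", "revids": rev_id})
    resp.raise_for_status()
    data = resp.json()
    # Drill down to the score object for this revision (assumed structure).
    return data["enwiki"]["scores"][str(rev_id)]["damaging"]["score"]


if __name__ == "__main__":
    score = score_revision(123456)  # placeholder revision ID
    print("Probability the edit is damaging:", score["probability"]["true"])
```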

An example of an edit made to the Iraq War entry of Wikipedia in 2005. Image via STML/Flickr

Helping ease the frustration of editing

Once ORES identifies a particularly glaring error, it will alert a human editor to make the final decision, as well as notify the newcomer about their mistake so they can learn from it.
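As a rough illustration of that workflow, the sketch below applies a simple probability threshold to decide when an edit should be flagged for review. The threshold value and the notification helpers are hypothetical placeholders for this example, not part of ORES.

```python
# Hypothetical triage sketch: route high-scoring edits to a human reviewer
# instead of reverting them automatically. Threshold and helpers are
# illustrative placeholders.

DAMAGING_THRESHOLD = 0.8  # assumed cut-off for a "glaring" edit


def notify_reviewer(rev_id: int, probability: float) -> None:
    print(f"Revision {rev_id} flagged for human review (p={probability:.2f})")


def notify_editor(editor: str, rev_id: int) -> None:
    print(f"Message sent to {editor}: revision {rev_id} looks problematic")


def triage_edit(rev_id: int, editor: str, p_damaging: float) -> None:
    """Decide whether an edit needs a human's final decision."""
    if p_damaging >= DAMAGING_THRESHOLD:
        notify_reviewer(rev_id, p_damaging)  # a human still makes the call
        notify_editor(editor, rev_id)        # the newcomer learns from the flag
    # Edits below the threshold are left alone rather than reverted on sight.


triage_edit(123456, "NewEditor42", p_damaging=0.93)  # example values only
```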

So far, ORES has been optimised for use in English, Portuguese, Turkish, and Farsi, drawing on previous edits made in these languages on Wikipedia to give the AI algorithm an indication of the type of challenges it can expect to face.
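The sketch below gives a rough sense of how a per-language model might be trained on previously labelled edits. The features, toy data and scikit-learn classifier shown are illustrative assumptions rather than ORES’s actual pipeline.

```python
from sklearn.ensemble import GradientBoostingClassifier

# Hypothetical training sketch: each past edit is reduced to a few numeric
# features plus a human-supplied label (1 = damaging, 0 = fine). The feature
# set and values below are toy examples, not real Wikipedia data.
X = [
    # [chars added, chars removed, anonymous editor (1/0), profanity count]
    [12,    0, 0, 0],
    [400, 380, 1, 3],
    [55,   10, 0, 0],
    [5,   900, 1, 0],
]
y = [0, 1, 0, 1]  # labels drawn from past human review decisions

model = GradientBoostingClassifier().fit(X, y)

# Score a new, unseen edit: estimated probability that it is damaging.
new_edit = [[250, 240, 1, 2]]
print(model.predict_proba(new_edit)[0][1])
```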

“I suspect the aggressive behaviour of Wikipedians doing quality control is because they’re making judgements really fast and they’re not encouraged to have a human interaction with the person,” Halfaker said. “This enables a tool to say, ‘If you’re going to revert this, maybe you should be careful and send the person who made the edit a message.’”

ORES will now be rolled out gradually across the site, with editors able to opt in to work with the AI program if they so choose.

“In some ways it’s weird to introduce AI and machine learning to a massive social thing, but I don’t see what we’re doing as any different to making other software changes to the site,” Halfaker continued. “Every change we make affects behaviour.”

Mini Wikipedia globe. Image: Lane Hartwell/Wikimedia Foundation (CC BY-SA 3.0)

Colm Gorey was a senior journalist with Silicon Republic

editorial@siliconrepublic.com