Wikipedia is a free-access, free-content online encyclopedia supported and hosted by the non-profit Wikimedia Foundation. Anyone who can access the site can edit its content. When a user edits an article, the change is reviewed by the site's editors -- most of them volunteers -- and, if it is legitimate, the change is retained.
The Wikimedia Foundation this week released a service designed to improve the quality of Wikipedia articles. ORES, or the Objective Revision Evaluation Service, uses artificial intelligence (AI) and machine learning to help Wikipedia editors identify damaging edits and assign quality scores to articles more quickly. "Every day, Wikipedia is edited some 500,000 times," Wikimedia said. "Editors, most of them volunteers, have to review all those changes."
ORES allows editors to peer into incoming content to identify potentially damaging edits and quarantine them for further scrutiny. A damaging edit might include the insertion of personal opinion or obscenity into an article. "If you're in the media at all, there's a chance that someone is going to dislike something that you said and is going to try to damage your Wikipedia page," said Rob Enderle, principal analyst at the Enderle Group. "Low-level AI is really good at identifying patterns and taking prescribed action against the patterns it recognizes," he told TechNewsWorld. "Unless you have a ton more people than Wikipedia has, you'd never be able to keep up with the bad edits," Enderle said. With ORES, "Wikipedia can be more trusted and less likely to be used as a tool to harm somebody," he added.
ORES provides Wikipedia editors with a set of tools they can use to sort edits by the probability that they're damaging. "That allows that editor to review the most likely to be damaging edits first," said Wikimedia Senior Research Scientist Aaron Halfaker. "That can reduce the workload of reviewing edits by about 90 percent." ORES predicts the probability that an edit is damaging by drawing on knowledge gained from comparing the before-and-after states of past edits across Wikipedia, and it uses that knowledge to assign a score to a proposed edit. Scores can be retrieved quickly -- in 50 to 100 milliseconds -- and used to flag problematic edits rapidly. About 3 percent to 4 percent of the daily edits to Wikipedia articles are damaging, Wikimedia said.
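As a rough sketch of how a reviewing tool might fetch such a score, ORES exposes a public HTTP endpoint that returns per-revision probabilities. The host and path below follow the ORES v3 API's general shape; the wiki name, model name, and revision IDs are illustrative placeholders, not edits from the article:

```python
# Minimal sketch: build an ORES v3 scores request for the "damaging" model.
# The revision IDs used here are placeholders for illustration only.
from urllib.parse import urlencode

ORES_HOST = "https://ores.wikimedia.org"

def build_scores_url(wiki, rev_ids, model="damaging"):
    """Return an ORES v3 URL that requests scores for the given revisions."""
    query = urlencode({
        "models": model,
        "revids": "|".join(str(r) for r in rev_ids),
    })
    return f"{ORES_HOST}/v3/scores/{wiki}/?{query}"

url = build_scores_url("enwiki", [1234567, 1234568])
print(url)
```

A reviewing tool would issue a GET request against this URL and read the returned probability for each revision; the 50-to-100-millisecond figure quoted above refers to the service's score lookup, not the full network round trip.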
"Our machine learning model is good enough at sorting those edits by the probability that they're damaging that you would have to review 10 percent of incoming edits to know that you caught all of the damaging edits," Halfaker told TechNewsWorld. "Without this tool, you'd have to review all the edits to know you caught all the damaging edits." "One of the reasons we want to reduce the workload around quality control is so that editors can spend more of their time working on new article content rather than removing vandalism," Halfaker said.