Meta made a fact-checking AI to help verify Wikipedia citations

In 2020, the Wikipedia community was engulfed in scandal when it came out that a US teen had written 27,000 entries in a language they didn’t speak. The episode was a reminder that the online encyclopedia is not a perfect source of information. Sometimes people edit Wikipedia entries out of malice, but more often factual errors come from well-intentioned contributors making honest mistakes.

That’s a problem the Wikimedia Foundation recently partnered with Facebook parent company Meta to address. The two set their sights on citations. The problem with Wikipedia’s footnotes is that there are simply too many of them for the platform’s volunteer editors to verify. With the website growing by more than 17,000 articles every month, countless citations are incomplete, missing or just plain inaccurate.

Meta developed an AI model that can automatically scan citations at scale to verify their accuracy. It can also suggest alternative citations when it finds a poorly sourced passage. When Wikipedia’s human editors evaluate citations, they rely on common sense and experience. The AI, by contrast, uses a Natural Language Understanding (NLU) transformer model that attempts to understand the relationships between words and phrases within a sentence. Meta’s Sphere database, consisting of more than 134 million web pages, acts as the system’s knowledge index. As it checks the citations in an article, the model is designed to find a single source to verify every claim.
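To make the retrieval idea concrete, here is a minimal sketch of claim verification by retrieval. It substitutes a toy TF-IDF retriever over a three-passage corpus for Meta’s transformer-based dense retrieval over the 134-million-page Sphere index; the corpus, the claim text and all names in the snippet are illustrative stand-ins, not Meta’s actual data or code.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy "knowledge index" standing in for Sphere (hypothetical passages).
corpus = [
    "In 1989 Marvin Camel fought Joe Hipp of the Blackfeet Nation. Hipp became "
    "the first Native American to challenge for the world heavyweight championship.",
    "The Blackfoot Confederacy comprises four nations in Montana and Alberta.",
    "The WBA sanctions professional boxing title fights worldwide.",
]

# The Wikipedia claim whose citation we want to verify or replace.
claim = ("Joe Hipp was the first Native American to compete for the "
         "WBA World Heavyweight title.")

# Build a shared vocabulary over the corpus and the claim, then rank
# passages by cosine similarity to the claim.
vectorizer = TfidfVectorizer().fit(corpus + [claim])
passage_vecs = vectorizer.transform(corpus)
claim_vec = vectorizer.transform([claim])
scores = cosine_similarity(claim_vec, passage_vecs)[0]

# Suggest the best-scoring passage as a candidate citation; a low top
# score would flag the claim as poorly sourced.
best = scores.argmax()
print(f"best passage (score {scores[best]:.2f}): {corpus[best]}")

In this toy setup the Camel–Hipp passage ranks highest, mirroring how the real model surfaces a better citation when the existing one fails to support the claim; the production system replaces the TF-IDF scoring with learned dense representations.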

To illustrate the capabilities of the AI, Meta shared an example of an incomplete citation the model found on the Wikipedia page for the Blackfoot Confederacy. Under the Notable Blackfoot people section, the article mentions Joe Hipp, the first Native American to compete for the WBA World Heavyweight title. The linked website doesn’t mention Hipp or boxing. Searching the Sphere database, the model found a more suitable citation in a 2015 article from the Great Falls Tribune. Here’s the passage the model flagged:

In 1989 at the twilight of his career, [Marvin] Camel fought Joe Hipp of the Blackfeet Nation. Hipp, who became the first Native American to challenge for the world heavyweight championship, said the fight was one of the weirdest of his career.

What’s notable about the above passage is that it doesn’t explicitly mention boxing. Meta’s model found a suitable reference thanks to its natural language capabilities. The tool could one day help with Facebook’s misinformation problems. “More generally, we hope that our work can be used to assist fact-checking efforts and increase the general trustworthiness of information online,” the model’s creators said. In the meantime, Meta hopes to build a platform Wikipedia editors can use to verify and correct footnotes systematically.


Author: I. Bonifacic
Source: Engadget
