
How Wikipedia gets to define what’s true online

The online encyclopedia, once controversial and untrusted, now helps to anchor our shared reality
March 3, 2022

“Before Brexit, the UK sent £350m per week to the EU.” “Ivermectin is an effective cure for Covid-19.” “Donald Trump won the 2020 US presidential election.”

Those three statements are verifiably false. But they’ve gained great currency with constituencies around the world, leading to a form of political engagement where what’s at stake is not the interpretation and implications of facts but the facts themselves. Political dialogue, always difficult, becomes impossible when those involved in the debate can’t agree on a shared reality to argue about.

Trump, who tends to declare any information he dislikes “fake news,” is now further polluting the term “truth.” His new social network—Truth Social—calls the posts its users write online “truths,” inviting some truly disturbing linguistic innovations. Users of his network may soon “retruth” each other, an Orwellian construction that neatly summarises discourse in a world where truth is what we loudly proclaim it to be.

In her brilliant new book Writing the Revolution: Wikipedia and the Survival of Facts in the Digital Age, the South African media scholar Heather Ford sheds light on a key question: who gets to define what’s true online? Ford’s answer is descriptive rather than philosophical. In practical terms, truth is what Google’s knowledge graph—the massive database of facts that allows the powerful search engine to answer most questions—can deliver to its users. Google’s knowledge graph is descended primarily from Wikipedia and Wikidata, an open-source collection of facts derived from Wikipedia, the remarkable participatory encyclopedia that, in the past 20 years, has become a core part of our collective knowledge infrastructure.

From a historical perspective this is unbelievably weird. In 2005, Wikipedia gained notoriety when an anonymous user edited the biography of journalist John Seigenthaler, accusing him of having been a suspect in the assassinations of president John F Kennedy and his brother Robert Kennedy. Seigenthaler, who was a friend and aide to Robert, was understandably incensed by the misrepresentation and took to the op-ed page of USA Today to complain. Wikipedia’s pseudonymity, openness and mutability all became reasons why some teachers, librarians and scholars taught students not to rely on it as a primary resource.

Fast forward to 2018. Faced with controversy about conspiracy theory videos gaining traction on her platform, YouTube CEO Susan Wojcicki outlined a plan to contextualise controversial videos with fact checks from Wikipedia. Rather than take on the fraught and time-consuming work of evaluating whether there exists an Antarctic ice wall at the planet’s “edge,” a claim made by flat Earth theorists, YouTube would link to Wikipedia’s simple, declarative article, which begins: “the flat Earth model is an archaic and scientifically disproven conception of Earth’s shape as a plane or disk.” Wojcicki hadn’t thought to ask Wikipedia about its involvement in the project, and Wikipedia, while conceding that Wojcicki was free to use its content, said it sure would be nice if YouTube supported the millions of users who donate their time and talent to build the encyclopedia. Suitably chastened, YouTube elected to also license fact-checking content from the venerable Encyclopædia Britannica.


In other words, over the course of 13 years, Wikipedia went from being an untrusted and controversial source to being the technology industry’s solution to the problem of warring realities. Ford suggests that most Wikipedians would point to a constellation of practices that, working at scale, have turned Wikipedia into something consistently accurate and believable. Contributions to articles must be supported with verifiable references—original research is not permitted—which means Wikipedia depends heavily on global news organisations to cover contemporary topics. It is organised around the principle of neutral point of view: articles should represent “fairly, proportionately, and, as far as possible, without editorial bias, all the significant views that have been published by reliable sources on a topic.” Neutral point of view is enforced by the fact that dozens or hundreds of editors might work on the same article. An unsupported assertion by one editor is likely to be removed or toned down by a subsequent editor. Somehow, verifiability and neutral point of view work together to gradually produce articles that reflect consensus reality.

Nonsense, argues Ford.

The formation of truth on Wikipedia is as political as it is anywhere else in the world. Her book centres on the creation of a single Wikipedia article about the Tahrir Square protests that ultimately ousted Egyptian president Hosni Mubarak in 2011. By following the editing of this single article, Ford documents the tension between activists who want to recognise and celebrate history in the making and those who argue that “Wikipedia is not a crystal ball” and should be slow and cautious in writing history. Ford’s central character is a Wikipedian using the name “Egyptian Liberal,” who creates an article on the protests before they’ve actually begun in the streets, and who is instrumental in lobbying other users to promote and expand it. The article’s name changes from “The 2011 protests” to “The Egyptian revolution”—a shift, Ford says, that highlights how Wikipedia helps the global press articulate a narrative about ordinary citizens toppling a government.

Wikipedia is a roadmap for co-operation and collaboration at scale. As we mourn the apparent impossibility of keeping YouTube free of flat Earthers or Facebook free from vaccine disinformation, the fact that Wikipedia remains an anchor for consensus reality seems worthy of close study. Too often Wikipedia is treated as an anomaly, an enterprise that somehow works in practice even though it cannot possibly work in theory. When seen as magical, Wikipedia becomes impossible to replicate. When seen as explicable, perhaps it could become something to emulate and reproduce elsewhere.

A new community is advancing a vision for the internet in which everything is for sale and where the collective decisions of markets lead to popularity, if not always to truth. We will talk about the hope and hype of one vision for the future of the internet—Web3—in the coming months. Yet it seems deeply peculiar that a wave of would-be visionaries are so ready to bet on blockchain technologies and the wisdom of markets when we understand so little about a project like Wikipedia, which demonstrates that loosely co-ordinated volunteers can consistently solve one of the hardest problems of our time: finding consensus reality in an increasingly disputed world.