Capitol attack (Credit: Tyler Merbler/Flickr)

On January 6, Jason Moore was working from his home in Portland, Oregon, and flipping between CNN and MSNBC as Donald Trump supporters gathered outside the U.S. Capitol. “Watching what was unfolding in D.C. on cable news, I found it initially fascinating, and then, later, terrifying,” he told me.

Moore, a digital strategist, is one of the top 55 contributors to the English-language version of Wikipedia. The free online encyclopedia has more than six million articles in English and is maintained by more than 100,000 regular volunteer editors like Moore. Around 1:30 p.m. Eastern time, Moore started a new Wikipedia page to document what was then just a protest. He titled it: “January 2021 Donald Trump rally.”

“I have a personal interest just in documenting political movements,” said Moore, who goes by the username Another Believer. He logs onto his Wikipedia watchlist—a feed of the changes that have been made to the pages he wants to track—several times a day, like someone else might log on to Twitter or Facebook. “I’m a bit of a political junkie.”
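A watchlist is, at bottom, a feed of recent revisions to a chosen set of pages. As a rough illustration of the idea (not Moore’s actual setup), the Python sketch below polls the public MediaWiki API for the latest edits to a few example articles; the page titles and the five-revision limit are arbitrary choices, and Wikipedia’s real watchlist additionally requires a logged-in account.

```python
# A watchlist-style feed, approximated with the public MediaWiki API:
# fetch the most recent revisions (editor, timestamp, edit summary)
# for each page we care about. The page titles here are just examples.
import requests

API = "https://en.wikipedia.org/w/api.php"
WATCHED = ["2021 storming of the United States Capitol", "Parler"]

def latest_revisions(title, limit=5):
    """Return the most recent revisions of a single page."""
    params = {
        "action": "query",
        "format": "json",
        "prop": "revisions",
        "titles": title,
        "rvprop": "timestamp|user|comment",
        "rvlimit": limit,  # the revisions module accepts rvlimit for one title at a time
    }
    data = requests.get(API, params=params, timeout=10).json()
    page = next(iter(data["query"]["pages"].values()))
    return page.get("revisions", [])

for title in WATCHED:
    print(f"== {title} ==")
    for rev in latest_revisions(title):
        print(f'{rev["timestamp"]}  {rev["user"]}: {rev.get("comment", "")}')
```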

As the Capitol protest escalated into a violent assault, Moore was tabbing between Google News, the Wikipedia article he had created, and the article’s “talk” page, where volunteer editors could discuss changes with one another. Hundreds more volunteer editors were chiming in. As chronicled by Alex Pasternack in Fast Company, Wikipedians debated the reliability of different sources and the accuracy of terms, and documented the democratic cataclysm in real time. It became, said Moore, “this hurricane of people sifting through a lot of information at once.”

Moore estimates he spent about ten hours editing the page now titled “2021 storming of the United States Capitol” and closely related pages. The entry runs nearly 13,000 words long and has hundreds of external source citations. It has sections on intelligence, or the lack thereof, leading up to the attack; on police preparations; on the participation of state lawmakers; on the House and Senate evacuations; on the completion of the electoral vote count; and more. More than 1,000 volunteer editors worked together on the entry, which is still being updated regularly.

The page is the result of a remarkably collaborative online community of volunteers who edit, verify, and generally obsess over the vast, always-in-motion encyclopedia. Wikipedia is not without faults; it doesn’t take much poking around to find a page with a major error. (Last year, a Reddit user discovered that an American teenager who did not speak Scots, a Scottish dialect, had written almost half of the articles on Scots Wikipedia. The pages were riddled with grammar mistakes.) Wikipedia is also not representative of the public; the vast majority of its volunteer editors are male, and fewer than 20 percent of Wikipedia’s biographies are about women.

But Wikipedia, one of the most visited websites in the U.S., has avoided many of the pitfalls that have hobbled other online platforms. Twitter, Facebook, and YouTube are facing a backlash for their role in propagating misinformation. After Trump’s repeated false claims of election fraud drove his followers to break into the Capitol, all three companies suspended his accounts. It might have been the right call in the moment, but it also raised uncomfortable questions about the outsize power over discourse wielded by a tiny number of executives at private companies. Wikipedia’s bottom-up model, shaped by thousands of volunteer editors, proves that there’s another way to build online communities.

Special volunteer roles help keep the site running. Administrators, experienced editors vetted by the community, can protect pages and block disruptive accounts. An arbitration committee, also made up of vetted, experienced editors, settles the most contentious disputes, and “checkusers,” an elite group of Wikipedia editors, are granted access to technical data to figure out whether several Wikipedia accounts are being operated by one person. These privileged editors help deal with difficult situations, but much of the day-to-day work of editing Wikipedia is handled by regular volunteers making changes, discussing issues, following the suggested dispute resolution process, and, ideally, landing on a consensus. The site even has principles for how editors can best collaborate, dubbed “Wikiquette.”

As protestors at the Capitol turned violent, one major debate among Wikipedia editors was how to describe the event in the page’s title. Was it a protest? A riot? An insurrection? A coup attempt? “There is a clear consensus that protest is inadequate to describe these events,” wrote a Wiki editor with the username Matthias Winkelmann. “Riot is a more appropriate label for the events that took place,” responded a user called Bravetheif. “I oppose ‘protests’ and oppose ‘storming,’ but support ‘2021 United States Capitol Siege’ or ‘2021 United States Capitol Breach,’” wrote another editor calling themselves RobLa. On the morning of January 7, an editor with the username CaptainEek set the page title to “2021 storming of the United States Capitol.”

But the debate roared on, with editors making the case for their preferred terms. Volunteers catalogued which terms different reputable publications had used. Their list of “generally reliable sources” that had used “coup” included the Atlantic, BuzzFeed News, and the Los Angeles Times. The list for “insurrection” included the Associated Press, Axios, and NPR.

This appeal to reputable sources is central to Wikipedia’s approach to content. According to English Wikipedia’s “Verifiability” policy, an editor can be sure something is true, but if it’s not verifiable in a reputable source, it shouldn’t be added to a page. The site maintains a chart of publications categorized by the current consensus view of their reliability. The consensus can and does change. In 2018, for example, Breitbart was “deprecated” by a consensus of editors, meaning it could no longer be cited as a reference for factual matters. A year earlier, editors had made a similar decision about the Daily Mail, a British tabloid.
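As a toy model (not Wikipedia’s actual software; the perennial-sources consensus lives on a community-maintained page, not in code), the sketch below shows how that reliability table could be consulted before a citation is accepted. The only classifications included are the ones mentioned in this article; everything else defaults to “no consensus.”

```python
# A toy model of the community's source-reliability consensus: outlets
# carry a status, and deprecated outlets cannot be cited for factual claims.
# Only sources mentioned in this article are listed; the rest is assumption.
from enum import Enum

class Reliability(Enum):
    GENERALLY_RELIABLE = "generally reliable"
    NO_CONSENSUS = "no consensus / use with caution"
    DEPRECATED = "deprecated"  # may not be cited for factual matters

PERENNIAL_SOURCES = {
    "apnews.com": Reliability.GENERALLY_RELIABLE,
    "breitbart.com": Reliability.DEPRECATED,    # deprecated by editor consensus in 2018
    "dailymail.co.uk": Reliability.DEPRECATED,  # deprecated in 2017
}

def can_cite_for_facts(domain: str) -> bool:
    """Reject citations to outlets the community has deprecated."""
    status = PERENNIAL_SOURCES.get(domain, Reliability.NO_CONSENSUS)
    return status is not Reliability.DEPRECATED

print(can_cite_for_facts("breitbart.com"))  # False
print(can_cite_for_facts("apnews.com"))     # True
```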

The imperative to provide reliable sources is one way Wikipedia editors keep misinformation off contentious pages. When one user proposed an edit suggesting that the Capitol rioters were not really Trump supporters but rather antifa, an editor with the username Anachronist responded by interrogating the sources provided for the proposed edit:

“Let’s examine those sources. A student newspaper (byu.edu) isn’t a reliable source. The Washington Times contradicts your proposal . . . explicitly saying that no Antifa supporters were identified. I could stop right there, but let’s go on: Fox News is not considered a reliable source for political reporting, and the Geller Report is basically a blog, self-published, and therefore not usable.”

The proposed edit never made it through, since administrators had placed the page under protection, meaning less experienced editors could not make changes directly to the page. That’s a common step for entries on contentious topics. By the evening of January 6, the “Storming” page was placed under “extended-confirmed protection,” meaning that for the next two days, only editors whose accounts were at least 30 days old and who had made more than 500 edits could make changes. (After two days, the page was set to a slightly lower level of protection.) “This helped enormously with the level of disruption,” said Molly White, a longtime Wikipedia editor and administrator, in an email.
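For the mechanics, a minimal sketch of the threshold described above (more than 500 edits and an account at least 30 days old) might look like the following. It illustrates the policy’s rule of thumb, not MediaWiki’s actual implementation, and the example accounts are made up.

```python
# A minimal sketch of the extended-confirmed threshold described above:
# an account must be at least 30 days old and have more than 500 edits.
# The Editor records below are hypothetical, for illustration only.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Editor:
    username: str
    registered: datetime
    edit_count: int

def can_edit_extended_confirmed(editor: Editor, now: datetime) -> bool:
    old_enough = now - editor.registered >= timedelta(days=30)
    active_enough = editor.edit_count > 500
    return old_enough and active_enough

now = datetime(2021, 1, 6, 19, 0)  # evening of January 6
newcomer = Editor("BrandNewAccount", registered=datetime(2021, 1, 5), edit_count=12)
veteran = Editor("LongtimeEditor", registered=datetime(2015, 3, 1), edit_count=40_000)

print(can_edit_extended_confirmed(newcomer, now))  # False
print(can_edit_extended_confirmed(veteran, now))   # True
```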

White, a software developer in Cambridge, Massachusetts, who goes by the username GorillaWarfare, made multiple edits to the “Capitol Storming” page. “I was horrified and anxious to watch this all unfold,” she explained, but editing on Wikipedia felt better than doomscrolling. “This is something I do often—if I’m trying to understand what’s happening or learn more about something, I will go edit the Wikipedia article about it as I do.” White primarily edits pages related to right-wing online extremism. She wrote large portions of the Wikipedia pages for Parler and Gab, alternative social media apps popular among Trump supporters and right-wing provocateurs, and contributed significantly to the entry on the Boogaloo movement.

Wikipedia can count on having humans in the loop on content decisions, rather than relying on artificial intelligence, because it is much smaller than YouTube or Facebook in terms of monthly active users, said Brian Keegan, an assistant professor of information science at the University of Colorado Boulder. That’s helpful because content decisions often require understanding context, which algorithms don’t always get right. Humans can also offer more nuanced feedback on why an edit is being reverted or why a page is being taken down.

Of course, Wikipedia doesn’t always get it right either. Less trafficked pages receive attention from fewer editors, which can easily result in significant factual errors. But pages that attract more attention from editors are often of high quality, thanks to a fairly functional system of collaboration and cross-checking. In fact, other social media companies have come to rely on Wikipedia as a source of reliable information. In 2018, YouTube announced it would link to Wikipedia pages alongside its videos about conspiracy theories in an effort to provide users with accurate information. In 2020, Facebook began testing Wikipedia-powered information boxes in its search results.

What Wikipedia illustrates is that the problem with Facebook, Twitter, YouTube, and other social media platforms isn’t that they are social or that they’re populated by user-generated content. It’s their business models. All three companies are for-profit businesses that make their money through micro-targeted advertising, which means they have strong incentives to show users content that will keep them on the platform for as long as possible and keep them coming back. Content that confirms users’ beliefs or stokes their preexisting resentments can be good for business. That only overlaps with the truth some of the time.

As a nonprofit, Wikipedia operates within a fundamentally different set of incentives. It doesn’t rely on advertising revenue and it doesn’t need to drive up user engagement. The Wikipedia community has instead been able to develop norms and policies that prioritize the integrity of the content. “A platform like Wikipedia has no compunction about shutting down access to editing their articles, or stopping people from creating accounts—all these things that would really hurt topline numbers at shareholder-driven organizations,” said Keegan.

The irony of the “Capitol Storming” page is that so many volunteers worked so hard to accurately document an event fueled by lies. For every claim that the election had been stolen or Mike Pence had the power to stop the count, there was a volunteer clicking through news reports, trying to get it right. Nearly a month later, the page still isn’t complete. When I asked Molly White how she would know when to stop working on it, she wrote that Wikipedia is never finished, and pointed me to a corresponding Wiki entry titled “Wikipedia is a work in progress.”

Update: A reference to Fast Company’s article on the same Wikipedia page was added on February 8.


Grace Gedye is a reporter for CalMatters. She was an editor at Washington Monthly from 2018 to 2021.