Opinion

Wikipedia is twenty. It’s time to start covering it better.


Last year, as the election approached, we heard that the founder of 8chan had become an active Wikipedia editor. As reporters covering Wikipedia, we found the story irresistible: the creator of the fringe image board, notorious as the breeding ground for the QAnon conspiracy theory, pitted against Wikipedia and its army of dedicated volunteer editors, who work to keep information reliable.

But it turns out that the disinformation spread on 8chan and social media does not permeate Wikipedia, despite its porous format. Its strict sourcing policies and volunteer editors work. Twitter and Facebook twisted themselves into pretzels trying to cope with a thinly sourced New York Post report about Joe Biden’s son, Hunter. Wikipedia’s editors merely contextualized its publication as part of a wider “conspiracy theory” about Biden pushed by Trump’s proxies. Reuters, WIRED, Vox, and others praised Wikipedia’s approach.

This was not always the case. During the 2004 election and subsequent recount, Wikipedia made headlines for massive political battles over edits on the articles for John Kerry and George W. Bush. Coverage highlighted the online encyclopedia’s role in what the New York Times termed the “separate realities” within America.

Near the end of the Trump presidency, the paper praised Wikipedia for its coronavirus pandemic coverage and reported positively on how the WHO had tapped the online platform to help battle the COVID-19 infodemic. Throughout the past year, other legacy media publications lauded Wikipedia for its response to political disinformation ahead of the election. Wikipedia’s community of dedicated editors and its array of complex policies, though far from perfect, now seem better poised to offer fact-checking than mainstream or social media.

In the site’s first years, the press enjoyed noting funny instances of Wikipedia vandalism. But as the tone of coverage shifts toward praise, and as the site marks its 20th anniversary, we feel journalism should help readers better understand Wikipedia’s policies and inner workings—in other words, improve the general public’s Wikipedia literacy. We have identified two major themes that might help reporters in this effort.

It is not a free-for-all

Although it is true that Wikipedia is, broadly speaking, an openly editable project, journalists who suggest that the encyclopedia itself is a free-for-all do a disservice to their readers. Over the years, the Wikipedia community has created a large number of mechanisms that regulate its marketplace of ideas. Perhaps the most important is the ability to lock articles against public editing.

Anyone can edit Wikipedia, but temporarily restricting anonymous editing can go an extremely long way toward preventing disinformation. Articles such as “COVID-19 pandemic” are subject to semi-protection, meaning that anonymous IP editing is not allowed and contributors must register an account. Other articles have more extensive protections: the article on Donald Trump, for instance, has long been subject to extended-confirmed protection, meaning that only editors whose accounts are at least 30 days old and who have made at least 500 edits can directly edit Trump’s page.

Changes can be suggested to these protected pages, but only an editor with the required seniority can actually introduce them into the text, a fail-safe on contentious topics. Wikipedia’s volunteer administrators have applied the same minimum experience standard to the articles on the 2020 United States presidential election and on president-elect Joe Biden, for example. These protection mechanisms create a degree of personal accountability within the open system at little cost to the overall freedom the project permits.

In addition to the technical measures used to reduce misinformation, journalists should remember that many volunteers watch highly trafficked articles to review recent changes. Recently, the community began tracking which Wikipedia articles are shared most on social media, to prevent those pages from being abused. Vandalism and unsourced contributions are rarely the work of dedicated community members, and they are often removed from articles within a matter of seconds.

It’s complicated

Wikipedia’s diverse and often divergent community is the secret to its success. Some editors contribute to articles in a specific subject area, like WikiProject Medicine. Others address a specific pet peeve, like the Oxford comma. Still others dedicate their time to developing technologies, like the automated bots that root out and delete vandalism. Although some editors are highly social, such as those who attend the annual Wikimania conference, others prefer digital anonymity and only virtual interaction. Most Wikipedia editors agree that the project is important and that building a free and reliable encyclopedia is a worthy goal. But quite often that’s the only thing they agree on.

Wikipedia, in the singular, does not “decide” or “ban” anything; rather, the community, or different groups within it, reach a temporary consensus on certain issues. That’s understandably hard to pack into a headline. But journalism suggesting that Wikipedia is a monolithic agent with a single point of view simply misses the mark.

In 2018, Wikipedia made headlines when Donna Strickland won the Nobel Prize in Physics and it emerged that, at the time of her award, she did not have a Wikipedia page. Coverage focused on gender bias on Wikipedia and noted that an earlier entry on Strickland, submitted before she won the Nobel, had been rejected for not satisfying the project’s notability guideline, which determines whether a topic merits its own article.

A key determinant of notability is whether the subject has received significant coverage from reliable media sources. The volunteer editor who declined the draft page about Strickland did so because, under the guideline, there wasn’t enough coverage of her work in news articles and other independent secondary sources to establish her notability. Katherine Maher, executive director of the nonprofit Wikimedia Foundation, later wrote an op-ed for the Los Angeles Times headlined “Wikipedia Mirrors the World’s Gender Biases, It Doesn’t Cause Them.” Rather than cast blame on Wikipedia or its policies, Maher challenged journalists to write more stories about notable women like Strickland so that volunteer Wikipedians would have sufficient material to cite in their own attempts to fix the bias. The media can do more than call out biases on Wikipedia; it can also help address them.

The need for nuance in reporting on Wikipedia is all the more pressing as some news outlets actively demonize the site. After the community decided that Breitbart News was not a reliable source, the far-right outlet began attacking the site for what it claims is liberal bias. Russian state media has also taken aim at Wikipedia, and a similar process is now under way in India.

The corrective is more stories showing that Wikipedia operates within a larger information ecosystem and relies on the availability of trustworthy media coverage. More broadly, we need journalism that reveals how the act of collecting knowledge—and even the concept of knowledge itself—is complex. Such coverage, we hope, might inspire new editors to join the editorial fray. That, in turn, can help reduce the project’s biases and further protect the site from disinformation.

Stephen Harrison and Omer Benjakob are, respectively, an attorney and writer who covers Wikipedia and the information ecosystem in Slate’s Source Notes column, and a tech and cyber reporter and editor for Haaretz in English who writes about Wikipedia and disinformation.