Heather Ford via nettime-l on Mon, 4 Aug 2025 07:58:27 +0200 (CEST) |
<nettime> How to Study Wikipedia’s Neutrality – According to Wikipedia |
Hi folks, I'm new to this list but Geert suggested I post about Wikimedia's draft guidelines for researchers studying Wikipedia's neutrality. Below and at https://networkcultures.org/blog/2025/08/01/how-to-study-wikipedias-neutrality-according-to-wikipedia/. It's a difficult time for the Wikimedia Foundation right now, but I still think that this does no favours to the organisation or the community it represents.

best
heather.

https://hblog.org/
Professor, University of Technology Sydney
On Gadigal Land

How to Study Wikipedia’s Neutrality – According to Wikipedia
By Heather Ford <https://networkcultures.org/blog/author/heatherford/>, August 1, 2025 at 10:42 am.

A platform is telling researchers how to study its neutrality and defining what and where researchers should look to evaluate it. If it were Google or Facebook we might be shocked. But it’s from Wikipedia, and so this move will undoubtedly go unnoticed by most.

On Thursday this week, the Wikimedia Foundation’s research team sent a note to the Wikimedia research mailing list asking for feedback on its “Guidance for NPOV Research on Wikipedia” [1]. The Wikimedia Foundation is the US-based non-profit organisation that hosts Wikipedia and its sister projects in the Wikimedia stable of websites. The move follows increased threats to the public perception of Wikipedia’s neutrality, e.g. by Elon Musk, who has accused it of bias and a “leftward drift”, sometimes referring to it as “Wokepedia” [2], as well as threats to its core operating principles (e.g. proposals that may require the WMF to collect the ages or real names of editors) as governments around the world move to regulate online platforms [3]. The draft guidelines advise us on how we should study Wikipedia’s neutrality, including where we should look.
The authors write that “Wikipedia’s definition of neutrality and its importance are not well understood within the research community.” In response, they tell us that Neutral Point of View on Wikipedia doesn’t necessarily mean “neutral content” but rather “neutral editing”. They also argue that editing for NPOV on Wikipedia “does not aim to resolve controversy but to reflect it”. There is only one way to reflect a controversy, apparently, and that is the neutral way. In this, they seem to be arguing that researchers should evaluate Wikipedia’s neutrality according to its own definition of neutrality – a definition that absolves the site, its contributors and the organisation that hosts it from any responsibility for the (very powerful) representations it produces.

The guidelines tell researchers which variables are the “most important” in shaping neutrality on Wikipedia (and there we were thinking that this was an open research question). What is missing from this list is interesting… particularly the omission of the Wikimedia Foundation itself. In a separate section titled “The Role of the Wikimedia Foundation”, we are told that the Wikimedia Foundation “does not exercise day-to-day editorial control” of the project. The WMF is merely “a steward of Wikipedia, hosting technical infrastructure and supporting community self-governance.”

As any researcher of social organisation will tell you, organisations that support knowledge production *always* shape what is represented – even when they aren’t doing the writing themselves. From my own perspective as someone who has studied Wikipedia for 15 years, and supported it as an activist in the years before that, I’ve seen the myriad ways in which the Foundation influences what is represented on Wikipedia. To give just a few examples: the WMF determines how money flows to its chapters and to research, deciding which gaps are filled through grants and which are exposed through research.
It is the only real body that can do demographic research on Wikipedia editors – something it hasn’t done for years (probably because it is worried that the overwhelming dominance of white men from North America and Western Europe would not have changed). Understanding who actually edits Wikipedia could trigger changes that prioritise a greater diversity of editors. The WMF decides what actions (if any) it will take against the Big Tech companies that use its data contrary to licence obligations. It decides when it will lobby governments to encourage or oppose legislation. Recognising that WMF employees don’t edit Wikipedia articles doesn’t preclude an understanding that the Foundation plays a role in deciding how subjects are represented and how those representations circulate in the wider information ecosystem.

Finally, the guidelines are also prescriptive in defining what researchers’ responsibilities are. Not surprisingly, our responsibilities are to the Wikipedia and Wikimedia community, who “must” have research shared with them in order for research about Wikipedia’s neutrality to have impact. We are told to “Always share back with the Wikimedia research community” and are provided with a list of places, events and forums where we should tell editors about our research. In conclusion, we’re told that we must always “communicate in ways that strengthen Wikipedia”:

“As a rule of thumb, we recommend that when communicating about your research you ask yourself the question ‘Will this communication make Wikipedia weaker or stronger?’ Critiques are valued but ideally are paired with constructive recommendations, are replicable, leave space for feedback from Wikimedians, and do not overstate conclusions.”

There is no room for those who think that perhaps Wikipedia is too dominant, that it is too close to Big Tech and American interests to play such an important role in stewarding public knowledge for all the world.
Nor is there room for those whose research aims to serve the public rather than Wikipedia editors, those of us who choose instead to educate the public about when, how and why Wikipedia fails to live up to its promise of neutrality, and to the neutrality we have mistakenly come to expect from it.

I know that this request for feedback from the WMF will not raise an eyebrow in public discourse about the project, and that will be the sign that we have placed too much expectation in Wikipedia’s perfection, perhaps because if Wikipedia is found wanting, if the “last best place on the internet” [4] has failed, then the whole project has failed. But for me, it is not a failure that Wikipedia is not neutral. The failure is in the dominance of an institution that is so emboldened by its supposed moral superiority that it can tell us – those who are tasked with holding this supposedly public resource to account – what the limits of that accounting should be.

[1] https://meta.wikimedia.org/wiki/Research:Guidance_for_NPOV_Research_on_Wikipedia
[2] https://www.newyorker.com/news/the-lede/elon-musk-also-has-a-problem-with-wikipedia
[3] https://wikimediafoundation.org/news/2025/06/27/the-wikipedia-test/
[4] https://www.wired.com/story/wikipedia-online-encyclopedia-best-place-internet/

(legacy-tribute-revival posting of INC’s 2010 Critical Point of View <https://networkcultures.org/cpov/> network)

--
# distributed via <nettime>: no commercial use without permission
# <nettime> is a moderated mailing list for net criticism,
# collaborative text filtering and cultural politics of the nets
# more info: https://www.nettime.org
# contact: nettime-l-owner@lists.nettime.org