Wikipedia turns 25: The quiet death of neutrality
“Wikipedia is badly biased. The days of Wikipedia’s robust commitment to neutrality are long gone.”
That’s not a quote by some right-wing pundit. That’s Larry Sanger—Wikipedia’s co-founder—the man who helped shape the site’s Neutral Point of View policy[1] from the beginning. He’s been saying versions of this since 2010, and by 2020, he’d declared the principle he created effectively dead.
When the architect says the building is collapsing, you might want to pay attention.
Twenty-five years ago today, Wikipedia launched with a simple, almost utopian idea: knowledge should be open, editable, and shared. At the time, this felt like a clean break from the past. Encyclopedias were expensive, slow, and controlled by credentialed gatekeepers who were often wrong and rarely accountable. Wikipedia flattened that hierarchy overnight. Anyone could contribute, anyone could correct, and anyone could participate in the construction of public knowledge.
For a while, it genuinely worked because the internet was still small enough that curiosity outpaced ideology. Early contributors were motivated by accuracy, reputation, and the quiet satisfaction of getting something right. Bias existed, but it wasn’t yet consolidated. Disagreements were real disagreements, not proxy battles for broader narratives. Truth wasn’t pristine, but it was contestable, and that mattered.
What eventually broke wasn’t the premise of democratization. It was the assumption embedded inside it: that openness, once achieved, would naturally trend toward truth.
That assumption only holds at small scale. Once a system grows large enough, the question shifts from who gets to speak to who has the stamina to keep speaking. And the answer is almost never "curious generalists."
Wikipedia didn’t become ideologically skewed through some dramatic takeover or obvious act of censorship. It changed through something far more mundane and reliable: process.
Open systems don’t get captured by the loudest voices, they get captured by the most patient ones. People with time, ideological motivation, and fluency in procedure slowly outlast everyone else. Talk pages turn from debate into endurance tests, editing becomes attrition rather than collaboration, and rules designed to prevent bias inevitably evolve into tools for enforcing it.
This is where neutrality began to hollow out. Wikipedia’s famous “Neutral Point of View” policy didn’t disappear overnight. It drifted. Over time, neutrality stopped meaning a good-faith attempt to describe reality and started meaning alignment with institutional consensus. “Reliable sources” narrowed, quietly, to legacy media, academic journals, and NGO-aligned publications.
The effect wasn’t outright falsehood, it was framing. Certain figures became “controversial” by default, and certain positions were treated as baseline reality. Moral weight slipped into descriptive language. Nothing tripped a factual alarm, yet everything pointed in a direction.
That’s why the bias is so widely felt and so difficult to prove. The problem isn’t that Wikipedia lies, it’s that it preloads interpretation. One worldview gets to define what counts as neutral, while the other is perpetually accused of politicizing the page. At that point, disagreement isn’t about evidence anymore, it’s about legitimacy. And legitimacy, once proceduralized, becomes self-reinforcing.
In 2012, Pulitzer Prize-winning novelist Philip Roth discovered that Wikipedia’s entry for his novel The Human Stain contained a significant error. It claimed the book was “allegedly inspired by the life of the writer Anatole Broyard.”
This was false. Roth knew this with certainty because he wrote the book. The actual inspiration was his late friend Melvin Tumin, a sociology professor at Princeton. Roth had never even had a meal with Broyard.
So Roth did what seemed logical. He contacted Wikipedia to correct the mistake.
Wikipedia’s response: “I understand your point that the author is the greatest authority on their own work, but we require secondary sources.”
Read that again. The author of the novel was told he wasn’t a reliable source on his own novel!
Roth eventually had to publish a 2,600-word open letter[2] in The New Yorker explaining the true origin of his book—just to generate the “secondary source” Wikipedia required to accept his correction.
This isn’t some gripe with one stubborn editor, or a defense of a specific author. I’ve got no skin in the game when it comes to Philip Roth. This is just what happens when process becomes more important than truth. Wikipedia’s sourcing rules exist for good reasons, but when the rules produce outcomes this absurd—when the creator of a work is less authoritative than a journalist guessing about the creator’s intentions—you’re no longer serving truth. You’re serving procedure.
And procedure, left unchecked, always calcifies into ideology.
The mistake is thinking this is a Wikipedia-specific failure, but it isn’t. Wikipedia is simply the clearest example because it documents its own internal logic so transparently. You can watch capture happen in slow motion if you know what to look for.
The deeper lesson is more general and more uncomfortable. Truth cannot be crowdsourced indefinitely once the crowd professionalizes. This isn’t to say that people are malicious, but the reality is, the incentives shift. Curiosity doesn’t scale. Ideology does. And process always outlives principle.
Sanger argued as far back as 2010 that the people who work the most on Wikipedia tend to be “really comfortable with the most radically egalitarian views”—not because of a coordinated plot or conspiracy, but because that’s who stayed. Everyone else got exhausted and left.
The same dynamic plays out everywhere open systems mature. The early internet was chaotic but exploratory, now it’s organized but captured. The question isn’t whether bias exists, it’s whether the structure still allows bias to be challenged—or whether the challengers have been procedurally defined out of legitimacy.
It’s not as simple as telling you to “think critically.” That’s useless advice. Everyone thinks they’re already doing it.
The actual answer is more specific: you need better reading habits.
Wikipedia can still be useful, but you have to treat it like a starting point, not a verdict. It’s not “truth,” it’s “what survived the edit war.” Those are two different things.
Here’s how to actually use it:
1. Read the structure, not just the sentences.
Before you absorb the framing, look at the page’s bones:
What gets its own section vs. what gets a throwaway sentence?
What’s described as a fact vs. what’s described as a claim?
Who gets a clean biography and who gets a “Controversies” banner as a permanent identity tag?
This is where the bias lives—in placement, not just wording.
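If you want to make this structural reading concrete, you can do it mechanically. Here’s a rough sketch in Python that splits a page’s wikitext on its `== Heading ==` lines and counts how many words each section gets—a crude proxy for editorial emphasis. The sample text and the “Alice Example” page are invented for illustration; this is a toy parser, not MediaWiki’s real one.

```python
import re

def section_sizes(wikitext):
    """Split wikitext on == Heading == lines and count words per section.
    Word count per section is a rough proxy for how much emphasis a topic gets."""
    sections = {}
    current = "(lead)"
    for line in wikitext.splitlines():
        m = re.match(r"^(={2,})\s*(.+?)\s*\1\s*$", line)
        if m:
            current = m.group(2)
            sections.setdefault(current, 0)
        else:
            sections[current] = sections.get(current, 0) + len(line.split())
    return sections

# Invented sample page, for illustration only.
sample = """Alice Example is a writer.

== Career ==
She wrote several well regarded novels over three decades.

== Controversies ==
In 2019 she was criticized. In 2020 she was criticized again. Critics say
her response was inadequate and observers noted the pattern repeatedly.
"""

sizes = section_sizes(sample)
```

On this sample, “Controversies” gets more than twice the words “Career” does. That ratio is exactly the kind of structural signal worth noticing before you absorb a single sentence of framing.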
2. Check what “neutral” actually means on that page.
Wikipedia’s neutrality isn’t neutrality in the philosophical sense. It’s neutrality in the bureaucratic sense: alignment with approved sourcing norms.
So ask:
What sources are allowed to count as “reliable” here?
Are primary sources used, or only interpretations of primary sources?
Is disagreement treated like “there are two valid arguments,” or treated like “some people believe…”?
That “some people believe” construction is one of the oldest rhetorical tricks in history. It’s how you make a view sound unserious without actually arguing against it.
3. Scroll straight to the citations.
Don’t just read what the page says, read what it’s standing on.
A good rule of thumb:
If the claim is strong, the sourcing should be equally strong.
If the claim is vague, the sourcing usually sounds like vibes.
If all the citations come from the same ideological ecosystem citing itself, that’s not verification, that’s a citation loop.
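Spotting a citation loop can also be semi-automated. The sketch below pulls every http(s) URL out of a page’s wikitext and tallies the domains; if one domain supplies most of the citations, that’s the self-referential ecosystem in miniature. The regex and the `example-times.com` sample refs are illustrative assumptions, not a faithful parse of Wikipedia’s citation templates.

```python
import re
from collections import Counter
from urllib.parse import urlparse

def citation_domains(wikitext):
    """Find every http(s) URL in the wikitext (citations typically carry a
    |url= parameter) and tally how often each domain appears."""
    urls = re.findall(r"https?://[^\s|\]}<]+", wikitext)
    return Counter(urlparse(u).netloc.lower().removeprefix("www.") for u in urls)

# Invented citation block, for illustration only.
sample = """<ref>{{cite news |url=https://www.example-times.com/a |title=A}}</ref>
<ref>{{cite news |url=https://www.example-times.com/b |title=B}}</ref>
<ref>{{cite web |url=https://example-times.com/c |title=C}}</ref>
<ref>{{cite journal |url=https://journal.example.org/d |title=D}}</ref>"""

domains = citation_domains(sample)
top, count = domains.most_common(1)[0]
share = count / sum(domains.values())  # fraction of citations from one domain
```

Here three of four citations resolve to the same outlet, so `share` is 0.75. There’s no magic threshold, but when one ecosystem dominates the sourcing, treat the page’s confidence accordingly.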
4. Use first principles on the “why.”
When you see a page that feels off, don’t argue about the words. Instead, reverse-engineer the incentives.
Ask:
Who benefits from this being framed this way?
What outcome does this wording gently push you toward?
If this were written by someone with opposite priors, what would change?
Your job isn’t to “pick a side,” it’s to spot the hidden assumptions and decide whether they deserve to be yours.
5. Treat Wikipedia like a cultural artifact.
Used correctly, Wikipedia becomes almost more interesting than a neutral encyclopedia.
It tells you:
Which narratives have stabilized
Which topics are guarded
Where legitimacy has been procedurally locked in
Where language has been moralized without anyone ever announcing it
That’s not worthless information, it’s just not the information most people think they’re consuming.
Quick reference: what to look for
Green Flags:
Specific facts, clear timelines, primary documents linked, calm language
Disputes shown as disputes (not as pathology)
Multiple credible sources across different outlets and perspectives
Red Flags:
“Critics say” or “has been accused of” with no substance
Moral adjectives smuggled into “neutral” description
“Reliable sources” meaning the same three outlets forever
Massive imbalance of detail—one side gets nuance, the other gets caricature
“Controversy” sections that function as permanent scarlet letters
The real lesson after 25 years.
Wikipedia didn’t teach us what to think, it taught us how easily thinking gets outsourced once systems grow large enough.
Neutrality has a maintenance cost. It requires constant renewal, people willing to be bored, people willing to be disagreed with, and people who value accuracy over moral signaling. When that maintenance stops, neutrality doesn’t vanish, it gets replaced. And whatever replaces it will insist, very sincerely, that it’s still neutral.
In an era where access is infinite, judgment—not information—is the scarce resource. And judgment, unlike platforms or processes, can’t be delegated without consequence.
We’re living in a moment where AI can generate entire articles, deepfakes are getting harder to spot, and half the internet is optimized to make you feel something rather than know something. The information environment is only getting weirder from here.
I’m not here to tell you what websites to use or avoid. I use Wikipedia myself. It’s fast, comprehensive, and for anything that isn’t culturally contested, it’s usually fine.
All I’m saying is, know what you’re reading. Know that “neutral” doesn’t mean neutral. Know that the people who control the process control the product.
Use your best judgment… that’s all I’m saying.
PS: If you enjoyed this piece, I recommend “Intellectual rigidity is making you boring as fuck” or “If no one’s told you, you’re allowed to be many things” next.
Trust me, the irony is not lost on me that I’m providing a Wikipedia link to their neutrality policy, but this is straight from the horse’s mouth: Neutral Point of View