Why everything feels like propaganda now
The attention economy strikes again
There’s a specific, unsettling sensation that hits whenever I open my phone lately, and I’m wondering if you’re feeling it too. I read a headline, a corporate press release, or even a public health update, and I feel a distinct disconnect between the words on the screen and the reality they claim to describe. It’s not necessarily that the information is false (often the facts are technically accurate) but the delivery feels sanitized, strategic, or curated to within an inch of its life. It feels like copy written by a committee of lawyers trying to minimize liability rather than humans trying to communicate a truth.
This is the ambient tension of the modern internet. It’s not that we believe everything is a lie, but rather that we have lost the ability to distinguish between a genuine statement and a public relations strategy. When we read a neutral fact and still find ourselves looking for the angle, we aren’t being paranoid. We’re reacting to an information environment where the texture of truth has been replaced by the polish of narrative.
We used to know what propaganda looked like. In the twentieth century, it was top-down, clumsy, and paternalistic. It was Uncle Sam pointing a finger at you, or a poster reminding you that “Loose Lips Sink Ships,” or a doctor on a television screen telling you which cigarette brand was smoothest on the throat. You could point to the source—the state, the corporation, the party—and identify the agenda. It was a blunt instrument used to shape public opinion from a central tower.

But the propaganda of today isn’t centralized, and that’s what makes it so disorienting. It’s networked, decentralized, and often entirely unintentional. It’s not being created by a single mastermind in a smoke-filled room; rather, it’s being generated by millions of micro-incentives shaping the same message in the same direction at the same time. It’s the result of an ecosystem where every participant, from the New York Times to a random teenager on TikTok, is subconsciously optimizing their output for the same algorithmic rewards.
If we had to pinpoint when the levees broke, we’d probably look at 2016. That was the year the implicit agreement between the public and its institutions dissolved. Before then, most people operated under the assumption that institutions like the news media or the government were imperfect and perhaps biased, but fundamentally oriented toward recounting reality. After Brexit, the election, WikiLeaks, and the explosion of “fake news” discourse, that assumption evaporated.
We stopped viewing institutions as neutral referees and started viewing them as active combatants. Once you believe that every entity has a stake in the outcome, you stop reading for information and start reading for motive. Hannah Arendt famously warned that the result of a consistent and total substitution of lies for factual truth isn’t that the lie will now be accepted as truth and the truth be defamed as a lie, but that the sense by which we take our bearings in the real world is destroyed. That disorientation is exactly what we are feeling now.
The engine driving this disorientation is the algorithm. We have to remember that the platforms governing our communication do not care about truth; they care about engagement. And human psychology dictates that engagement is highest when content is simple, emotional, and reinforcing of existing beliefs.
The algorithm acts as a filter that strips away nuance and promotes certainty. If you write a complex, ambivalent post about a cultural issue, it dies in the feed. If you write a sharp, polarizing, emotionally charged post about the same issue, it goes viral. Over time, this trains everyone—journalists, politicians, and regular people—to speak in narrative templates because deviation is punished with irrelevance. We end up with a perceptually narrowed reality where everything looks like a political statement because the only things that survive the filter are political statements.
This leads us to the incentive problem. The reason nothing feels neutral is that, in the attention economy, nothing is neutral. Every message you encounter has a Key Performance Indicator attached to it. News outlets need clicks to survive. Politicians need outrage to drive fundraising. Corporations need brand safety to keep shareholders happy. Influencers need sponsorships to pay rent. Activists need virality to push their cause.
When every single participant in the public square is selling something, whether it’s a product, an ideology, or a personal brand, communication ceases to be about connection and becomes about conversion.
We have all become marketers, subconsciously polishing our own thoughts to ensure they land with the intended audience. And when everyone is optimizing for influence, everything starts to look like propaganda because, functionally, it is.
This optimization has created a bizarre aesthetic problem where we have all learned to write in what I’m calling “Propaganda Voice.” You know this tone. It’s the bloodless, therapeutic language of the corporate apology (“We hear you and we are listening”), or the boilerplate empathy of the influencer confession video, or the identical phrasing used by every non-profit and university when any crisis breaks.
We have collectively adopted the voice of institutional correctness because it feels safe, like some sort of linguistic shield. But the result is that even when we are trying to be authentic, we sound like press releases. We use the same stock phrases and the same cadence, creating a flattening effect where even our genuine expressions all start to sound… the same.
Underneath the polished language, ideas are spreading like viruses. This is where Richard Dawkins’ concept of memetics becomes essential. In a high-speed information environment, simple memes outcompete complex truths because they replicate faster. A meme flattens complexity, encodes a set of values, and spreads through imitation rather than evaluation.
This is why complex events get reduced to slogans within hours. The slogan is easier to share because it quickly signals which tribe you belong to. It allows you to participate in the discourse without having to do the heavy lifting of actual analysis. “Everything feels like propaganda” because we are constantly swimming in a sea of these identity-signaling memes, where the primary goal of communication is to signal allegiance rather than to convey information.
I actually wrote a more in-depth analysis on meme culture last August.
The psychological cost of living in this environment is a state of permanent hypervigilance. We have all become exhausted detectives, constantly asking the same set of questions: “What’s the angle here?” “Who benefits from this headline?” “Why are they telling me this now?”
Healthy skepticism has mutated into ambient paranoia.
When the informational environment becomes this polluted, people start treating all speech as strategy. We assume that every statement is a move on a chessboard, which makes genuine vulnerability or honest error impossible to perceive. We don’t just lose trust in the news, we lose trust in each other.
So how do you navigate a world where everything feels like a psyop? You can’t unplug entirely, but you can change how you process the signal (and noise!).
First, slow down your interpretation. The algorithm relies on your immediate emotional reaction. When you feel that spike of outrage or validation, pause and ask yourself if you are consuming information or if you are being fed a frame.
Second, look for incentives, not conspiracies. Most of the time, the distortion you see isn’t the result of a shadowy cabal; it’s the result of misaligned incentives. The writer wanted a click, the politician wanted a vote, the algorithm wanted a second of your time. Understanding the mechanism is less frightening than imagining a villain.
Third, pay attention to tone. If something sounds overly sanitized, it’s probably scripted. If it sounds excessively emotional, it’s likely manipulative. If it sounds aggressively neutral, chances are, it’s hiding something. Learn to listen for the messy, inconsistent sound of an actual human voice.
Finally, rebuild your private judgment. Our intuition atrophies when we outsource our thinking to pundits and platforms. Reclaim the right to be unsure, to have an opinion that doesn’t fit a template, and to trust your own eyes over the narrative being presented to you.
There’s a strange paradox at the bottom of all of this. Everything feels like propaganda now not necessarily because the world has become more deceptive, but because we have become more aware of the machinery. We have seen behind the curtain, and we can’t unsee the little man pretending to be the wizard.
That awareness can feel like a loss, but it's actually the opposite. You can't be fooled by a trick once you've seen how it works. The disillusionment is the beginning of clear sight.
The goal isn’t to escape propaganda, because as long as we live in a society, we will be surrounded by attempts to persuade us. The goal is to stop being programmable. Propaganda thrives when people stop thinking for themselves, but it dissolves the moment we remember how.

You are a genius, love your work. ♥️
Hi Steph.
Excellent article. The answer you propose is correct - because it is. I have an important challenge/question for you at the end of this comment. Skip to there if my little boring history lesson loses your attention.
Prior to roughly 1969 all propaganda was in print form, created by, as you say, propagandists. Even newspapers of the time got into the act. President Lincoln had to shut down a daily newspaper for that reason during the Civil War. Freedom of the press be damned.
In 1969 came TV nightly propaganda, oops, news. I've heard recently that Walter Cronkite was a CIA asset. Who knows. Doesn't matter.
As a college math/statistics major in the mid-70s, I noticed something beyond curious...
At that time there existed only 3 broadcast TV networks, each boasting GIGANTIC news-reporting skyscraper buildings in NY and LA. So tall, with their logos on top, these buildings housed many, many hundreds of employees. Collectively they must have had 1,000 reporters and news broadcast editors, not counting the field reporters all over the civilized world.
One evening I decided on an experiment. I would switch the rotary channel knob on the TV every one or two minutes between the three networks: ABC, NBC, CBS.
I repeated this experiment for four evenings in a row.
Guess what.
Each evening's 30-minute world newscast covered about 6 to 7 stories. Here's what I found.
Story #1 - same on each. Usually the story that was in that day's newspaper headline. War, etc.
Story #2 - same.
Story #3 - same, but sometimes switched with #4.
Story #4 - same, but sometimes switched with #3 or #5.
Story #5 - same on each, or switched with #4 or #6.
Story #6 - same, or switched with the above stories.
Story #7 - same, because this was the special interest segment. You know, the new panda in a Chinese zoo or a puppy rescued from crocodiles in the Mississippi river. Shockingly, all these thousands of "independent" reporters found the same panda or puppy.
It blew my mind!
Statistically, with many thousands of reporters, many thousands of cities in the world, hundreds of thousands of diplomats, press briefings, etc. that day... the TV mind control machine showed Americans the same 6 or 7 stories.
You couldn't get a better propaganda machine. Oh wait...
When CNN was created as the 24-hour news station based in Atlanta, I heard people from the TV broadcast industry complaining that generals in uniform from nearby military headquarters were walking around the broadcast facility giving orders about its operations. Once on the airwaves and cable, CNN began, and eventually concluded, buying exclusive TV rights into all major airports and commuter train stations. I believe schools too. I literally block my eyes and ears when I'm in an airport.
So, it has always been happening. Mass media only made it more effective as a 24x7 indoctrination.
Here's where I need your help...
I saw a post on Substack a few months ago citing a field of cognitive research about what happens when people are constantly fed contradictory data, day in and day out. I think that branch of study had its own name. I've not been able to find it.
My own non-scientific theory is that it literally "short-circuits" the brain's rational functionality. People can't rationally decide anything anymore. Do I pick the brightly colored major brand of rice cereal with 37 ingredients, or the plain blue store-brand box with only 4 ingredients?
Do I believe congresspersons who drape themselves in the flag of a foreign country where five of them have children on the boards of directors of energy companies, making $$$$$?
I have truly experienced this mind-numbing effect on rational thought, and I can believe it to be a tactic used by governments and corporations alike. Just one more P.R. tool in their basket.
If you ever find this area of research please let me know.
Oh, it also works to the benefit of individuals who are pathological liars. Stay far away from those folks.