The Psychology of Deception
What you need to know to defend yourself against deception, disinformation, and manipulation in the digital era.
Have you ever shared misinformation, or found out that something you believed to be true… actually wasn’t? I don’t know why I’m asking, because of course you have. If you’re online in 2025, it’s essentially guaranteed that sooner or later you’ll end up sharing something that’s not true, whether you realize it or not.
It’s not because we’ve all suddenly become stupid or uninformed. Spreading misinformation is so common because today’s disinformation doesn’t spread through a lack of knowledge, and it doesn’t try to convince you logically, either. Instead, it slides into your feed disguised as concern, justice, or outrage — content designed to trigger and exploit your emotions and attention at levels you’re not even aware of. The good news is that you can teach yourself to notice these often unconscious tendencies, so you can better understand where your vulnerabilities hide and how bad actors may be exploiting them. Here’s how…
From Rational Persuasion to Psychological Exploitation
To understand how this works, you need to know one thing first: modern disinformation campaigns don’t come from some secret psyops bulletin board or hushed meeting of the minds — well, most of them don’t, at least. In today’s world, disinformation campaigns are overwhelmingly likely to come from familiar names, and most borrow openly from the scientific literature in fields such as public relations, advertising, and psychology, as well as from military doctrine in areas like influence and strategic communications.
Psychology research has long emphasized the importance of the unconscious — the brain activity that occurs outside of our conscious awareness and which Freud conceptualized as a reservoir of repressed feelings, thoughts, desires, and memories. The unconscious mind is responsible for “System 1” thinking, which describes the automatic and intuitive thought processes that our brain uses to allow us to make quick decisions and judgments based on things like patterns and prior experiences. These unseen and unspoken factors nevertheless have a huge influence on our behavior, making them a prime target for anyone interested in influencing people’s decisions and behaviors.
Some of the first modern attempts to influence people on a mass scale by tapping into their unconscious included subliminal messaging campaigns — such as the supposed effect of flashing messages about popcorn on a movie screen for a split second — and other efforts to use subliminal cues to guide behavior. Alongside these came the use of therapeutic techniques and drugs to access unconscious thoughts, which later formed the basis of the first attempts at “mind control” and brainwashing. This work led to projects like the CIA’s controversial MK-ULTRA studies, in which “participants” — often without consent, hence the scare quotes — were given drugs and subjected to inhumane experiments involving sleep deprivation and psychological manipulation.
The main takeaway from all of this research – both ethical and unethical – was that if you want to subtly influence perceptions and ultimately behaviors, you shouldn’t approach people with the goal of educating them or increasing their awareness of something, but rather, you should aim to tap into their unconscious mind to evoke emotions, and then use their emotional response to direct behavior.
This work fed directly into modern marketing tactics, which rely on the well-founded assumption that consumer behavior is shaped less by facts or features of the product, and more by feelings of belonging, identity, and status.
At the same time, in a parallel and related line of work, scholars were busy developing theories and frameworks of mass communication and media influence, such as media effects theory — a framework that describes the different functions of the news media in shaping public opinion and perceptions. According to this theory, the two major functions of media are agenda setting and framing: agenda setting tells people what to think about, and framing tells them how to think about it.
The combined outcomes of these lines of research formed the basis of modern propaganda and political campaigning.
The military understands this, too. For example, US Army Field Manual (FM) 3-05.30 explicitly advises psychological operations teams to “[express] information subjectively to change attitudes and behavior and to obtain compliance or non-interference” — a clear acknowledgment that facts alone are not enough to persuade people or win an argument, and that messaging must be specifically tailored to the psychological and/or social identity of the audience. NATO’s Strategic Communications guidance similarly frames effective messaging as a process of narrative alignment, not necessarily evidence dissemination.
That’s the foundation upon which modern disinformation was built, and as you can see, it rests on established science, not backroom discussions or secret plots.
The Psychology That Makes It Work
When this research is put into practice in today’s world, the result is that emotional content is algorithmically amplified and neurologically prioritized.
Your brain, under constant stimulation by digital devices and millions of pieces of content competing for your attention, filters and ranks information using heuristics — shortcuts that evolved to help us survive in low-information environments, but which are easily gamed in high-noise digital environments. If a message triggers anger or fear, your brain flags it as urgent; if it confirms your existing beliefs, your mind treats it as more likely to be true (confirmation bias); if others are sharing it, your confidence in it increases (social proof), and the more often you see it repeated, the truer it feels (the illusory truth effect). This pattern is observable in basically every viral disinformation event.
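To make that mental triage concrete, here’s a toy sketch in Python. It is not a cognitive model, just an illustration of the scoring logic described above, and every attribute, weight, and threshold in it is invented for the example.

```python
from dataclasses import dataclass

# Toy model of the System-1 "triage" described above. All attributes,
# weights, and thresholds are invented for illustration.

@dataclass
class Message:
    triggers_anger_or_fear: bool  # emotional charge
    matches_prior_beliefs: bool   # input to confirmation bias
    times_seen_shared: int        # social proof / repetition

def perceived_credibility(msg: Message) -> float:
    """Return a 0-1 score for how 'true and urgent' a message feels.

    Note that nothing here checks whether the message is accurate;
    that omission is exactly the vulnerability described above.
    """
    score = 0.2  # baseline skepticism
    if msg.triggers_anger_or_fear:
        score += 0.3  # flagged as urgent, so scrutiny drops
    if msg.matches_prior_beliefs:
        score += 0.3  # confirmation bias: feels more likely to be true
    # Social proof and the illusory truth effect: each repeat exposure
    # adds a little, with diminishing returns.
    score += min(0.2, 0.05 * msg.times_seen_shared)
    return min(score, 1.0)

# A false but enraging claim seen ten times outscores a dry,
# accurate report seen once.
print(perceived_credibility(Message(True, True, 10)))   # 1.0
print(perceived_credibility(Message(False, False, 1)))  # 0.25
```

The point of the sketch is the missing variable: accuracy never enters the score.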
Consider the 2016 Pizzagate conspiracy theory, which claimed that high-level Democrats were running a child sex-trafficking ring out of a Washington, DC pizza shop. There was no evidence — yet the story spread widely across Facebook, Reddit, Twitter, YouTube, and a long list of fringe social media sites and web forums. It succeeded not because it made sense or was founded in reality, but because it was morally electrified. It triggered disgust, fear, and a sense of urgency — feelings that overpower rational skepticism in most instances. One man became so convinced by these viral posts that he showed up at the restaurant with an assault rifle, ready to rescue imaginary children. The disinformation worked because it bypassed deliberate reasoning and appealed to identity: the protector, the betrayed citizen, the defender of innocence.
Platforms Built for Propaganda
Another key element you must understand in order to see how this works: social media platforms are not neutral stages. They are emotion-maximizing machines, fine-tuned to reward content that provokes rather than informs.
In 2021, internal Facebook research leaked, confirming what many researchers had already shown: that algorithmic systems amplify divisive, emotionally intense content — not because it’s accurate, but because it increases engagement. This means that content that makes us feel outrage, superiority, fear, or disgust is more likely to be seen, clicked, and shared, regardless of its truth value.
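What that amplification might look like in code: the sketch below is a minimal, hypothetical ranking function, not Facebook’s actual system. The reaction names and weights are assumptions made for illustration (loosely echoing reporting that anger-style reactions were weighted far more heavily than likes), and note that accuracy is never an input.

```python
from typing import Dict, List

# Hypothetical per-reaction weights: emotionally intense reactions count
# for more because they predict further engagement. These numbers are
# invented for illustration, not taken from any real platform.
REACTION_WEIGHTS: Dict[str, float] = {
    "like": 1.0,
    "love": 2.0,
    "angry": 5.0,
    "comment": 4.0,
    "share": 8.0,
}

def engagement_score(reactions: Dict[str, int]) -> float:
    """Score a post purely by predicted engagement; truth is not an input."""
    return sum(REACTION_WEIGHTS.get(kind, 0.0) * count
               for kind, count in reactions.items())

def rank_feed(posts: List[dict]) -> List[dict]:
    """Order the feed so the most provocative posts surface first."""
    return sorted(posts, key=lambda p: engagement_score(p["reactions"]),
                  reverse=True)

feed = rank_feed([
    {"id": "careful-correction", "reactions": {"like": 400, "comment": 20}},
    {"id": "outrage-rumor", "reactions": {"angry": 300, "share": 150, "comment": 90}},
])
print([p["id"] for p in feed])  # ['outrage-rumor', 'careful-correction']
```

When the objective function is engagement, outrage isn’t a bug; it’s the optimum.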
Take the example of the Plandemic video, which went viral in 2020 by presenting a charismatic whistleblower accusing global elites of conspiring to profit from COVID-19. The video was professionally produced and followed a familiar structure used in conspiracy narratives — a brave underdog, a corrupt establishment, and a hidden truth.
The emotional structure, not the factual content, is what made it work. The video amassed millions of views in days despite being factually debunked within hours. The real mechanism wasn’t persuasion — it was performance, emotion, and identity-based signaling.
Participatory Propaganda
Many people still think about disinformation as something done to us by foreign agents. And yes, that happens — Russia’s Internet Research Agency (IRA), for example, ran massive campaigns during the 2016 US election aimed at deepening divisions and reducing voter turnout. But the real genius of today’s propaganda is that it doesn’t need a state sponsor. Or any sponsor at all. It is participatory by design.
Once seeded, emotionally resonant narratives are carried by everyday users. The IRA’s campaigns succeeded not because they were persuasive, but because they were emotionally tailored. Fake Facebook pages like “Blacktivist” and “Heart of Texas” didn't create new positions — they exaggerated existing identities to provoke us into doing the heavy lifting.
This model was even more devastating in Myanmar, where the military used Facebook to spread inflammatory falsehoods about the Rohingya minority. Through fake celebrity accounts and nationalist pages, they seeded rumors and manipulated images accusing the Rohingya of plotting terror attacks. This disinformation campaign fueled mass hatred and played a critical role in normalizing violence that would eventually culminate in genocide.
How to Defend Yourself in a Post-Truth Environment
The uncomfortable truth is that knowledge alone won't protect you. Having higher levels of education can actually make you more vulnerable to cognitive traps like confirmation bias and overconfidence. The best way to defend yourself against today’s disinformation campaigns is not intelligence. It’s discipline.
To build resilience, you have to learn to track your own emotional responses. If something makes you feel a surge of fear, anger, or moral righteousness, pause and ask yourself: Who benefits from me feeling this way? What evidence am I actually reacting to? This is especially important when the disinformation sounds too good to be true, confirms something that you want to believe, or tells a story that you want to hear.
True resistance to manipulation doesn’t come from knowing facts, but from recognizing when your mind is being hijacked — and knowing what to do to regain control.