I write about authoritarian leaders past and present. That means I’ve spent a lot of time studying professional liars. I’m frequently asked why people believe the falsehoods told by political figures and their media allies. Here I’ll look at how propaganda works and why people are often influenced by it. A second post, to be published next week, will discuss how to respond to lies and how we can push back against propaganda.
Propaganda gains traction through repetition and saturation. The same message is disseminated through multiple channels and institutions, with small variations, to lead a maximum number of individuals “in the same direction, but differently,” as the sociologist Jacques Ellul wrote.
In Fascist Italy, for example, a speech by Il Duce would be reproduced in print, seen on newsreels, and heard on the radio. Clips from the newsreels would go into documentaries, and quotes from the speech would end up in books (including textbooks), on posters along city streets, and on the facades of buildings like the Palazzo della Civiltà Italiana (commissioned for the never-held 1942 world’s fair).
Seeing the same messages over and over can lead some to tune out, but it may also boost confidence that the content is truthful. Repetition breeds familiarity, and familiarity increases acceptance, especially when the state has silenced alternative voices.
Today’s autocrats, many of whom hold elections and govern alongside pockets of opposition, depend more than ever on manipulating information about their own competence and the threats posed by their enemies. Social media helps them by accelerating the repetition and multiplication of their messages. Followers share falsehoods and also make them “fresh” for their friends and contacts through the memes and slogans they create.
In Russia, Vladimir Putin doesn’t have the benefit of Communist-era total media control. That’s why his 21st-century propaganda playbook privileges the “firehose of falsehood” model: flooding the media space with noise and confusion to drown out unwanted messages, such as refutations of Kremlin lies. “Quantity does have a quality all its own,” write Christopher Paul and Miriam Matthews of the effect this high-volume diffusion of falsehoods, partial truths, and conspiracy theories has on a society’s resistance to propaganda.
The same can be said of the tsunami of disinformation unleashed by Donald Trump. The 5.9 million individual Facebook ads his 2016 campaign ran often purveyed the same messages with small variations. Falsehoods about the links between crime and immigration, a frequent theme, prepared his public to believe later Facebook ads warning of the “national emergency” created by immigrant “invasions” (the word appeared in 2,000 ads between January and August 2019 alone). Reinforced at his rallies, on talk radio, and by Fox News, these claims were part of the more than 30,000 false and misleading statements he made during his presidency.
Trump’s focus on immigration enabled another classic propaganda technique: telling lies that seem convincing because they build on a grain of truth, in this case the fact that large numbers of foreigners do indeed cross the border. Omitting vital information such as when they arrived and with what motives is part of the ruse. So, some lies are accepted because they seem to confirm what people think they know.
Often, people just want to believe the liar. Personality cults increase the leader’s credibility, since they present him as possessed of special powers or ruling with a divine mandate, making him seem infallible (the slogan “Mussolini is always right” says it all). Strongmen also know how to be persuasive, especially if they previously worked as journalists (Mussolini and the Congo’s Mobutu Sese Seko) or in television (Italy’s Silvio Berlusconi and Trump), or were professional dissemblers (Putin was a KGB case officer). These practiced liars work hard to seem authentic; just look at Narendra Modi’s Instagram performances.
Moreover, once people bond with the leader, they may be inclined to dismiss any evidence that conflicts with his claims, or overlook contradictions in his messages. They believe him because they believe in him. Or, in an interesting twist, they know he is lying, but they decide that they don’t care: better him than his enemy (who, as they have been taught to believe, lies even more). And some people actually approve of all the lying, seeing it as rule-breaking by a rogue they adore.
Hannah Arendt observed that Nazism and Communism made people less able over time to distinguish between fact and fiction, truth and falsehood. Today’s illiberal leaders encourage a similar atrophying of critical skills, as do the social media platforms that so many use as a primary information source. Investing in media literacy is essential, but so is education about the damage done by authoritarian models of power that turn leaders into infallible, god-like figures and lying into official state policy.
Jacques Ellul, Propaganda: The Formation of Men’s Attitudes (New York, 1973), 10.
Oliver Hahl, Minjae Kim, and Ezra W. Zuckerman Sivan, “The Authentic Appeal of the Lying Demagogue,” American Sociological Review 83, no. 1 (2018): 1-33.
Jennifer Mercieca, Demagogue for President: The Rhetorical Genius of Donald Trump (Austin, 2020).
Hannah Arendt, The Origins of Totalitarianism (New York, 1958), 474.