by Gideon Lichfield edited by O Society October 10, 2019
“There are two kinds of propaganda,” wrote Aldous Huxley in 1958 in Brave New World Revisited, a retrospective on his famous novel:
“rational propaganda in favor of action that is consonant with the enlightened self-interest of those who make it and those to whom it is addressed…“
in other words, arguments couched in facts and logic
“…and non-rational propaganda that is not consonant with anybody’s enlightened self-interest, but is dictated by, and appeals to, passion.”
This latter kind, Huxley goes on,
“avoids logical argument and seeks to influence its victims by the mere repetition of catchwords…“
Make America Great Again!
“…by the furious denunciation of foreign or domestic scapegoats…”
Lock her up! Lock her up!
“…and by cunningly associating the lowest passions with the highest ideals, so atrocities come to be perpetrated in the name of God and the most cynical kind of Realpolitik is treated as a matter of religious principle and patriotic duty.”
To Huxley’s readers, most of whom had lived through the era of Hitler, Mussolini, and Stalin, all of these methods would have been familiar. But over time it came to seem, at least in the West, as if his “rational propaganda”—still possibly misleading, but nonetheless rooted in the language of reason, fact, and enlightened self-interest—had won out as the primary form of political discourse.
And then 2016 happened. Voters chose Brexit and Donald Trump even though it was patently clear that Brexit had not been thought through and that Trump was not fit to be president. “Expert,” a term for someone who specializes in facts, became a pejorative. And politicians came to lie with seeming impunity, no matter how blatantly or how often the press caught them doing so.
So what went wrong?
The Naïve Pursuit of Reason
The standard answer is that many things went wrong. The global financial crisis. Jobs lost to globalization and automation. Widening inequality. Terrorist attacks. Refugees. And complacent, technocratic elites who speak self-assuredly of growth and progress while failing to notice how many people see little of either. The story goes that this opened the way for unscrupulous populists to appeal to voters’ base emotions, making us ignore the facts and logic that would normally be obvious to anyone.
But this narrative, while not untrue, contains a weakness: the assumption that appealing to voters’ base emotions is an anomaly, something that happens only at times of great national stress, and that under “normal” circumstances (whatever those are), facts and reason can and do prevail.
In reality, maybe what’s anomalous is for facts and reason to have the upper hand.
The belief, or rather hope, that humankind is ultimately rational has gripped Western politics since at least the time of Descartes, and inspired such 19th-century optimists as Thomas Jefferson and John Stuart Mill. “Where the press is free, and every man able to read, all is safe,” Jefferson famously wrote.
Yet in recent years we have learned much about the human mind that contradicts the view of people as rationally self-interested decision-makers. Psychologists have established that we form beliefs first and only afterward look for evidence to back them up. Research has turned up apparent physiological and psychological differences between liberals and conservatives, and found evidence that these differences are of ancient evolutionary origin. We have identified the “backfire effect,” an extreme cousin of confirmation bias, in which people hew even more strongly to an existing belief when shown evidence that clearly contradicts it. (What, one wonders, would seeing the internet do to Jefferson’s faith in a free press—kill it, or make it stronger?)
Other research has looked at the habits of highly effective propagandists such as China, Russia, and the alt-right icon Milo Yiannopoulos. The main takeaway: truth, rationality, consistency, and likability aren’t necessary for getting people to absorb your viewpoint. What does work: incessant repetition, distraction from the main issue, sidestepping counterarguments rather than refuting them, using “peripheral cues” to establish credibility or authority, and antagonizing people who dislike you in order to get the attention of people who might like you. Another favorite technique, this one perfected by the tobacco industry: strategically sowing doubt about something for which there is overwhelming evidence.
Somehow, these methods must play into our heuristics—the ancient cognitive shortcuts that humans or our ancestors evolved as survival instincts, probably long before language and reason.
They are also a perfect summary of the Trump playbook.
Liberals—and the liberal/centrist-leaning mainstream media—fail to grasp this, argues George Lakoff, a Berkeley cognitive linguist. Instead, they keep trying to refute right-wing populists’ arguments, check their “facts,” and point out inconsistencies in their positions. This is why liberals and centrists were caught unawares by Brexit and by Trump’s victory, and why they still flounder in dealing with him to this day.
Lakoff is best known for his work on how the metaphors we use influence our beliefs. He argues that people tend to vote in line with their values rather than with logic and evidence. And he advances an interesting claim: right-wing politicians are inherently better than left-wing ones at appealing to people’s values, because of the subjects, according to Lakoff, each tends to study in college:
“If you’re a conservative going into politics, there’s a good chance you’ll study cognitive science, which is how people really think and how to market things by advertising. So you’ll know people think using frames and metaphors and narratives and images and emotions and so on… Now, if instead you are a progressive… typically you’ll study political science, law, public policy, economic theory, and so on…
What you’ll learn in these courses instead is what is called Enlightenment reason, from 1650, from Descartes. And here’s what Enlightenment reasoning says: ‘…If we all think logically and we all use the same reasoning, if someone just tells people the facts, we should reason to the same correct conclusion. And this just isn’t true. And this keeps not being true, and hence liberals keep making the same mistake year after year after year.‘”
The 21st Century: Propaganda on Steroids
The dictators of the early 20th century knew all about repetition, distraction, antagonism, and the rest, even if they did not know the science behind these methods. This is why, as we saw at the start, Huxley’s description of non-rational propaganda so neatly matches Trump’s verbal tics. As the historian Timothy Snyder observes in his recent book On Tyranny, Trump’s methods for undermining truth are very similar to ones identified by Victor Klemperer, a scholar and diarist who lived through Hitler’s Germany and the Soviet aftermath.
What has changed since then is, of course, the internet, and the many new ways it creates for falsehoods to reach us. The power of populism today lies in its ability to combine extant 20th-century propaganda techniques with 21st-century media technology, in effect putting propaganda on steroids.
Here are the main ways this happens:
- Echo chambers. Social media and the explosion of choice in news sources exacerbate our tendency to clump into like-minded groups. We see far more messages that reinforce our beliefs than challenge them. This is because the platforms through which we find most of what we see online—social networks such as Facebook, Twitter, and Weibo, as well as search engines such as Google, Yandex, and Baidu—have business models that depend on maximizing the time we spend using them.
- Alternative news sources. Whether it’s Breitbart for the alt-right or Pravda for the Kremlin, it’s now possible to create large, well-financed operations that pump out news with a strong agenda and reach people across the world. (Russian TV is particularly good at appealing to heuristic thinking, argues Maxim Alyukov, a Russian sociologist.) Even a single person can become an alternative news source: witness Trump’s Twitter account. This proliferation of sources doesn’t just overload people with competing versions of the truth. It can also change the news cycle itself, determining what gets attention and what doesn’t, forcing other media to chase stories they might otherwise ignore and to neglect ones we should pay attention to.
- Fake news. Even the most tendentious news sources tend to stop short of fabricating outright falsehoods out of whole cloth, but some—such as the notorious Macedonian teenagers who spread made-up pro-Trump stories during the US election campaign—deal in little else. Whereas sites like Breitbart and HuffPost have political agendas, these operations are essentially commercial parasites of politics, creating low-cost, sensational content to draw clicks and make money off ads. Yet by adding so much to the general level of distraction and confusion, they further undermine any unitary consensus on truth.
- Online swarms. If one has a fiercely loyal base of supporters (Trump, Clinton, Yiannopoulos) or can pay them (Russia, China), one can mobilize vast groups of people to troll opponents, flood the digital airwaves with the desired message, amplify it, and make it hard to tell how much support it really has.
- Bots. Automated social-media accounts are also pressed into service, both to amplify messages and to quash them. As the technology improves, bots become ever harder to distinguish from real people.
- Psychological profiling and targeted advertising. In its by-now infamous “emotional contagion” study of 2014, Facebook showed it was possible to influence people’s moods in precise, predictable ways by putting certain words into the posts they see. In an only slightly less controversial study, the company showed it could change people’s likelihood of voting. Companies such as Cambridge Analytica claim to be able to sway voters’ preferences en masse, using publicly available data skimmed from people’s social-media accounts to build detailed psychological profiles and craft messages tailored to each person. “Weaponized AI propaganda,” as Scout.ai calls it.
It would take an obsession with propaganda to follow consistently how the combination of these technologies with old-fashioned spin and demagoguery is changing politics and public discourse.
Huxley, it must be remembered, foresaw none of this. He devotes much of Brave New World Revisited to assessing which of the futuristic methods of persuasion in Brave New World, written 26 years earlier, were on their way to coming true. He writes chapters on “brainwashing,” “chemical persuasion,” “subconscious persuasion,” and even mass hypnosis—prophecies that, though valid, now seem somewhat quaint.
Yet one almost throwaway paragraph about the world of the 1950s rings stunningly true 60 years later (emphasis mine):
“In regard to propaganda, the early advocates [like Jefferson] of universal literacy and a free press envisaged only two possibilities: the propaganda might be true, or it might be false. They did not foresee what in fact has happened, above all in our Western capitalist democracies—the development of a vast mass communications industry, concerned in the main neither with the true nor the false, but with the unreal, the more or less totally irrelevant. In a word, they failed to take into account man’s almost infinite appetite for distractions.”
The central point of Brave New World is—contrary to what George Orwell would suggest 16 years later with 1984—that governments do not need to be totalitarian to exercise social control. Rather, our almost infinite appetite for bullshit can render a population politically helpless. Of all Huxley’s warnings, this may be the most prescient.