Lies, bullsh*t and knowledge resistance: A spotter’s guide

Amid growing interest in the ‘science of irrationality’, The Irish Times provides an A-Z of common intellectual traps

If the combined rise of Donald Trump and Boris Johnson has done any good, it has been to reignite interest in epistemology. A micro-industry of books, research papers and expertise has grown up around the "science of irrationality" and the nature of truth statements.

The latest slew of publications includes Knowledge Resistance: How We Avoid Insight from Others by sociologist Mikael Klintman; Irrationality: A History of the Dark Side of Reason by Justin EH Smith; and Vices of the Mind by prominent political philosopher Quassim Cassam. They all contribute to a field of study popularised in recent times by authors like Daniel Kahneman and Jonathan Haidt – the study of why humans think and behave in an unreasoned, or downright stupid, fashion.

A consensus among these analysts – who hail from behavioural psychology, evolutionary science and philosophy – is that we need to understand before we condemn. “People haven’t evolved to be primarily truth-seekers,” says Klintman. “Rather, humans have evolved successfully since they generally prioritise their social motivation if this deviates from the quest for the most logical arguments and valid knowledge.”

Irrationality is part of what makes us human. Telling lies to ourselves has been central to our collective survival strategy.

A hasty conclusion which might be drawn from this is that truth is merely an expression of power; that reasoning is a game of manipulation. This is what Vladimir Putin and his agents of disinformation would have you believe. It’s what underpins the authoritarian argument that people are too stupid to be given the vote.

But Klintman, Cassam et al say there’s a different lesson to be learned. We are all subject to certain intellectual biases, and we can help to manage them for the betterment of humanity. Some of these biases, or cognitive traps, are more within our control than others, and the first step towards getting a handle on them is by naming or identifying them – even if, as Smith says, attempts to completely eradicate irrationality are “themselves supremely irrational”.

Drawing on the latest research from this expanding field, here follows an A-Z of mental foibles – a spotter’s guide to lies, bullsh*t and knowledge resistance. How many of these traps have you fallen into?

Argumentative impulse: A popular hypothesis to explain human irrationality is the "argumentative theory". When cognitive scientists Hugo Mercier and Dan Sperber published it in 2010, Haidt said it was "so important" that abstracts should be posted in every university psychology department. It holds that human reasoning evolved for a purpose, and that purpose was not to ascertain the truth but rather to win arguments.

The argumentative theory helps to explain why people can be so stubborn and proud in discussions. Importantly, it is only a theory, and it doesn't imply that all reasoning is, or should be, competitive. What it does suggest, however, is that if you want to get someone like Boris Johnson on side, you should let him think he has won.

Bandwagon effect: A fancy name for groupthink or mob rule.

Blame-shifting: A product of the unwarranted high regard in which we tend to hold ourselves (see "self-inflation", below). The evolutionary biologist Robert Trivers cites the case of a man who ran his car into a pole and told police afterwards: "The telephone pole was approaching. I was attempting to swerve out of the way, when it struck my front end." It's easier to blame the pole than to admit you were driving badly.

Bullsh*tting: Bullsh*t "is a greater enemy of the truth than lies are", writes Harry G Frankfurt in his now classic essay on the subject. Why? Because the bullsh*tter doesn't have any regard for the truth but only wishes to persuade.

How do you spot a bullsh*tter? Find someone who has an opinion on everything.

As Frankfurt notes, the production of bullsh*t “is stimulated whenever a person’s obligations or opportunities to speak about some topic exceed his knowledge of the facts that are relevant to that topic”.

Cognitive dissonance: The mental discomfort you feel when you realise you may be wrong. This is not so much a problem as a warning sign that people tend to ignore. When you read about how animals are mass-produced for food, for example, it may sit uneasily with your enjoyment of a chicken curry.

Like all pain, cognitive dissonance is something humans will strive to avoid (eg by lying to themselves about how meat is produced). An insidious ploy is to downplay or disregard knowledge that would mean admitting a mistake.

If you bought a Huawei phone, for example, you may find yourself dismissing suggestions that China is using such technology to spy on us. To think otherwise would be to accept that the €700 you’d just forked out for the latest P30 Pro was a bad investment.

Confirmation bias: A tendency to interpret information in a way that reinforces one's perceptions. There is no cure but, with what Kahneman calls "a considerable degree of effort", such unconscious bias can be managed and the worst excesses tamed.

Epistemic insouciance: Defined by Cassam as "a particular form of not giving a shit. It means not caring about the facts, about what the evidence shows, or what experts think".

Hardcore relativism: The idea that all knowledge claims are equally arbitrary. Klintman distinguishes it from softcore relativism or what he calls "sound scepticism", the scientific demand for claims to be tested and information doubted in the absence of proof.

Inadvertent ignorance: Defined as ignorance which persists even when people act rationally. Think of how, during a home renovation, you won't know exactly what's involved until you pull up the floorboards.

The term is used by economists to remind us that there’s a type of ignorance we can’t avoid – similar to Donald Rumsfeld’s “unknown unknowns” – but that doesn’t necessarily justify massive cost overruns on the national children’s hospital.

Information overload: An inability to make sense of the torrents of information coming your way. Always a problem, but today deliberately engendered in populations by Russian and Chinese troll armies and other online purveyors of distrust.

Intelligent resistance: Studies show high intelligence makes people more effective knowledge resisters. This "intelligence paradox" is linked to the argumentative theory: a higher IQ means a greater ability to convince others you're right.

The phenomenon may help to explain why some of the most vocal opponents of reforming the junior and senior cycle in Irish education come from highly educated quarters. Confirmation bias plays its part too: after all, if the Leaving Cert “did you no harm”, why would you wish to change it?

Introspection illusion: Overestimating your ability to detect your own unconscious biases while readily spotting them in others, producing what psychologist Emily Pronin calls the "bias blind spot".

Loyal thinking: Resistance to outside influence helps to form and strengthen groups. This survival tactic evolved in humans over millions of years so we are unlikely to shake it off soon. "The whole meaning of loyalty is that you stick with a person or group even if they are incorrect or… full of other flaws," says Klintman.

Narrative bias: Our storytelling culture makes it easier for us to accept knowledge packaged in terms of good and bad, or black and white. What psychologists call "reframing" – creating a new way of looking at the situation – can help.

Negativity instinct: Our brains are more responsive to bad news than good. This phenomenon, much publicised by academic authors Hans Rosling and Steven Pinker, probably helped to keep our ancestors alive but it gives us a distorted picture of reality.

Power play: Humans desire power, and gaining more of it often means ignoring inconvenient truths. "Should knowledge be produced that power is not interested in then power will fight it," says Danish economic geographer Bent Flyvbjerg.

Promiscuous teleology: The tendency to think everything exists for a purpose. Something again linked to the evolution of the human mind, although the likes of Aristotle and the Catholic Church may have played a role in hammering the message home.

Question substitution: Unconsciously replacing a difficult or complex question in your mind with one that's easier to answer. How many of those voting on whether the United Kingdom should exit the European Union so it could develop new economic and political ties with its nearest neighbours believed they were simply being asked if Britain should "take back control"?

Running away: Klintman defines this as "to resist producing or gaining knowledge due to worries that the knowledge would have negative consequences in society".

The Swedish professor has some sympathy for the impulse. The world might be a safer place, for example, if scientists had never split the atom. But ultimately he comes down on the side of interrogating knowledge rather than running away from it. And that goes, incidentally, for research on the role of genetics and biology in gender difference, something that has occasionally sent his students into flight mode, he says.

Self-inflation: Like birds that puff up their feathers to impress, humans think much better of themselves than is warranted. People routinely place themselves at the top of positive distributions, for example claiming to be more moral than they prove to be under observation.

Sincere deception: A type of self-deception where you're unaware of the extent to which you've fooled yourself. The Economist ran a front-page image of George W Bush and Tony Blair a year after the 2003 invasion of Iraq, labelling them "sincere deceivers", a view supported by public inquiries in both the US and the UK (both leaders were said to have acted on false information but not to have lied).

It is possible Donald Trump is engaging in sincere deception when he says he is unaware of Russia’s role in helping him to be elected. Possible but not very likely.

Strategic ignorance: Making a mental calculation that it's better not to know something than to know it. People wary of receiving bad news from a doctor are vulnerable to it.

Witness also Boris Johnson’s admission during his prime ministerial campaign that he did not know what was in paragraph 5c of the GATT trade agreement, having just declared he would deliver Brexit along the lines of paragraph 5b. Knowing more about a subject invariably means having to move away from absolute certainty.

Structural amnesia: A term coined by British anthropologist Mary Douglas after observing the drive within communities to forget rival views. According to Douglas, "Certain things always need to be forgotten for any cognitive system to work."

Virtuous stupidity: Not to be confused with common stupidity, this is the quality of being brave enough to admit a question may be more complicated than you previously thought. Socrates is the oft-quoted exemplar with his admission: "All I know is that I know nothing."

WEIRD-ness: The unconscious assumption that the world is universally WEIRD (Western, educated, industrialised, rich and democratic). As Haidt points out, the many American and European academics who fall into this trap need to readjust their sense of perspective.

Zizekian surrender: If the cynical tone of today's political debates doesn't drive you into hardcore relativism, then there is always the path of Slavoj Zizek, a thinker once nicknamed "the Borat of philosophy".

Zizek supported Donald Trump ahead of Hillary Clinton in the US presidential election, and argued against Brexit in a grudging, Jeremy Corbyn-like fashion. Detecting a clear Zizekian ideology, though, is near impossible (Smith likens the Slovenian's life's work to "an unusually compendious joke book"). Yet in an uncertain world, it is precisely this kind of self-regarding buffoonery that's so tempting – treating everything as a game and downplaying the real-world consequences of one's standpoints.

You mightn’t have read Zizek but you’ll know of his conservative incarnation, BoJo the clown.