Poor reasoning and weak arguments appear practically everywhere. Discussion and analysis, both in print and online, are all too often riddled with dubious reasoning, and the more contentious the issue, the more extreme the flurry of dodgy claims it generates. Flawed logic is ubiquitous, whether it stems from genuine ignorance or odious dishonesty, and it can all too easily derail constructive discussion and damage our ability to make sound decisions.
Here are some of the reasoning flaws most common in modern discourse.
Cause-and-effect fallacies
Umbrellas are associated with rain but don't cause it: the logic seems pretty clear, but such nuance is thrown to the wind in discussions about science, health and politics. It can be surprisingly hard to make a robust causal connection; there are usually so many confounding variables that it takes a carefully controlled analysis to work out the underlying relationship, if any.
This doesn’t seem to stop people making erroneous links and disproven connections, whether it be the assertion that vaccinations cause autism, fluoride causes Alzheimer’s or homosexuality causes Aids.
These are also examples of the reduction fallacy, an often misguided attempt to ascribe single causes to outcomes that are in reality complex interplays of many factors, such as some newspapers’ ongoing quest to divide the entirety of creation into a neat “causes cancer/cures cancer” dichotomy.
Causation fallacies are so versatile that they can be employed even without any evidence of correlation, just an asserted order of events. This related fallacy is known as “post hoc ergo propter hoc” (“after this, therefore because of this”), and it is used to brilliant comic effect by the parodic Church of the Flying Spaghetti Monster, which asserts that the decline in the number of pirates worldwide since the 1800s has caused global warming.
It is possible to engage in the same fallacious reasoning in reverse, by denying the causative role of an agent that has been rigorously shown to cause a given effect. Sixty years ago the tobacco industry attempted to cast doubt on the emerging scientific consensus that smoking causes lung cancer.
Similarly, many people today deny the link between human activity and climate change, despite incontrovertible evidence.
Personal biases and motivated reasoning
Humans are social animals, and the information and opinions of those around us tend to colour our own. Yet anecdotal assertions are perhaps the most common hallmark of a dubious argument.
Being casual observations rather than rigorous controlled tests, they usually fail to consider confounding factors or alternative explanations. A person who insists that homeopathy works for them, for example, may be unaware of other, more likely explanations – the placebo effect, regression to the mean, observer bias – which describe the same thing without requiring an ad-hoc rewriting of the laws of physics. Anecdotes are persistent and difficult to discount precisely because they are personal stories, far more engaging than dry scientific data.
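Regression to the mean alone can mimic a cure. As a rough illustration – a minimal Python sketch with made-up numbers, not a clinical model – suppose symptom severity fluctuates randomly around a stable baseline and people reach for a remedy only on their worst days: the following day tends to look better with no treatment at all.

```python
import random

# Minimal sketch: symptom severity fluctuates randomly around a stable
# baseline; nobody receives any treatment at any point.
random.seed(42)
baseline = 5.0
days = [baseline + random.gauss(0, 2) for _ in range(10_000)]

# People typically try a remedy on their worst days. Pair each notably
# bad day with the day that follows it.
pairs = [
    (today, days[i + 1])
    for i, today in enumerate(days[:-1])
    if today > baseline + 2
]

avg_bad = sum(t for t, _ in pairs) / len(pairs)
avg_next = sum(n for _, n in pairs) / len(pairs)
print(f"average severity on bad days:   {avg_bad:.2f}")
print(f"average severity the day after: {avg_next:.2f}")  # back near baseline
```

Any remedy taken on a bad day inherits that apparent improvement for free, which is precisely why uncontrolled anecdotes mislead.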
They also tend to be noteworthy, and so distort our perceptions of the average case. This is why testimonials are so successful in advertising, despite their vapidity.
Unfortunately, this human element insulates them from detached analysis, as reasonable alternative explanations can be viewed as personal attacks on the veracity of the anecdote-sharer.
We also have an affinity for information that validates our preconceived notions; a common form of cognitive duplicity is confirmation bias, where morsels of information reinforcing one’s prior beliefs or world view are accepted as evidence and those contradicting it ignored.
False balance and false dilemmas
"There are two sides to every story" is a sayting that contains a modicum of wisdom. It is true of many subjective human interactions and situations, but problems arise when it is applied to entirely objective situations. Just because there are two competing approaches does not make them equally likely.
Often one is buttressed by swathes of theory and supporting experiments while the other is bereft of explanatory power or corroborating data. In such cases it’s fallacious reasoning to treat them as equally reasonable.
False balance is often a Trojan horse to lend a veneer of respectability to ideological positions unsupported or contradicted by the data.
One example is drives by American religious conservatives to have creationism taught alongside evolution in classrooms, despite the fact that only the latter is supported by the overwhelming weight of evidence. To treat them as equally likely is absurd.
False balance is all too commonly given legitimacy by an uncritical media and public. This can lead to a manufactured controversy, a quasi-Machiavellian system of creating an illusion of doubt over established fact, so those whose position is tenuous can demand with a straight face that the media report “both sides of the story”, which they almost invariably do.
The aim of false balance is to neutralise the weight of the evidence, acting as a vehicle for ideological or profoundly unscientific views. This is devious but effective, and has the net effect of corroding public understanding of science if uncritically reported.
At the other extreme, the “false dilemma” is an equally common logical fallacy in public discourse; it artificially polarises a debate to its extremes when a whole spectrum of options is actually available.
You’re-with-us-or-against-us sentiment peppers historical and political discourse from Jesus to Mussolini to George W Bush, and is transparently disingenuous when multiple options and shades of grey exist.
The solution
These are but a handful of common logical fallacies, which all too often shape narratives and lead to poor decisions and outcomes. Identifying such flaws is important, but the bigger question is how we might correct them.
Cause-and-effect fallacies can be difficult to dislodge because they offer a simple explanation for a multifaceted event. They could perhaps be countered by a better understanding of why simple correlations are rarely the full picture, even when some relationship exists: drowning deaths tend to increase alongside ice-cream sales, but it is good weather, rather than Ben & Jerry’s, that is the causative agent in these accidents.
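To see how easily a confounder manufactures such a correlation, here is a minimal Python sketch (with invented numbers, purely illustrative): hot weather drives both ice-cream sales and swimming, so sales and drownings track each other closely even though neither influences the other.

```python
import random

# Toy simulation of a confounder: temperature drives both ice-cream
# sales and drownings; there is no causal link between the two series.
random.seed(0)
temps = [random.uniform(5, 35) for _ in range(365)]        # daily temperature, C
ice_cream = [10 * t + random.gauss(0, 30) for t in temps]  # sales rise with heat
drownings = [0.2 * t + random.gauss(0, 1) for t in temps]  # more swimmers, more risk

def correlation(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Strong correlation, zero causation: both series merely follow the weather.
print(f"r(ice cream, drownings) = {correlation(ice_cream, drownings):.2f}")
```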
Combating motivated reasoning is exceptionally difficult, as it is often an evolved mechanism for “protecting” ourselves from the distress of cognitive dissonance. Perhaps rather than view conflicting concepts with apprehension, we should regard them with the exhilaration of discovery.
There is nothing wrong with being wrong, provided we are willing to adapt our views in the light of evidence and constantly revise them. Black-and-white thinking can be challenged by noticing that there are more than just binary positions on issues or people; a whole spectrum lies between the extremes.
The blade of correction should cut both ways; we should aim to spot reasoning flaws not only in the arguments of others but also in our own logic, even when this jars us.
Of course, we all cherry-pick data to “prove” our points, but perhaps we can make more of an effort to think like scientists and aim not to prove our own hypotheses but to challenge them, accepting them as tentatively useful only if they can withstand critical onslaught.
We should feel zero shame in revising our positions in the light of new evidence or understanding. This is how progress is made.
Dr David Robert Grimes is a science writer and a physicist at Oxford University. He blogs at davidrobertgrimes.com. On Twitter he is @drg1985