How to make better decisions in complex situations

Blind spots, biases and a fear of uncertainty hinder the updating of our mental models


Brexit confronts businesses of all kinds with significant challenges and opportunities. It has no precedents, future developments are difficult to predict, and no one knows when and how it will play out. In politics, business and life we often face small and large events with consequences no one can foresee and no one fully understands.

When navigating in such dynamic and complex environments, a common piece of advice is to “expect the unexpected”. Sounds good, but it is unhelpful and logically impossible. As soon as we expect something, it is not unexpected anymore.

Instead, I suggest that individuals, teams and organisations wanting to improve their decision-making in complex and changing situations "unexpect the expected". But that brings its very own difficulties.

In general we make decisions based on mental models that reflect our understanding of the world we live in, and the particular circumstances we face. Such mental models – which can take many forms, such as implicit schemas, abstract representations or compelling narratives – are usually pretty accurate in stable situations with which we have ample experience.


But when we try to make sense of new and changing realities we over-rely on our existing mental models. Crucially, we often fit emerging new facts into existing models and narratives that are tweaked until a plausible fit has been achieved. But those minor changes usually don’t capture the essence of the new reality.

We crave certainty

Successfully "unexpecting the expected" requires that we rid our thinking of many of the certainties that normally enable and enhance it. That is easier said than done, because we all crave certainty. In fact, we never want it more than when we are making important and consequential decisions in uncertain and changing conditions.

That’s why we fall back on old models that seem to promise understanding and certainty.

That’s dangerous because they do not reflect the new realities. Especially under stress, we rely more on entrenched views and information already available and easily accessible to us (eg facts we already have vetted and accepted), are less open to new information and input, and become even less able to detect and confront our own biases.

And most of this happens automatically, without us being aware of it. Unexpecting the expected is not easy.

So what can we do?

First, we can actively adopt a disconfirming perspective. Sceptics do that as a matter of course. They question all facts and assertions, and look for overwhelming evidence before they accept them.

What sometimes comes across as being negative – “I refuse to believe this” – is actually a reflection of positive and constructive engagement. The more we try and fail to disprove a view, a fact, a conclusion, the stronger the support for it becomes. This is how science should work – a basic truth too often ignored.

Multiple perspectives

A second strategy is to use different, ideally multiple, perspectives. In teams and organisations, diversity of age, gender, culture, education and so forth is a real strength because the inherent differences in experience and perspective mean that people bring different mental models to the task.

However, diversity in itself is not enough. Different, and especially minority, views and voices must be amplified to be heard. If not enough different views are available, they can be created, for example by appointing a devil's advocate whose explicit role is to challenge any emerging consensus.

Such approaches can be powerful but work only if there is no implicit or explicit favoured view among powerful or high-status members.

Another way to challenge our mental models to improve decision-making is more playful. This involves counterfactual and counter-causal approaches to challenge established ways of thinking.

Counterfactual challenges represent a serious attempt to ask and comprehensively answer “what if?” questions that do not rely on current understandings, available facts, and the conclusions already accepted. An example would be to pose the question: “What if the UK leaves the EU tomorrow?”

The more outlandish such questions initially appear, the more robustly they will challenge our established individual and shared mental models.

A similar approach involves "flipping" the direction of assumed causal relationships. If we all accept that overeating causes obesity, what if we "flip" this cause-effect link and consider how obesity may cause overeating? The resulting explanations may not be correct, but creating and considering them will always deepen our understanding and help to surface hidden assumptions that root us in obsolete mental models.

Strong reactions

Both within ourselves and in groups, and especially in the workplace, it is useful to be attuned to special sensitivities or strong reactions to counterfactual and counter-causal challenges. These offer useful pointers to assumptions that, while difficult to challenge, may benefit from additional scrutiny.

Such reactions do not emerge when we trust the facts. They arise when we implicitly base our arguments on objectively unsupported or ideologically motivated assumptions.

These strategies can add value, but none of them provides a comprehensive solution to the difficulty of making decisions in a complex and changing world.

Our mental models need updating, and our blind spots, biases, implicit preferences and aversion to uncertainty stand in the way.

The more actively we embrace these change- and learning-oriented approaches the more we enable our own adaptation to the complex and constantly transforming world we operate in.

Dr Martin Fellenz is associate professor in organisational behaviour at Trinity Business School, Trinity College Dublin