https://boardagenda.com/2022/01/25/how-risk-blindness-threatens-the-effectiveness-of-board-decisions/

Published in Board Agenda 25 Jan 2022

by Garry Honey

A board’s highly sophisticated procedure for making risk decisions can often be undermined by a failure to identify risk in the first place.

While behavioural economics aims to understand why we often make irrational decisions, risk management aims to mitigate unfavourable or undesirable future outcomes through better risk decisions.

The success of risk management rests on the accuracy of prediction. Philip Tetlock and Dan Gardner examined the art and science of prediction in their seminal work Superforecasting, which followed the Good Judgment Project in the US to determine how the quality of judgment can best be improved through data and analysis. Delivering “good judgment” is what behavioural economics and risk management have in common.

My own work on risk decisions with boards has found that a highly sophisticated procedure for risk management is often undermined by a failure to identify or recognise risk in the first place. In some cases risk is seen but not recognised; in others it is recognised but ignored or deemed acceptable; in others still it is seen but misinterpreted or recalibrated by a higher authority with different priorities.

There are many types of risk blindness, as indeed there are many types of sight impairment. Here are some possible diagnoses.

Risk myopia

In the case of risk myopia, the threat is not seen because it isn’t on the risk register; it lies beyond comprehension or consideration as a possible future outcome.

This short-sightedness prevents us from seeing the bigger picture because we are focused on risk types we know or have previously experienced. In heuristic terms this is familiarity or confirmation bias.

Within this category I also include the long-sightedness of hyperopia, where some risks go unseen precisely because they are so immediate or proximate: right under the nose, hidden in plain sight. These are what Michele Wucker would call “gray rhinos”, from her book The Gray Rhino: risks we don’t see because they are so obvious we look right through them.

Treatment for risk myopia and hyperopia requires an open mind and creative thinking, which are not always part of a risk manager’s armoury. Imagination and envisioning a range of future outcomes would help, and tools such as scenario planning and futures modelling are available.

Risk denial

In this case the risk is not seen because those defining risk within the organisation refuse to acknowledge it. This may be for cultural or ideological reasons, but either way blinkers are in place that restrict a 360-degree view of possible risks. Those familiar with Margaret Heffernan’s book Wilful Blindness will recognise this type of behaviour.

A cultural example can be found in the financial services market in the years leading up to the crash of 2008. The financial risk of selling loans to people who clearly could not afford them was not seen because the debt was repackaged and sold as an asset class to third parties. The sales revenue from the loans, together with that from the collateralised debt obligations (CDOs), meant that the risk of loan default was obscured.

An ideological example can be found in the UK government’s pursuit of the policy of leaving the EU for populist rather than economic reasons. The financial risk of leaving an established trading bloc was ignored, while there was a political imperative to secure a mandate to govern for the next five years, albeit on the nebulous claim of taking back control and regaining sovereignty.

Treatment for risk denial requires a level of honesty and objectivity often lacking in those with an agenda or a point to prove. Risk is an estimate of a future outcome, and it is rare for two people to share the same vision of it. Some are optimists who see the glass as half full; others are pessimists who see it as half empty.

Risk inertia

Risk inertia occurs when systems are so complex that there is a complacent confidence that the risk must be manageable: “We have it on our radar, so that’s OK, because the systems are designed to cope.”

This hubris leads to a failure even to stress test. Examples of this type of failure to see risk can be found in industries awash with safety systems, where a risk is often obscured by too much information and those monitoring the dials or gauges cannot react to a warning signal.

Boeing developed the 737 Max as a fuel-efficient plane and sold it to many airlines before two fatal crashes halted all flights from March 2019. The flight control technology had the facility to override pilot commands, yet this vital function was poorly explained, and not all pilots knew how to switch it off or work with it. It cost the lives of 189 people on Lion Air flight 610 in October 2018 and 157 people on Ethiopian Airlines flight 302 in March 2019.

Other well documented examples of inertia in risk decisions include the Challenger space shuttle disaster in 1986 and the Texas City oil refinery explosion in 2005, both also detailed in Wilful Blindness. The engineering industry offers further safety lessons from nuclear power plants such as Chernobyl and Fukushima.

Treatment for risk inertia requires clarity of communication: on a dashboard or control panel, where are the dials that are most important and which need to be taken most seriously? Are these located in the line of sight and calibrated so that danger is immediately obvious? It is the same with financial reporting and recognising danger signs in accounts, such as imminent insolvency.

Risk compromise

Risk compromise sees a threat downgraded or underestimated due to conflicting priorities within the organisation.

This often happens when the cost of prevention is calculated to be unacceptably high, so the risk probability is re-appraised to justify downgrading its severity. This is common in government infrastructure projects like Crossrail or HS2, where costs overrun once work has begun. In terms of cognitive bias this can lead to escalation of commitment or optimism bias.

In 2016 the UK Department of Health ran a stress test of the NHS’s ability to cope with a surge in demand for services caused by a novel virus. Exercise Cygnus found that the NHS would be overwhelmed and needed substantial resource upgrades to meet public demand. In 2020, when Covid-19 struck, it emerged that the risk had subsequently been downgraded and investment reduced by the Treasury.

This month the Department for Transport finally decided to halt the roll-out of smart motorways for safety reasons. Since their introduction in 2014 a total of 38 people had been killed in their stationary vehicles on smart motorways, yet the data used to support the roads policy was selective: the safety risk was set aside because a cost-benefit case favoured removing the hard shoulder to improve traffic flow.

Treatment for risk compromise is to agree the necessary cost of a project and ring-fence it so that no subsequent financial or political interference can reduce the project’s effectiveness. Easier said than done, but compromising public health and safety will ultimately have a political cost.

Risk misunderstanding

Some risks are unseen due to cognitive bias caused by the dynamics of the group and the environment in which risk decisions are made.

Time pressure can lead to patterning or availability bias; hierarchies and power politics can lead to dissonance reduction and groupthink. How many cognitive biases exist? One leading investment website lists 188 types.

In their book Radical Uncertainty, Mervyn King and John Kay draw attention to the importance of understanding the type of uncertainty you are trying to address in the boardroom. Is it a puzzle or a mystery? The former can be solved with information and effort, but the latter will always remain insoluble. Much time can be wasted in risk assessment trying to solve a mystery.

Cognitive bias stems from the brain’s limited processing capacity and its preference for shortcuts when selecting information. Overlaid on these are emotional and moral motivations, social influence and our ability to store and retrieve memories. In short, these lead to the irrational decisions that boards call prejudices, preferences and politics. All boards have cognitive bias in risk perception.

Treatment for risk misunderstanding in the boardroom involves working with no more than 12 common biases, grouped according to how executives might meet them: for example, three personal ones such as loss aversion and familiarity; three group ones such as obedience to authority and dissonance reduction; three that surface in infrequent meetings, such as hindsight and patterning; and three that surface as projects progress, such as confirmation and over-optimism. Only by recognising and accepting bias can we ever hope to neutralise it.

In conclusion, I would urge anyone unfamiliar with them to read not only Daniel Kahneman’s Thinking, Fast and Slow, but also Dan Gardner’s Risk: The Science and Politics of Fear. The latter sets out the psychology of fear and how it underpins much of how we frame risk as a concept and as a toxic term.

Garry Honey is a writer on risk and uncertainty and is the founder of risk consultancy Chiron Risk.