Risk management as a discipline is well established, yet boards frequently fail to see or act on risks that ultimately lead to crisis. Is this a failure of process, people or perception? There is growing evidence that it is perception: managing risk is like waging war on terror or catching clouds, a Sisyphean task doomed by unrealistic ambition.
Risk is a personal opinion of a future outcome, and perception varies widely between individuals across the optimist-pessimist spectrum. Predicting the future is a fool’s game, as there are only two possible outcomes: lucky or wrong. Boards don’t like to predict the future because they know the odds of getting it right are stacked against them; foresight as a skill is neither practised nor encouraged.
Risk is missed because the corporate future belongs to Strategy, with its inherent optimism bias and its convenient assumptions of a profitable and rewarding future. There are many reasons risks go unseen, ranging from hubristic arrogance to downright incompetence. A study of risk-related crises over recent years suggests that people and systems can only act on what they see.
My forthcoming book, ‘Unseen Risk: Why Boards Fail to See the Obvious’, rests on the basic premise that clear risk perception is hampered by four main causal factors:
- Limited knowledge – we don’t see risk because we just don’t know about it.
- Risk blindness – we don’t see risk because we won’t see it (wilful or accidental).
- Cognitive bias – we see risk but underestimate or misunderstand it.
- Psychological reactance – we see risk but fail to act on it through misplaced confidence.
The limited knowledge section pays homage to Donald Rumsfeld’s infamous quote about ‘unknown unknowns’, but goes further to underline how much of risk assessment rests on extrapolating knowledge into forecasts and estimates. Boards, of course, like to promise certainty, as uncertainty is considered a weakness. There is a whiff of alchemy in turning uncertainty into certainty!
Risk blindness builds on Margaret Heffernan’s work in her book ‘Wilful Blindness’, but draws a distinction between wilful and accidental blindness. The former derives from a refusal to recognise a risk for ideological reasons, while the latter is caused by information overload – too many signals, or ‘cockpit confusion’. We examine different types of blindness, from ‘won’t see’ through to ‘can’t see’.
Cognitive bias is a problem many boards recognise, and some even consciously try to neutralise it; they know that avoiding dissonance and rewarding consensus hinder good judgement. The book highlights four heuristic categories relevant to risk blindness: bias at the personal, team, board and project levels. We examine three examples in each category, giving twelve types in all.
Psychological reactance is the failure to act on the evidence because we have convinced ourselves a risk is manageable: ‘We have it on our radar, so that’s all right; our systems are designed to cope’. This is risk inertia born of arrogance or hubris, yet it is all too common in a world awash with risk management systems that are convincing on paper but rarely stress-tested in situ.
The book identifies several types of unseen and unspoken risks, to show boards where risk perception is critical. Hindsight is a wonderful thing, but decision makers need greater vigilance and sharper risk perception. Remember, risk management exists to ensure we make good judgements; the key is a balanced view of the future based on reducing uncertainty.