
Unseen risk: Why boards often fail to see the obvious

Garry Honey of Chiron Risk explains why it is crucial for boards to understand why they miss risks, and how they can identify them more effectively

Risk is a priority topic in the boardroom. Investors and regulators demand clarity on risk appetite, while employees and customers demand clarity on control systems.

Our risk management industry is booming, and yet boards frequently fail to see, or act on, risks that ultimately lead to crisis. Is this a failure of process, people or perception? Why is the news regularly filled with stories like the construction firm Carillion or Greensill Capital, where risk was hidden in plain sight?

Could it be that risk identification, rather than risk management, is at fault? And if so, why do we fail to identify risk?

Recent work on skills development for board directors, carried out by a leading business school, suggests that the answer lies partly with boardroom culture and partly with behavioural science: a noxious blend of how we operate and how we think.

Boards and leadership teams like to make sound decisions based on informed judgement, employing evidence and assumptions. And they like to make them quickly.

Our brains help us make decisions quickly, but the way we think is subject to the ‘wiring’ in our brains, which can distort rational thought: we instinctively seek quick and simple answers. Yet boards rarely tolerate uncertainty. They like to make quick decisions, even if they are the wrong ones.

Wrong decisions are often made because of risks that are unseen and consequently ignored. This failure to see risks can be traced to one of three basic causes: limited knowledge, risk blindness, and cognitive bias. For unseen risks to be revealed, these causes need to be understood.

Limited knowledge

Risk perception is impaired by limited knowledge when we don’t have enough information and we don’t know where to look for it. We recognise that there are things we don’t know: the ‘known unknowns’. But, more dangerously, there are also things we don’t realise we don’t know: the ‘unknown unknowns’.

Risk that remains unseen due to limited knowledge reveals itself with the passage of time, as events unfold and show how far reality diverges from the original estimate. While nobody can foretell the future, there are lessons for boards regarding information gathering and estimate revisions.

Too often a board will stick with an inflexible policy because it was ratified at a board meeting, rather than admit an error of judgement and adapt the policy as better information emerges.

There are many case studies to show how limited knowledge leads to unseen risk. One I like to use is the UK government’s response to Covid-19 in Spring 2020. At the time there was a good deal that the government didn’t know.

  • They didn’t know how far the virus had penetrated the UK population and had to guess
  • They didn’t know how the virus was entering the country: the assumption was that it came from China, but it was actually arriving from Spain, France and Italy
  • They didn’t know how the virus was transmitted: early warnings were about surface contamination and glove wearing, when the reality was that the virus was airborne

Many of the public health protection decisions at the time were based on assumptions that were later proved wrong, but they were made due to the urgency of reaching a policy position. This kind of limited knowledge is unavoidable in a crisis.

However, it is important to adjust policy in accordance with new information. Doing this should be seen as a sign of strength and confidence. As the economist John Maynard Keynes said, “When the facts change, I change my mind. What do you do, sir?” Boards must therefore be flexible when they receive new information.

Risk blindness

Some risk remains unseen because we can’t see it through an overload of information or because we won’t see it because our belief system won’t permit us to recognise it. I call the former “accidental blindness” and the latter “wilful blindness” (with grateful acknowledgement to Margaret Heffernan for her book of the same name.)

Wilful blindness

Wilful blindness might sound unlikely, but it can be seen in politics and business. Take the financial mindset that led to the 2008 global crash: wilful blindness was shared by banks and mortgage lenders in both the US and UK markets in the years leading up to the crisis.

Domestic property loans, or mortgages, were seen as a lucrative market for lenders, who moved into high-risk sub-prime loans, chasing sales targets at the expense of conventional loan approval criteria. Selling mortgages became a highly attractive business because the risk of loan default or delinquency could be transferred through financial instruments such as Collateralised Debt Obligations (CDOs).

However, inter-bank lending eventually froze and the money-go-round stopped. The culture of lenders, driven by their wilful blindness to the risk of delinquent loans, caused even large, long-established and stable banks huge problems.

The demise of Washington Mutual is an example of this. The bank had operated for a century as a traditional savings and loan bank. But with the arrival of a new CEO from 2003 it focused on selling high risk loans at the expense of more prudent lending.  It collapsed in 2008.

Accidental blindness

These are risks that are unseen due to an overload of information. In the worst cases, safety-critical information is obscured by other data feeds.

Boeing developed the 737 Max as a fuel-efficient plane and sold it to many airlines before two fatal crashes halted all flights from March 2019. The flight control technology had the facility to over-ride pilot commands. However, this vital over-ride function was poorly explained in pilot training, and not all pilots knew how to switch it off or work with it. Boeing underestimated the risk of pilot error caused by inadequate briefing of the new handling characteristics.

This failure cost Boeing several billion dollars in cancelled sales and penalties from its regulator, the Federal Aviation Administration. It also cost the lives of 189 people on Lion Air flight 610 in October 2018 and 157 people on Ethiopian Airlines flight 302 in March 2019.

This is the human cost of risk blindness.

Cognitive bias

Cognitive biases are errors that people make because they interpret the world according to their own “subjective reality”. They are often the result of trying to simplify information processing or accelerate decision making. And they are made worse by the tendency of the brain to overlay emotional and moral motivations, social influence and memories onto the problem we are trying to solve rationally.

Cognitive bias leads to irrational decisions that are sometimes labelled as prejudices, preferences and politics. One of the most significant for board decisions is ‘Groupthink’ where perceptions of risk are framed by the collective mindset of the group to the exclusion of better scrutiny. This is often similar to wilful blindness.

Groupthink was first identified in analyses of the senior US military planners who, in 1961, tried to overthrow the communist regime in Cuba. The infamous ‘Bay of Pigs’ disaster was analysed retrospectively on the order of President Kennedy to prevent a repeat of such narrow thinking. Sixty years after the Bay of Pigs, it seems likely that groupthink was to blame for a failure to anticipate the rapid fall of the government in Afghanistan and the consequent failure to plan properly for civilian evacuation.

Seeing the unseen

So what can you do to make risk seen?

Each of the three causes can be tackled. Limited knowledge is always a challenge: imperfect or incomplete information is a major contributor to crises, yet boards often feel under pressure to make decisions based on the best information to hand. The best advice is to ask: ‘what is to be lost by deferring a decision while awaiting newer and better information?’

In the case of wilful blindness, you need to inspect the organisational culture, beliefs or ideology that determines the perception of risk. Is this sound, or is it distorted by a fixation on sales or profit at the expense of other values? For accidental blindness, it is a matter of creating warning systems that are fit for purpose and actually deliver an unmistakable warning despite other distractions.

In the case of cognitive bias, many biases often combine to distort the perception of risk, so neutralising them is a persistent challenge. It requires a determined effort to make only rational decisions and to resist every temptation to be irrational. Remove emotional, personal and political drivers among board members, and strive to be objective and balanced when identifying risk as a board.


Garry Honey is the founder of the risk consultancy Chiron (www.chiron-risk.com) and the lead tutor at Henley Business School on the Board Director Practice (BDP) masters programme module ‘what boards need to know about risk’.

