Cognitive Dissonance

One of the benefits of a lazy summer vacation is the opportunity to finally read the books you were gifted the Christmas before. One of my more insightful reads, whilst on my sun lounger, was “Black Box Thinking” by the accomplished author and journalist Matthew Syed. The book introduced me to the concept of Cognitive Dissonance: essentially, our brain's reaction to the discomfort we feel when our beliefs and chosen actions are challenged by evidence. Acknowledging we were wrong affects our self-esteem, and the magnitude of the impact is commensurate with the seriousness of the issue. To overcome this negative feeling our brain allows us to reframe the evidence. We can filter it, spin it, or simply ignore it, all in a desperate attempt to absolve ourselves of a poor decision; in other words, we are in denial. Research has shown that cognitive dissonance is a deeply ingrained human trait. The ultimate issue with this embedded defence mechanism is our failure to acknowledge our mistakes and, more importantly, to learn from them.

Now, on minor issues a poor decision will have little impact, but when people’s lives and livelihoods are at stake the need to reframe and spin the evidence becomes much greater. Syed’s book is filled with a wealth of research and a range of studies exploring major flaws in government and industry, such as the health service and the judicial system, where failures result in loss of life or wrongful imprisonment. For the surgeons, doctors, judges and others at the centre of these failures it would be extremely damaging to acknowledge that they had made mistakes, so to protect their positions and self-esteem they create narratives that make allowances for the unfortunate outcomes of their poor decision making.

The book provides a glimpse into the inextricable connection between failure and success by contrasting two of the most safety-critical industries in the world today, healthcare and aviation, and their profoundly different approaches to failure. In the airline industry every aircraft is fitted with two indestructible black boxes so that the root cause of every incident can be ascertained and lessons learnt, to ensure the same error never happens again. As a result of this approach the airline industry has achieved an impressive record: in 2013 there were 36.4 million commercial flights worldwide carrying more than 3 billion passengers, with only a handful of accidents resulting in 210 deaths. In healthcare the statistics are very different: a 2005 report on the NHS by the National Audit Office estimated that in the UK up to 34,000 people are killed per year due to human error. These statistics are alarming and will not improve unless a different approach is taken to uncovering the root cause of these failures.

If this snippet has triggered your imagination, I urge you to read the book.

But why, I hear you ask, have I used my blog for this subject? The answer is straightforward. I see many parallels with the way our industry delivers projects and the tendency to reframe the evidence when projects run late or over budget; as a result we lose the opportunity to learn from our mistakes. Project Managers make a raft of key decisions, all of which contribute to the success or failure of a project. By learning the lessons of previous projects, our Project Managers will be armed with the knowledge and insight to approach and deliver new projects with the benefit of hindsight. The more we learn from the past, the less our Project Managers will rely on cognitive dissonance to reframe their failings. We must also examine our response to failure and create an environment where people do not feel that acknowledging their failures will damage their careers. The most progressive industries and companies are those that accept and respond to failure in a measured and informed manner in order to increase the future likelihood of success.

Date: 31/08/2017
Category: Steve's Blog