The black boxes in this article are the ones installed in planes. These black boxes make it possible to analyze what happened during an incident, whether minor or dramatic. Building an organization where facts are analyzed to improve the system is a defining characteristic of the airline industry, and it is what has enabled a remarkable improvement in safety.
Matthew Syed opens his book, Black Box Thinking: Why Most People Never Learn from Their Mistakes – But Some Do, by highlighting this fundamental difference. He contrasts the airline industry with the health-care system, where the culture is to blame the people who made mistakes. That culture leads to the concealment of mistakes and therefore to no improvement of the system, contributing to 400,000 deaths per year in the US (the third leading cause of death, ahead of traffic accidents).
Syed explains the anthropological and psychological root causes that lead us to avoid acknowledging our mistakes, and consequently leave us unable to improve.
One key aspect is the mindset with which we view the development of our skills. As Carol Dweck puts it, there are two types of mindset: the fixed mindset, where we believe our skills depend on our genes, and the growth mindset, where we believe our skills depend on practice.
The book presents transparency approaches used by people and organizations that make experimentation possible while avoiding our natural biases.
One approach is marginal gains: making many small changes, each correcting a small weakness, that together eventually add up to high performance.
Another is to avoid closed-loop thinking, which leads us to deny the reality of our mistakes and thus to repeat them over and over again.
As stated before, avoiding the blame culture is necessary to create the conditions that allow people to reveal their mistakes.
Then we can try, and try again, as many times as it takes to finally succeed.
A must-read!