June 5, 2016
The vast majority of data analysis in safety and quality assessment in the healthcare industry is retrospective. It includes sentinel event analysis, incident reporting data, root cause analysis, peer review, trigger tools, and even complaints from patients. All of these events are analyzed after the fact and are therefore subject to hindsight bias, outcome bias, and confirmation bias, all of which have been studied extensively. It is critical for those doing quality and safety work to recognize this inherent tendency of our thinking: a predilection to be overly critical and unrealistic about the foreseeability of untoward outcomes.
The ability to learn from previous errors and missteps is crucial. Equally important is the need to assess events fairly in retrospect, so as not to blame people for outcomes that could not reasonably have been foreseen. This tension arises not only in healthcare safety assessment but in all industries and in everyday life. The familiar phrases "Monday morning quarterback" and "hindsight is 20/20" capture the bias very clearly. Favorite sports teams are all too often unjustly chastised under the influence of outcome bias.
Research has documented the phenomenon convincingly. In one experiment, individuals were asked to identify the subjects of pictures of famous people (Jesus, Muhammad Ali, etc.), which were first presented with only a few pixels and then shown with progressively more pixels until the full picture was revealed. The percentage of pixels needed to identify each subject was recorded. The subjects were later shown the same series of progressively filled-in pictures and asked at which point they had been able to identify each subject. Having already seen the pictures, they consistently claimed to have recognized the subjects earlier than they actually had. Finally, the subjects were told that there is a phenomenon called retrospective bias, which causes people to report identifying pictures earlier than they actually did, and were asked to account for this tendency in the remaining questions. Even so, they still consistently reported recognition points earlier than their true ones. Similar research has confirmed the effect of hindsight and outcome bias in the fields of accounting, auditing, and medical diagnosis. The logical conclusion is that when you know the outcome, and very likely even when you merely anticipate a negative outcome, your judgment will be significantly biased by that knowledge.
All who work in safety, quality, and risk management, including the C-suite, peer review committees, root cause analysis groups, boards, and managers, need to be aware of these cognitive biases. Particular attention should be paid to the research showing that even when we are cognizant of the bias, we remain overly critical and overconfident in our assessments. When possible, we should blind reviewers to the outcome, or at least include scenarios without a negative outcome, to minimize bias. At a minimum, those involved in retrospective analysis should be presented with a scenario that had a desirable outcome and asked to analyze possible problems and opportunities for improvement, to reveal our inherent negativity and criticism in our roles as retrospective reviewers. Understanding and awareness of cognitive biases, including hindsight bias, confirmation bias, and the many other heuristics that distort objective retrospective analysis, are important for everyone in the field of quality and safety. Blame, especially through the retrospectoscope, should be meted out very judiciously.
Copyright © 2016 Nicolas Argy, MD, JD
Nicolas Argy, MD, JD