When conducting a root cause analysis, whether for an adverse patient safety event or a near miss (a good catch), the patient safety committee often faces the daunting task of finding solutions to prevent the recurrence of the incident. Long-term solutions often necessitate system changes. Reprimanding or punishing individuals rarely prevents the error from recurring; swatting the flies does not solve the fly problem. However, when willful neglect or deliberate non-compliance is found, disciplinary action is warranted (Just Culture).
When discussing system solutions, it is very easy to get fixated on the problem at hand and develop tunnel vision. Focusing narrowly on the problem often leads to weak solutions and system fixes that fail. It is important to take a few steps back and look at the whole picture. The incident under study should be viewed as part of a larger process; very often the solution lies in fixing another step before or after the one that went wrong.
Let me illustrate this with two beautiful examples. First, consider how we might find solutions if we investigated the 1912 sinking of the Titanic and the loss of some 1,500 lives. At first glance, the only solution appears to be preventing the ship from hitting the iceberg. If we think outside the box, we find other, atypical solutions. Had the steel used in the hull been thicker or tougher, the ship might have withstood the collision and not sunk. Had there been enough lifeboats, everyone could have escaped and survived even after the ship went down (FMEA at work!). Solving a problem may therefore involve mitigating the damage even if the incident were to recur. Second, consider the Hubble Space Telescope, deployed in 1990 and seemingly doomed from the start because its primary mirror had been ground to the wrong shape, so the images it produced were blurry. What was the solution? Replace the mirror? An impossible task for a telescope already in orbit around the Earth. Instead, scientists found an out-of-the-box solution: corrective optics were installed in front of the instruments so that the mirror's error was cancelled out. The solution lay in the instruments, not in the flawed mirror itself. These are important lessons for how we approach our own healthcare problems.
Pharmacy dispensing errors may require a look at how medications are arranged in the pharmacy, how legible the fonts on medication labels are, whether look-alike/sound-alike (LASA) precautions are being followed, and whether a double check is being performed. Nursing errors may call for staffing, workload, and human factors to be examined. Very often a flawed policy or procedure is the culprit. When we find many people repeatedly taking shortcuts and bypassing a policy, it may be time to re-examine that policy and listen to the healthcare workers about what is wrong with it. Even a very well-written policy is worthless unless people follow it. Non-compliance is a serious problem when dealing with patient safety events; often it is due to either ignorance or resistance to change, and both require remedial action. People tend to forget, and this problem is further magnified by staff turnover. The Ebbinghaus forgetting curve is very much in action in hospitals, so re-education and re-orientation to hospital and departmental policies should be an ongoing effort. Random safety audits are excellent for checking compliance while simultaneously educating staff about hospital practices.
Sponge counting for normal vaginal deliveries in the labor ward could slowly be forgotten and abandoned when the head nurse changes and new staff nurses replace the ‘senior’ nurses. A laboratory result error may actually be a problem in the software that feeds results into the hospital information system; it could be a units mismatch, or it could have been a pre-analytical error, perhaps a wrong patient label attached to the wrong blood tube by a second nurse trying to help a busy colleague. A wrong blood group could have been due to a machine-human interface problem, perhaps a parallax error while reading the table of results on the Tango machine. A wrong drug dose may be due to a tired staff nurse who, at the end of her shift, was asked to compute a medication infusion dose she had never handled before, which was not double-checked by a colleague and not verified by the pharmacist (the Swiss Cheese model in action!).
The first step in finding solutions during an RCA is not to rush to conclusions. There is never a ‘guilty’ person. A thorough examination of the ‘sequence of events’, along with a detailed analysis of direct and indirect factors, will often lead to multiple potential solutions. Most reported patient safety events can be prevented by well-thought-out system ‘fixes’. However, finding the optimal fixes requires the whole story to be studied thoroughly. Expect to find the ‘optimal fixes’ in the most unexpected of places. Let us work together to ensure that “No one should be harmed in healthcare”.