Alliteration aside, the concept of “Safety” has been an overriding theme of the summer of 2016. It seems as though most newsworthy topics over the past several months have sparked conversations about how we can better preserve and promote personal and public safety. The industry in which I have spent the better part of two decades working has been no exception to this trend. In May of this year, the medical journal BMJ published a Johns Hopkins study indicating that 250,000 deaths per year in America are due to medical error, making medical error the third leading cause of mortality in the US (Makary & Daniel, 2016). Although a sobering figure, evaluating this type of data and its variables is critical to better understanding potential safety gaps and designing systems that better mitigate the poor outcomes they cause.
Over the years I have worked with many organizations in the US and abroad, helping them develop and implement performance improvement infrastructures to improve their safety-related process outcomes. In doing so, I have often encountered two corrective action approaches from organizational leaders who want to improve their safety-related quality data; I call the first approach “Dystopian” and the second “Denial.” The Dystopian safety improvement method is frequently exemplified by a draconian lockdown on absolutely anything that could introduce risk, however remote the possibility. This can create service experiences for both staff and clientele that feel more Orwellian and punitive than secure and inviting. Ironically, this approach can also lead to employee burnout, which in turn diminishes the effectiveness of the safety measures themselves (Leiter et al., 1997). The Denial approach, in the context I am defining it, does not necessarily ignore safety data, but attempts to address it by layering other positive components into service delivery mechanisms. Although these amendments can be tangential to improving a product or service, they are typically unrelated to actual gap or problem resolution. They instead serve as convenient distractions; a sort of “Apart from that, Mrs. Lincoln, how did you enjoy the play?” approach to performance improvement. Fortunately, this is not the only leadership response I have witnessed when safety improvement outcomes become salient in an organization.
Proactive leaders, like any good student of risk probability, understand that opportunities for safety-related performance improvement are an inevitable and continuous part of managing complex, dynamic systems. Organizations in industries such as healthcare are particularly susceptible to this because of the services they provide and the multivariate, mutable environments in which they operate. One methodology proving especially useful in parsing both safety-related causes and outcomes and offering viable solutions for error mitigation is Cognitive Systems Engineering.
Cognitive Systems Engineering, which came into being in the late 1960s, was born out of the idea that increasing automation alone is insufficient to ensure reliable safety in any complex process (Flach, 2015). It holds that physical (environment/equipment) and logical (software/IT) systems must incorporate human-centered control design that facilitates performance goal-oriented behavior and reduces performance risk-oriented behavior in the human beings interacting with them. Furthermore, its analytic tools introduce easy-to-understand, visual, bi-directional traceability among the “Why,” “What,” and “How” of safety causes and effects (Lee et al., 2010). In short, it offers an approach to creating robust, evidence-based architectures that can reliably support safe outcomes within systems that are both flexible to changing circumstances and not soul-crushing to the people using them to deliver or receive services.
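To make the idea of bi-directional traceability a little more concrete, here is a minimal sketch in Python. It is my own illustration, not a tool or method from the cited papers, and the example goal, function, and control names are hypothetical. It simply links “Why” (a safety goal), “What” (a function serving that goal), and “How” (a concrete control), and lets you trace in either direction.

```python
# Illustrative sketch only: a tiny bi-directional traceability map
# linking "Why" (safety goals), "What" (functions that serve them),
# and "How" (concrete controls that implement them).

class TraceabilityMap:
    def __init__(self):
        self._down = {}   # parent -> set of children (Why -> What -> How)
        self._up = {}     # child -> set of parents

    def link(self, parent, child):
        """Record that `child` serves `parent`, indexed both directions."""
        self._down.setdefault(parent, set()).add(child)
        self._up.setdefault(child, set()).add(parent)

    def means(self, node):
        """Trace downward: what serves this goal or function?"""
        return sorted(self._down.get(node, set()))

    def ends(self, node):
        """Trace upward: why does this control or function exist?"""
        return sorted(self._up.get(node, set()))

# Hypothetical example entries:
tmap = TraceabilityMap()
tmap.link("Why: prevent medication errors", "What: verify patient identity")
tmap.link("What: verify patient identity", "How: barcode scan at bedside")

print(tmap.means("Why: prevent medication errors"))  # -> ['What: verify patient identity']
print(tmap.ends("How: barcode scan at bedside"))     # -> ['What: verify patient identity']
```

Because every link is indexed in both directions, a safety control can always be traced back to the goal that justifies it, and a goal can always be audited for the controls that actually serve it, which is the core of the Why/What/How traceability idea described above.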
I will be providing some examples of these tools and their usage in future posts along with evidence of how they can be absolute game changers in increasing safety and performance reliability in sustainable ways.
- Flach, J. (2015). Supporting productive thinking: The semiotic context for Cognitive Systems Engineering (CSE). Applied Ergonomics.
- Lee, Katta, Jee, & Raspotnig. (2010). Means-ends and whole-part traceability analysis of safety requirements. Journal of Systems and Software, 83(9), 1612-1621.
- Leiter, M., Robichaud, L., & Quick, J. C. (1997). Relationships of occupational hazards with burnout: An assessment of measures and models. Journal of Occupational Health Psychology, 2(1), 35-44.
- Makary, M. A., & Daniel, M. (2016). Medical error - the third leading cause of death in the US. BMJ, 353, i2139.
Lisa Sundahl Platt is the President and Founder of UMNSystems LLC. She writes about the systems and science of organizational and cultural transformation and how it impacts the human experience.