
Human factors in medicine: The inconsistent doctor

[Figure: operational risk management, burnout]

Am I the same doctor at the end of a busy shift as when I first walked through the doors 12 hours earlier? Does the last patient of a busy shift get the same knowledgeable physician and the same thorough exam as the first patient I saw?

I wish I could confidently say yes, but unfortunately the answer is no. And that troubles me.

In an ideal world, all of our patients would receive the same high-quality, evidence-based care regardless of environmental conditions. In a busy emergency department, however, many factors influence a patient's experience and whether he ultimately receives the correct diagnosis and therapy: the time of day, the day of the week, physician experience, the particular distractions at the moment of the patient's arrival, physician fatigue, and patient load, not to mention a host of other factors.

Imagine a workup for pulmonary embolism (PE). A fresh physician at the start of a shift will likely risk-stratify the patient to quantify the pretest probability and then either deem the risk low enough that no workup is warranted, order a D-dimer, or go straight to imaging. Now imagine the same physician caring for 15 other patients with a stack of ECGs to review who is interrupted by a trauma activation in the middle of taking the history of present illness.

He may simply order the CTPA as he rushes off to assess the new trauma patient, regardless of the pretest probability. Worse still, he may be so distracted that he fails to consider the diagnosis of PE at all. This is clearly not ideal. We owe it to both of these patients to guarantee the same thoughtful assessment and the same quality of care.
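To make the stratify-then-test logic above concrete, here is a minimal sketch assuming a simplified Wells-style score with conventional two-tier cutoffs; the items, weights, and suggested next steps are illustrative assumptions for the sake of the example, not clinical guidance.

```python
# Minimal sketch of the stratify-then-test logic described above.
# Assumes a simplified Wells-style score; weights and cutoffs are illustrative only.

WELLS_ITEMS = {
    "clinical_signs_of_dvt": 3.0,
    "pe_most_likely_diagnosis": 3.0,
    "heart_rate_over_100": 1.5,
    "immobilization_or_recent_surgery": 1.5,
    "previous_dvt_or_pe": 1.5,
    "hemoptysis": 1.0,
    "active_malignancy": 1.0,
}

def next_step(findings: dict[str, bool]) -> str:
    """Suggest the next step for suspected PE based on pretest probability."""
    score = sum(points for item, points in WELLS_ITEMS.items() if findings.get(item))
    if score > 4.0:
        return "PE likely: proceed to CT pulmonary angiography"
    return "PE unlikely: order a D-dimer; image only if it is elevated"

# Example: a tachycardic patient with no other risk factors.
print(next_step({"heart_rate_over_100": True}))
```

The point of a tool like this is not the arithmetic but the consistency: the same inputs produce the same recommendation whether it is the first patient of the shift or the fifteenth.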

Operational risk management

Many physicians never consider that their practice varies significantly from day to day, let alone within a single shift, but such variation should come as no surprise. This phenomenon affects highly skilled and well-trained professionals in all industries. Research conducted more than a decade ago at Ben-Gurion University of the Negev and Columbia University illustrated a behavior termed lunchtime leniency, in which judges were found to be much more lenient in granting parole in the morning and just after lunch, but much tougher on prisoners who came before them as more time passed after a break. (Proc Natl Acad Sci USA. 2011;108[17]:6889; https://bit.ly/3CPBjyw.)

The study found that an inmate was three to six times more likely to be granted parole if his case was one of the first three of the day compared with the last three. That hardly looks like justice. Behavioral economists have documented many other examples of subconscious behaviors like these.

Aviation is a profession that truly appreciates the limits of human performance. A whole field of study called human factors operates within aviation and other high-reliability organizations (those in high-risk, hypercomplex environments like space operations or nuclear power). This discipline identifies human capabilities and limitations and works to mitigate the characteristics that lead to undesirable behaviors, degraded performance, and an increased risk of failure. It's a big topic, but a relevant example is the use of operational risk management (ORM) in combat aviation.

Each time fighter pilots step to their jets, they are required to do a final check-in as a group. One step of this check-in is to consider and verbally communicate your ORM (level of risk) based on specific factors: Am I sick or injured? Do I feel well rested? Am I distracted? Is something going on at home that is overwhelming me? If the answer is a high ORM, meaning an individual's risk outweighs the potential benefit, the flight lead can mitigate it by reducing the complexity of the mission or, in rare cases, canceling the sortie altogether.
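The pilots' check-in amounts to a short, repeatable checklist. The sketch below imagines what a pre-shift ORM-style self-check might look like if written down; the questions, weights, and thresholds are hypothetical assumptions for illustration, not a validated instrument.

```python
# Hypothetical pre-shift ORM-style self-check, modeled loosely on the pilots'
# check-in described above. Questions, weights, and cutoffs are assumptions.

ORM_QUESTIONS = {
    "Sick or injured?": 2,
    "Poorly rested?": 2,
    "Distracted?": 1,
    "Overwhelmed by issues at home?": 1,
}

def orm_level(answers: dict[str, bool]) -> str:
    """Classify overall operational risk from yes/no answers to the self-check."""
    score = sum(weight for question, weight in ORM_QUESTIONS.items() if answers.get(question))
    if score >= 4:
        return "HIGH: escalate and reduce task load before starting"
    if score >= 2:
        return "MODERATE: flag for closer cross-checks during the shift"
    return "LOW: proceed as planned"

# Example: a physician starting a shift tired and preoccupied.
print(orm_level({"Poorly rested?": True, "Distracted?": True}))
```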

Where in the medical school or residency curriculum are physicians trained to consider their operational risk or their overall ability to perform their primary tasks competently? Instead, medical students and residents are thrust into a system that seems more inclined to deny the very existence of human factors. We are taught that physicians can and should demonstrate absolute professionalism and competence at all times, despite physiological stressors or psychosocial factors. The message? Power through.

Graduate medical education still relies on an environment that leaves trainees chronically sleep deprived despite clear evidence that sleep deprivation degrades performance. Studies have shown that increasing sleep deprivation produces impairment proportionally equivalent to increasing alcohol intoxication. (Occup Environ Med. 2000;57[10]:649; https://bit.ly/3qa6cIu.)

Limiting human factors

The tide seems to be turning as burnout becomes more prominent in the medical literature, but burnout is not quite the same concept. Burnout is about the individual physician's well-being; it is softer, more nebulous. Human factors means considering the intrinsic and extrinsic factors that influence a person's ability to achieve peak performance.

Additional goals are to reduce errors and standardize best practices. How do we create processes, safeguards, and systems to help the human physician consistently deliver the same high-quality, evidence-based medicine despite the unique challenges I’ve described? It is not an easy task.

Healthcare has traditionally underestimated human factors compared to other professions in which the safety and well-being of humans is directly tied to outcomes. Perhaps we are too busy with the constant flow of sick and injured patients to think about how we can hone our skills to optimize individual performance. However, we can limit the human factors that lead to practice variation and physician error by:

  • Improving the individual through intentional training.
  • Augmenting the individual through process and technology.
  • Modifying the individual’s environment to reduce barriers to optimal performance.

Training medical students and residents to optimize performance and limit errors with a focus on human factors is one way to systematically move healthcare toward a culture of higher reliability. Optimizing communication in complex, high-risk environments through crew resource management training and implementing operational risk assessments are just two concrete examples of useful human factors training. Fallible human cognition can be augmented and supported by evidence-based decision tools, checklists, and AI-based algorithms. These measures will help ensure patients have similar experiences and outcomes despite the environmental conditions in a chaotic hospital or high-volume emergency department.

Industrial engineers and creative leaders can also analyze an environment and devise new solutions to reduce its inherent obstacles, disruptions, and distractions, making it harder to fail. It could be as simple as creating a process for determining when and by whom ECGs will be interpreted rather than thrusting them into the face of an already-occupied physician.

The simple realization, and a gentle reminder to myself near the end of a busy shift, that I am exhausted, distracted, and task-saturated forces me to recalibrate my decision-making physically and mentally so that I can consciously check the quality of my evaluations and give that last patient the ED experience he deserves.


Access the links in EMN by reading this on our website: www.EM-News.com.

Comments? Write to us at [email protected].

Dr. Jedick is a board-certified emergency physician who works in emergency departments in Las Vegas and serves as a clinical professor at the University of Utah. He also practices aviation medicine, having previously served as an active duty flight surgeon in the US Air Force with several fighter squadrons and now serving in the Utah Air National Guard. He is also an FAA Aviation Medical Examiner and previously completed a space medicine internship at NASA. Follow him on Twitter @RockyJedickMD.