4: Crisis Resource Management in the PACU



Thomas Corey Davis; Suzanne M. Wright




Keywords


crisis resource management; simulation; high-fidelity simulation; human patient simulators; PACU


Postanesthesia care units (PACUs) are error-prone environments due to high cognitive burdens and stress loads, high noise levels, excessive demands on attention, and production pressure. Similarly, the conditions of patients entering the postoperative period are characterized by complexity, uncertainty, and instability. Perianesthesia nurses receive patients with a wide range of comorbidities after a variety of surgeries and other procedures and must be prepared to manage untoward and unexpected events.


Postanesthesia nurses are integral to patient safety. They must ensure the proper functioning of highly technical equipment, perform a detailed postanesthesia assessment on each patient, calculate and administer proper doses of potent medications, monitor multiple patients simultaneously, perceive and understand individualized patient responses to medications and surgical interventions, troubleshoot ambiguous patient conditions, and make complex decisions in times of distress. An extraordinary level of awareness and vigilance throughout the postoperative period is essential to recognize early signs of conditions that could rapidly deteriorate. Patient, surgical, and anesthesia factors can all contribute to critical incidents during recovery. Ongoing studies evaluate the effectiveness of crisis resource management (CRM) training and education for many areas of nursing care, across a wide variety of educational and skill levels.


Definitions


Crisis Resource Management: A team’s use of nontechnical skills to prevent and mitigate the effects of crises.


Human Error: An action that deviates from the intended action.


Human Factors: Factors involved in an adverse event that are attributable to the human element.


Simulation: A representation or recreation of a real-world event.


Situation Awareness: The ability to perceive and comprehend elements in the work environment.


Introduction


Although routine nursing practice involves a standard set of well-defined skills, the development of complex technical and nontechnical skills is also essential for an effective response to urgent and emergent situations occurring in the PACU. Simulation training is increasingly used across health care to enhance traditional educational methods, affording educators an opportunity to recreate critical but rare events. CRM training—which incorporates the basic principles of human factors, domain-specific expertise, and human patient simulators—has the potential to improve the way health care providers respond to and manage emergencies.


Since World War II, the field of aviation has used flight simulators as a safe yet realistic training method for all types of pilots. Investigations of airline disasters have demonstrated that deficiencies in a pilot’s technical skills are rarely the cause of accidents.1 Instead, human factors such as poor teamwork and inadequate communication have been found to be commonly associated with these adverse incidents. In response to this discovery, crew resource management training emerged in the 1980s to promote effective collaboration among cockpit and cabin crews, ground personnel, and air traffic controllers. Although not empirically validated, simulation techniques have become a mainstay of aviation training. Pilots train extensively for emergency procedures in simulated environments to become proficient in crisis management before encountering similar situations on actual flights.


Crisis management training in health care became possible with the introduction of human patient simulators (HPS) in the early 1990s. Gaba and colleagues adapted the principles of CRM training for use in the medical domain.2 They found the principles as applicable to health care as to aviation: both fields are dynamic, require rapid decision making under stress, and depend on teams of individuals working together effectively to prevent loss of life. Critical care medicine, emergency medicine, and trauma teams now use simulation technology and CRM training. Although the initial emphasis was on physicians, the technique is now used to educate nurse anesthetists, nurse practitioners, critical care nurses, paramedics, and other allied health personnel. Simulation has also been incorporated into health care curricula, and its role in education and training continues to expand to improve health care delivery.


The nature of human error


Everyone makes mistakes. Usually they are as simple as locking the keys in the car, forgetting to take out the trash, or calling someone by the wrong name. These unintended events, although seemingly significant at the time, pale in comparison to a nurse who accidentally administers the wrong medication or forgets to deliver the proper concentration of oxygen to a patient. In principle, however, the fundamental human nature of these errors is the same. Nothing is more concerning to a patient than the possibility of becoming a victim of medical error.


Human error is an inevitable part of complex and rapidly changing work domains, such as aviation, anesthesiology, and critical care medicine.3 Dekker agrees that error is a byproduct of normal work because the human mind is not designed to operate perfectly, and all necessary information to make the best decision is often unavailable.4 Human error in any discipline can lead to catastrophic outcomes. Major incidents in any industry, such as the crash of EgyptAir Flight 804 in 2016, gain media interest and prompt public attention and action primarily because of the number of lives altered or lost. Historically, human error–related accidents in health care tend to be less visible to the public, primarily because these events usually affect only one patient at a time.


Theorists in human factors have identified particular circumstances and error types that can help train individuals to recognize the signs of errant problem solving. Although human error can never be eradicated, it can certainly be minimized.5 Aviation, for example, favors teaching error management techniques rather than aiming for human perfection. Numerous organizations are dedicated to improving patient safety by funding research endeavors in this area. One such organization, the Anesthesia Patient Safety Foundation, has funded many studies examining human factors and training in the field of anesthesia. Moreover, the National Patient Safety Foundation has broadened the study of human factors to all medical specialties. Both groups believe that further study and advances in training can improve patient outcomes and safety.


Reason operationalized error into three terms: slips, lapses, and mistakes.6 A slip is an error of execution; it is observable and can be as simple as picking up the wrong syringe or turning the wrong knob on an oxygen flowmeter. A lapse is not observable and involves the inability to correctly recall information from memory, such as the mixture of a lidocaine drip. Finally, a mistake is an error of planning rather than of execution. For example, a nurse may plan to actively suction secretions from an endotracheal tube during extubation of a patient; although the suctioning is executed correctly, the plan itself is flawed because it leaves the lungs devoid of oxygen.
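

Because Reason’s three categories recur throughout incident analysis, a short sketch can make the taxonomy concrete. The following Python example is purely illustrative; the incident descriptions are invented for teaching purposes and are not drawn from any real reporting system:

```python
from enum import Enum, auto


class ErrorType(Enum):
    """Reason's taxonomy of human error (slips, lapses, and mistakes)."""
    SLIP = auto()     # observable error of execution (e.g., wrong syringe)
    LAPSE = auto()    # unobservable memory failure (e.g., forgotten drip mixture)
    MISTAKE = auto()  # error of planning; the execution itself may be flawless


# Invented incident reports tagged with the taxonomy.
incidents = [
    ("picked up the wrong syringe", ErrorType.SLIP),
    ("could not recall the lidocaine drip mixture", ErrorType.LAPSE),
    ("suctioned during extubation, leaving the lungs without oxygen", ErrorType.MISTAKE),
]

for description, error_type in incidents:
    print(f"{error_type.name:<7} {description}")
```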


A common misconception is that errors happen only to lazy, incompetent individuals who lack vigilance. On the contrary, errors can happen to any individual despite vigilance, motivation, and dedication. If lessons are to be learned from errors, blame should not be placed on the individual; rather, a more enlightened view should be embraced to understand the breakdown in the system and the resulting harm to the patient. Two compelling themes surface from human factors research: (1) humans are prone to err, and (2) most errors are not the result of personal inadequacies or carelessness but are instead the product of defects in the design of the health care systems in which work occurs. An illustrative case follows.


Robert, an experienced PACU nurse, was having an unusually busy day after working late into the previous evening. He had been assigned to receive and recover a 42-year-old morbidly obese male patient with a history of obstructive sleep apnea who had just undergone Roux-en-Y gastric bypass surgery. Upon arrival in the PACU, the patient complained of severe pain, rating it a 7 on a 1-to-10 scale. Both Robert and the anesthesia provider noted the patient’s discomfort, and the anesthesia provider administered pain medication. While the anesthesia provider gave report to Robert, the patient became more comfortable and his vital signs stabilized. The anesthesia provider then transferred care of the patient to Robert. Robert completed his charting responsibilities, noted that his patient was resting comfortably, and turned away from the bedside to assist another nurse who was receiving an incoming patient in the adjacent space. That patient arrived combative, vomiting, and having difficulty breathing.


When Robert turned back to check on his patient, he found him unresponsive, cyanotic, not breathing, and without a pulse. How could this have happened with Robert standing just a few feet away? Robert immediately initiated the unit’s rescue protocol and called a Code Blue. Multiple providers responded to the code and provided treatment per Advanced Cardiac Life Support (ACLS) guidelines. After 30 minutes, however, the patient had not responded to resuscitation attempts and was pronounced dead.


A systems approach to understanding error


In the culture of health care delivery, adverse patient outcomes traditionally result in naming, blaming, and shaming the provider involved. When errors are attributed only to the individual, the series of events leading up to the error is overlooked, and an opportunity to correct them is missed. Perhaps the most devastating consequence of this approach to understanding error is that harm is likely to come to another patient because the associated problems are not addressed. Similarly distressing is this approach’s effect on the provider. Wolfe explored both the short- and long-term psychological effects of error on the provider, which include doubt, self-blame, loss of sleep, lack of job confidence, anxiety, embarrassment, and guilt.7 Wolfe also uncovered punitive consequences, including probation, suspension, termination of employment, and even criminal prosecution. Today, these issues have developed into the idea of the Second Victim, a term introduced to the literature by Wu in 2000 to describe these negative impacts on providers who have committed such errors.8 Unfortunately, despite many years of recognition of these issues, most medical providers continue to “suffer in silence.”9


The systems approach to understanding error considers all plausible factors that may have contributed to the event.2 Providers are not excused from their contribution to the accident, but they are neither the sole focus nor necessarily punished. Instead, the approach examines the components of the entire system in which work processes occur to see how they may facilitate human error, treating error as a symptom of defects in the system. Lack of training, improper equipment maintenance, inadequate staffing, and even illegible handwriting are components of a system that can individually or jointly contribute to a critical event.


Using a systems approach, a root-cause analysis of the case described previously uncovered multiple system factors that likely contributed to the patient’s death. First, the type and dose of pain medication administered by the anesthesia provider were not communicated to Robert. Had this important information been communicated, Robert would have learned that the anesthesia provider administered fentanyl, an opioid nearly 100 times more potent than morphine. Second, the investigation revealed that the alarms on the patient’s monitor had been turned off; as the patient’s oxygen saturation, heart rate, and blood pressure decreased, Robert, who had turned his back to help his colleague, was unaware of the deteriorating condition. Finally, although Robert had intended to turn away from his patient for only a moment, he became distracted by the workload required to assist with the care of the patient assigned to his colleague.
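

As a rough illustration of the potency difference noted above, the sketch below applies a simple linear 100:1 conversion. This is a teaching sketch only; actual equianalgesic dosing depends on route, patient factors, and clinical references, and the helper function here is hypothetical:

```python
# Teaching sketch only: assumes a simple linear 100:1 potency ratio for
# fentanyl relative to morphine. Real equianalgesic dosing is more complex;
# never use this for clinical decisions.
FENTANYL_TO_MORPHINE_RATIO = 100


def morphine_equivalent_mg(fentanyl_dose_mcg: float) -> float:
    """Approximate morphine-equivalent dose (mg) for a fentanyl dose in mcg."""
    fentanyl_dose_mg = fentanyl_dose_mcg / 1000  # micrograms to milligrams
    return fentanyl_dose_mg * FENTANYL_TO_MORPHINE_RATIO


print(morphine_equivalent_mg(100))  # 100 mcg fentanyl ~ 10.0 mg morphine
```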


Although one might initially question why an experienced anesthesia provider would give such a potent opioid to a patient with sleep apnea in the PACU, in retrospect it is easy to see how such a mistake could occur given the immediate availability of the medication (the fentanyl was likely left over from the surgical procedure) and the intensity with which the patient expressed his pain. Such a mistake can occur under production pressure. Anesthesia providers are under profound pressure to reduce the time between delivering one patient to the PACU and picking up the next from the preoperative area to take back to the operating room, referred to as “turnover time.” When hurried, it is tempting to choose the quickest solution to a problem: administering an immediately available opioid is less time-consuming than obtaining one from the medication delivery system, drawing it up into a syringe, labeling it, and administering it. Similarly, when providers are rushed, important information can be missed during the transfer of care from one provider to another. Looking at the system, rather than the individual provider, revealed production pressure as a significant contributor to the patient’s death.


A slip occurred when the patient’s monitor alarms were not switched back to “on” after the previous patient in that space was moved to the floor. A systems approach to examining such a slip suggested using monitors whose alarms default to “on” (and must be manually switched to “off”) rather than monitors that default to “off” (and must be manually switched to “on”). Investigators also determined that staff had not been adequately trained on the proper operation of the patient monitor; new equipment, technology, and supplies are often introduced into the PACU and other care environments without adequate provider training.
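

To illustrate the fail-safe default the investigators recommended, the following sketch models a monitor whose alarms start enabled and are restored automatically for each new patient. The class, its fields, and the clinician names are hypothetical, not a real monitor’s interface:

```python
from dataclasses import dataclass, field


@dataclass
class AlarmSettings:
    """Hypothetical monitor alarm configuration with a fail-safe default."""
    enabled: bool = True                      # safe default: alarms start ON
    audit_log: list = field(default_factory=list)

    def silence(self, clinician: str, reason: str) -> None:
        """Silencing is a deliberate act that leaves an audit trail."""
        self.enabled = False
        self.audit_log.append(f"silenced by {clinician}: {reason}")

    def reset_for_new_patient(self) -> None:
        """Admitting a new patient always restores the safe default."""
        self.enabled = True


settings = AlarmSettings()        # a new bed space starts with alarms on
settings.silence("RN Jones", "patient transferred to the floor")
settings.reset_for_new_patient()  # alarms re-enable automatically
assert settings.enabled
```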


Finally, a closer look at the system revealed that Robert became distracted while assisting his colleague with her new arrival, leaving his own patient unattended for an unacceptable length of time. Distractions are factors that disturb or divert attention from the intended task. Attention given to the distractor increases the risk of slips, lapses, and mistakes because stress increases and cognitive capacity correspondingly decreases. The newly admitted patient’s combative and deteriorating condition was unexpected given the report received. Additionally, the increased workload created by this significant distraction left the entire unit relatively understaffed.


CRM training embodies the systems approach to adverse outcome analysis, seeking answers from a macro perspective to discover contributing factors. One goal of CRM training is to increase practitioners’ awareness of system factors so they can work effectively within the context of a large system that might not always support their efforts. Another is to learn from the mistakes of others through an open exchange of information, thereby lessening the contribution of human factors to adverse events.


Perianesthesia nurses are at the “sharp end” of the patient encounter; they interface directly with the patient. Many “blunt end” factors, such as the nature of the work, equipment manufacturers, hospital administrators, and other institutional inputs, significantly shape the conditions nurses face at the sharp end. When error occurs, it is prudent to examine all plausible contributing factors.


Crisis management principles


Many approaches, philosophies, and theories of crisis management exist for improving safety in complex industries such as health care. One such approach is captured in the mnemonic ERR WATCH, developed by Fletcher to help the practitioner recall the eight essential elements of crisis management10 (Box 4.1). ERR WATCH also serves as a reminder that the goal of crisis management is to reduce the contribution of human factors to error. Understanding the human factors that shape performance is of prime importance in any crisis management training program, as is recognizing the myriad limitations in a health care provider’s ability to quickly and accurately process rapidly changing information during a crisis. When these limitations are identified and well understood, more opportunities can be found to improve performance.

