Crisis resource management in the PACU

Perianesthesia nurses perform a large proportion of their work in a complex environment where keen awareness of each patient’s situation and a high level of vigilance throughout the perianesthesia period are essential to ensure positive patient outcomes. Postanesthesia care units (PACUs) are error-prone environments where opportunities for egregious mistakes are inherent because of high cognitive burdens and stress loads, high noise levels, demands on attention, and time pressures. Each day, perianesthesia nurses must ensure the proper functioning of highly technical equipment, perform a detailed postanesthesia assessment on each patient, calculate and administer proper doses of potent medications, monitor multiple patients simultaneously, perceive and understand individualized patient responses to medications and surgical interventions, troubleshoot ambiguous patient conditions, make complex decisions under stress, and respond appropriately and accurately under production pressure. Patient, surgical, and anesthesia factors can all contribute to critical incidents during the perianesthesia period.


The dynamic nature of delivering care and a recent explosion of technology affect the educational needs of perianesthesia nurses. Technologic advances such as electronic charting, computer monitoring systems, and complex procedural equipment have altered the skills necessary to care for patients immediately after surgery. Given the high workload and an environment rich with distractions, perianesthesia nurses must anticipate and be quick to respond to critical situations in the PACU. Academic programs responsible for preparing adept perianesthesia nurses are continuously challenged because of a limited number of actual emergencies in the PACU during student clinical training.


Although routine nursing practice involves a set of basic skills, complex technical and nontechnical skills are essential for an effective response to urgent and emergent situations occurring in the PACU. Technologic advancements in human simulation are currently used in many fields of health care to enhance traditional educational methods by allowing an opportunity for educators to recreate critical, but rare, events. Crisis resource management (CRM) training—which incorporates the basic principles of human factors design, domain-specific expertise, and human patient simulators—has the potential, in theory, to improve the way health care providers respond to and manage emergencies.


Since World War II, the field of aviation has used flight simulators as a safe yet realistic training method for all types of pilots. Investigations of airline disasters have demonstrated that deficiencies in a pilot’s technical skills are not usually the cause of accidents.1 Instead, poor teamwork and inadequate communication are commonly associated with these adverse incidents. In response to this discovery, airline crew team training was born in the 1980s to promote effective collaboration among cockpit and cabin crews, ground personnel, and air traffic controllers. Although the practice has never been validated empirically, simulation techniques have become the mainstay of aviation training. Pilots train extensively in all emergency procedures in simulated environments to become proficient in crisis management before encountering similar situations on actual flights.


Similar training became possible in health care with the introduction of full-sized human patient simulators in the early 1990s. Gaba and colleagues, at Stanford University in Palo Alto, California, adapted the principles of CRM training to the medical domain.2 They found that the principles were as applicable to health care as they were to aviation. Both fields are characterized as dynamic, necessitate rapid decision making under stress, and require teams of individuals to work together effectively to prevent loss of life. Critical care medicine, emergency medicine, and trauma teams use simulation technology and CRM training. Although the initial emphasis was to educate physicians, the technique is now used to educate nurse anesthetists, nurse practitioners, critical care nurses, paramedics, and other allied health personnel. Simulation has also been incorporated into many curricula for health care providers and continues to expand its role in education and training to improve health care delivery.



Human factors training and the systems approach to reducing medical error


Everyone has made a mistake such as locking his or her keys in the car or calling someone by the wrong name. These unintended events, although seemingly significant at the time, pale in comparison to a nurse who accidentally administers the wrong medication or forgets to deliver the proper concentration of oxygen to a patient. In principle, however, the fundamental human nature of these errors is the same. Nothing is more concerning to a patient than the possibility of becoming a victim of medical error. CRM training addresses the management of critical events in health care with a strong emphasis on human factors. The primary focus is on improvement of human performance in complex work environments to facilitate better decision making under stress, more effective teamwork, and improved patient outcomes.


Human error is an inevitable part of complex and rapidly changing work domains, such as aviation, anesthesiology, and critical care medicine.3 Human error in any discipline can lead to catastrophic outcomes. Major incidents in any industry, such as the crash of the Concorde jet in 2000, gain media interest and prompt public attention and action primarily because of the drama and scope of the event in terms of lives altered or lost. Until recently, human error–related accidents in health care tended to be less visible to the public, primarily because these events usually affect one patient at a time.


Theorists in human factors have identified particular circumstances and error types that can help to train individuals to recognize the signs of errant problem solving. Although human error can never be eradicated, it can certainly be minimized.4 Aviation, for example, favors teaching error management techniques rather than aiming for human perfection. Numerous organizations are dedicated to improving patient safety by funding research endeavors in this area. One such organization, the Anesthesia Patient Safety Foundation, has funded many studies examining human factors and training in the field of anesthesia. Moreover, the National Patient Safety Foundation has broadened the study of human factors to all medical specialties. Both groups believe that further study and advances in training can improve patient outcomes and safety.


Reason5 operationalized error into the following three terms: slips, lapses, and mistakes. A slip is defined as an error of execution. It is observable and can simply involve the human action of picking up the wrong syringe or turning the wrong knob on an oxygen flowmeter. A lapse is not observable, but involves the inability of a person to correctly recall information from memory, such as the correct concentration of a lidocaine drip. Finally, a mistake is defined as an error in planning rather than an error in execution. Here, a nurse may have planned to actively suction secretions from an endotracheal tube during extubation of a patient. Although the execution was technically correct, the lungs were left devoid of oxygen in the process, which was a mistake in planning.


A common misconception is that errors only happen to lazy, incompetent individuals who lack vigilance. On the contrary, errors can happen to any individual despite vigilance, motivation, and dedication. When errors occur, blame should not be placed on the individual; rather, a more enlightened view should be embraced—to understand the breakdown in the system and the resulting harm to a patient. Two compelling themes surface from human factors research: (1) humans are prone to err, and (2) most errors are not the result of personal inadequacies or carelessness, but instead are the product of defects in the design of the health care environmental systems in which the work occurs. An illustrative case follows.


Sarah, an experienced PACU nurse, was well into her double shift by the time the patient arrived in the unit at 2:00 AM. A 36-year-old woman involved in a motor vehicle crash had just undergone an exploratory laparotomy and splenectomy for intraabdominal bleeding. Thirty minutes after the patient’s arrival, an alarm sounded. Sarah noted that the patient’s heart rate was 36 beats/min and dropping. Following unit protocol, Sarah quickly reached into the medication cart for atropine and intravenously administered a 0.4-mg dose. Almost instantly, the patient’s systolic blood pressure soared to 300 mm Hg on the arterial line monitor and the patient went into cardiac arrest. Despite full resuscitative efforts, the patient did not respond.


Later, as Sarah was cleaning up the bedside stand of all the medications used in the code, she found an empty phenylephrine vial. Immediately, Sarah realized that she had inadvertently given the patient a 10-mg bolus of phenylephrine instead of the intended dose of atropine.


A follow-up root cause investigation discovered that the pharmacy had recently stocked phenylephrine next to atropine in the medication drawer. Both drugs were manufactured by the same company and came in the same sized vials with the same color snap-off caps. The label for atropine was a light red color, but the phenylephrine label was pink. Instead of placing the blame on Sarah, the suggestion was made that the pharmacy immediately tag the atropine vials with a black “A” and physically separate the two drugs from one another in the medication cart. The manufacturer was also notified and encouraged to change the labeling system.


Although initially one might question how a nurse could misread or choose not to read the label and give a wrong medication, in retrospect it is easy to see how an experienced nurse could slip while emergently reaching for a medication under conditions of high stress. Such a slip is analogous to a common error among anesthesia providers concerning gas flow meters. At one time, anesthesia gas delivery systems had two similar gas control knobs: one to deliver oxygen and one to deliver nitrous oxide. Slips occurred when anesthesia providers inadvertently turned up the nitrous oxide when they had intended to turn up the oxygen, which resulted in a hypoxic gas mixture being delivered to patients. A human factors approach was chosen to remedy this problem. The oxygen knob was redesigned with deep palpable indentations, whereas the nitrous oxide knob remained smooth. The anesthesia provider then was able to tell by touch alone which knob was in hand. The anesthesia machine was also given a built-in fail-safe mechanism that did not allow the delivery of a hypoxic mixture regardless of how high the nitrous flow was set. This approach to the problem effectively prevented harm to the patient by a hypoxic mixture of gases. Accidents and accident reporting were viewed in these examples as opportunities to design more robust systems to prevent the same type of injury from ever occurring again.


Traditionally, an adverse outcome results in blaming the particular caregiver; however, careful study of the entire system in which the incident occurred usually uncovers multiple factors that contributed to the event.2 Lack of training, improper equipment maintenance, poor staffing, or an illegible order transcribed incorrectly can individually or jointly contribute to a critical event. In other words, a cascade of events, rather than a single event, often results in an adverse outcome. CRM training advocates the systems approach to adverse outcome analysis. The systems approach seeks answers from a macro perspective to discover the contributing factors. A look at policies and administrative decisions that either supported or derailed a critical incident is a radical departure from the traditional “frame and blame” punitive approach used in medicine. This approach should not be interpreted as lessening the responsibility of the person who made an error, but as gaining a better understanding of why the error occurred; only then can the system be adjusted to better prevent recurrence. CRM training strives to make practitioners aware of systemic factors and to work effectively within the context of a large system that might not always support their efforts. The goal of CRM training is to learn from the mistakes of others through an open exchange of information to lessen the contributions of human factors to an adverse event. In this way, students can come to understand the cascade of events that leads to mistakes in a certain situation.


Perianesthesia nurses are at the “sharp end” of the patient encounter; they interface directly with the patient. Many “blunt end” factors, such as the nature of the work, equipment manufacturers, hospital administrations, and other institutional effects, significantly contribute to placing these nurses at that sharp end. When error occurs, it is prudent to examine all causative factors.


Posted Nov 6, 2016, in NURSING.
