Improving Clinical Performance by Analyzing Surgical Skills and Operative Errors



Fig. 32.1
Proportion of cognitive versus technical errors during each step of the procedure




Table 32.1
Details of intraoperative errors on Day 1 and Day 2

                                                 Day 1         Day 2          p-value
Total LVH completion
  No. (%) of residents with complete repairs    1/7 (14 %)    7/7 (100 %)    0.001
Total number of errors                          121           146
Mean (SD) participant errors                    17.3 (4.3)    20.9 (5.8)     0.26
Error type
  No. (%) of omission errors                    40 (33 %)     20 (14 %)      <0.001
  No. (%) of commission errors                  81 (67 %)     126 (86 %)
Error level
  No. (%) of cognitive errors                   45 (37 %)     35 (24 %)      0.019
  No. (%) of technical errors                   76 (63 %)     111 (76 %)
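The chapter does not report which statistical tests produced the p-values above. A chi-square test without continuity correction on the Day 1 versus Day 2 counts yields values close to those reported for the categorical comparisons; the short Python sketch below is illustrative only, not the authors' analysis.

```python
# Illustrative only: the chapter does not state which statistical tests were
# used. Assuming an uncorrected chi-square test of the Day 1 vs Day 2 counts,
# the categorical p-values in Table 32.1 can be approximately reproduced.
from scipy.stats import chi2_contingency

comparisons = {
    # rows = Day 1, Day 2; columns = counts in the two categories
    "complete vs incomplete repairs": [[1, 6], [7, 0]],       # reported p = 0.001
    "omission vs commission errors":  [[40, 81], [20, 126]],  # reported p < 0.001
    "cognitive vs technical errors":  [[45, 76], [35, 111]],  # reported p = 0.019
}

for label, counts in comparisons.items():
    chi2, p, dof, _ = chi2_contingency(counts, correction=False)
    print(f"{label}: chi2({dof}) = {chi2:.2f}, p = {p:.3f}")
```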


The studies discussed above demonstrate how broadly errors and surgical performance have been understood. Using multiple methods of investigation (malpractice claims, video-recorded operations, and simulation), these studies defined errors as lapses in physical skill and technique, failures in procedural understanding, and higher-level problems in judgment and decision-making. The following section addresses what these findings mean for the future understanding of surgical performance and surgical assessment as a whole.




Future Directions



Defining “Error” and Understanding Error Management


Humans in all fields, regardless of expertise, are fallible, yet there is no consistent definition of surgical error across the studies discussed. To move forward, an error nomenclature needs to be developed further. Evaluating the applicability of error assessments employed in other fields provides a broad framework for assessing errors in surgery and would make it easier to compare findings across studies and to identify areas of improvement for residents and senior surgeons alike. Studies have shown that surgical performance and patient outcomes are related [2, 77–80] and that the operative environments in which surgeons work affect surgical performance in both decision-making and technique [81–83]. A more precise definition of surgical error would improve our understanding of the relationships among errors, patient outcomes, and the surgical environment, and would aid in developing interventions to reduce potential disruptions.

While these studies focused on understanding and defining surgical errors, there was little discussion of how residents and senior surgeons compensated for their actions or decisions once an error was committed. Aviation, nuclear power, and various other industries have identified error management as an important, if not critical, skill. While traditional surgical education emphasizes error avoidance, studies have demonstrated that trainees taught error management fare better [84]. Incorporating this skill set into resident training and into continuing education for established surgeons may not eliminate intraoperative errors, but it could mitigate their consequences and, more importantly, improve patient outcomes [85, 86].

The assessment methods described previously focus primarily on procedure time and on subjective and objective measures of technical skill. These methods, however, fail to provide a thorough understanding of the underlying causes and characteristics of surgical performance failures [76, 87]. Incorporating error analysis into future assessment methods may highlight areas for improvement, so that surgeons can identify their weaker skills, whether in technique or in judgment and decision-making, and address them through intentional, deliberate practice [88, 89].
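As a purely illustrative sketch, not an instrument described in this chapter, the Python example below shows one way an error-based assessment could record each intraoperative error using the type and level categories of Table 32.1 and tally them per trainee; all class and field names are hypothetical.

```python
# Hypothetical sketch of an error-based assessment record; the classes and
# field names are illustrative, not an instrument from this chapter.
from collections import Counter
from dataclasses import dataclass
from enum import Enum
from typing import List


class ErrorType(Enum):
    OMISSION = "omission"        # a required action was not performed
    COMMISSION = "commission"    # an action was performed incorrectly


class ErrorLevel(Enum):
    COGNITIVE = "cognitive"      # judgment or decision-making failure
    TECHNICAL = "technical"      # failure of physical skill or technique


@dataclass
class ObservedError:
    procedure_step: str          # e.g., "mesh fixation"
    error_type: ErrorType
    error_level: ErrorLevel
    note: str = ""


def summarize(errors: List[ObservedError]) -> dict:
    """Tally errors by type and level, mirroring the rows of Table 32.1."""
    return {
        "total": len(errors),
        "by_type": Counter(e.error_type.value for e in errors),
        "by_level": Counter(e.error_level.value for e in errors),
    }
```

Structuring observations this way would let the same record support both formative feedback (which steps and which error levels recur for a given trainee) and aggregate comparisons of the kind shown in Table 32.1.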


Integrating Technology and Observation-Based Methods


There is promise in some of the newer technologies currently in development. Sensor technology has been applied to multiple clinical examinations, including the pelvic and breast exams, to assess the role of palpation in performance; it has demonstrated that differences in palpation force and technique play a role in exam accuracy and proficiency [24, 90]. Pixel-based motion tracking is another promising approach that could be used to identify trouble areas or skills needing improvement. Pirsiavash and colleagues used this method on video-recorded performances to predict judges' scores for Olympic athletes [91]; a similar approach could be used in surgery to predict patient outcomes from surgical performance. Progress is also being made toward automating the understanding of human behavior [92]; using methods such as cognitive task analysis, similar research could automate the understanding of surgical behavior and the identification of surgical error. Ultimately, using technology-based assessment methods to complement observational methods can provide insight into surgical performance that has not yet been achieved.
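As a minimal sketch of the pixel-based motion analysis mentioned above, the example below uses dense optical flow to turn a recorded operative video into a per-frame motion signal that could later be related to performance ratings; it assumes OpenCV and NumPy, uses a hypothetical file name, and is not the specific method of reference [91].

```python
# A minimal sketch, assuming OpenCV and NumPy, of pixel-based motion analysis
# on a recorded operative video; it is not the specific method of [91].
# "procedure.mp4" is a hypothetical file name.
import cv2
import numpy as np

cap = cv2.VideoCapture("procedure.mp4")
ok, prev = cap.read()
if not ok:
    raise RuntimeError("could not read the video file")
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

motion_per_frame = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Dense (Farneback) optical flow between consecutive frames
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.linalg.norm(flow, axis=2)          # per-pixel motion magnitude
    motion_per_frame.append(float(magnitude.mean()))  # crude per-frame feature
    prev_gray = gray

cap.release()
# motion_per_frame is a time series that could be summarized (e.g., smoothness,
# idle periods) and related to expert performance ratings.
```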

Regardless of how surgical errors are defined or what methods we use to assess and analyze performance, without a shift in the culture of the surgical community we will fail to provide valuable and much-needed error-based assessment knowledge to the medical community. In addition, HIPAA laws and regulations must be revisited to allow for non-discoverable use of surgical videos for training and quality assessment. Currently, the evaluation culture within medicine is marked by a punitive tone, which may continue to dampen interest in using assessment technology in the operating room. In medicine and surgery, most widely used, standardized assessments, such as the licensing and board examinations, are competency based: performance analysis and measurement are used to identify the minimum standard at which one may practice medicine or perform surgery. In contrast, athletes rely on performance analysis and measurement to set criteria for mastery, which in turn drives a positive competitive culture and the desire for optimal performance. If medicine and surgery embarked on a similar paradigm shift and began to use performance analysis and measurement to drive a positive competitive culture, this would greatly facilitate attainment of the gold-standard levels of success, quality, and safety that other fields have achieved.


References



1.

Kohn LT, Corrigan JM, Donaldson MS. To err is human: building a safer health system, vol. 6. Washington, DC: National Academies Press; 1999.


2.

Birkmeyer JD, Finks JF, O’Reilly A, Oerline M, Carlin AM, Nunn AR, et al. Surgical skill and complication rates after bariatric surgery. N Engl J Med. 2013;369(15):1434–42.


3.

The American Board of Surgery. 2015–2016 ABS Booklet of Information Surgery. 2015. https://www.absurgery.org/xfer/BookletofInfo-Surgery.pdf. Accessed 1 Nov 2015.



5.

Martin JA, Regehr G, Reznick R, Macrae H, Murnaghan J, Hutchison C, et al. Objective structured assessment of technical skill (OSATS) for surgical residents. Br J Surg. 1997;84:273–8.


6.

Reznick R, MacRae H. Changes in the wind. N Engl J Med. 2006;355:2664–9.


7.

Moorthy K, Munz Y, Sarker SK, Darzi A. Objective assessment of technical skills in surgery. Br Med J. 2003;327:1032–7.


8.

Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. Br Med J. 1975;1:447–51.


9.

Regehr G, MacRae H, Reznick RK, Szalay D. Comparing the psychometric properties of checklists and global rating scales for assessing performance on an OSCE-format examination. Acad Med. 1998;73(9):993–7.


10.

Reznick R, Regehr G, MacRae H, Martin J, McCulloch W. Testing technical skill via an innovative “bench station” examination. Am J Surg. 1997;173(97):226–30.


11.

Swift SE, Carter JF. Institution and validation of an observed structured assessment of technical skills (OSATS) for obstetrics and gynecology residents and faculty. Am J Obstet Gynecol. 2006;195:617–21.


12.

Bodle JF, Kaufmann SJ, Bisson D, Nathanson B, Binney DM. Value and face validity of objective structured assessment of technical skills (OSATS) for work based assessment of surgical skills in obstetrics and gynaecology. Med Teach. 2008;30:212–6.


13.

D’Angelo A-LD, Cohen ER, Kwan C, Laufer S, Greenberg C, Greenberg J, et al. Use of decision-based simulations to assess resident readiness for operative independence. Am J Surg. 2015;209(1):132–9.


14.

Hiemstra E. Value of an objective assessment tool in the operating room. Can J Surg. 2011;54:116–22.


15.

Eubanks TR, Clements RH, Pohl D, Williams N, Schaad DC, Horgan S, et al. An objective scoring system for laparoscopic cholecystectomy. J Am Coll Surg. 1999;189(99):566–74.


16.

van Hove PD, Tuijthof GJM, Verdaasdonk EGG, Stassen LPS, Dankelman J. Objective assessment of technical surgical skills. Br J Surg. 2010;97:972–87.


17.

Larson JL, Williams RG, Ketchum J, Boehler ML, Dunnington GL. Feasibility, reliability and validity of an operative performance rating system for evaluating surgery residents. Surgery. 2005;138:640–9.


18.

Sarker SK, Chang A, Vincent C. Technical and technological skills assessment in laparoscopic surgery. J Soc Laparoendosc Surg. 2006;10:284–92.
