REVIEW
Must we get it wrong again? A simple intervention to reduce medical error
Sophie Vaughan, Tristan Bate, Jonathan Round*
St. George’s Hospital NHS Trust, United Kingdom
Summary

Keywords: Error; Decision; Workshop; Education; Bias
Although medical error is common and important, healthcare workers' understanding of its underlying causes is poor. Error recognition and avoidance are rarely taught on training courses, and systems designed to investigate or prevent errors in healthcare settings are still developing. Rather than relying on error avoidance strategies that are reactive, system focused or academic, we developed an approach to help practitioners identify when decisions are made during clinical situations and understand what errors could be made at each. There are three steps to our approach. First is the identification and classification of medical errors: ten basic errors are described, which participants then identify within clinical video or written case scenarios. Next, working in groups and using pre-written scenarios, participants identify switch points (the moments in a case where key decisions are made) and discuss the reasons behind each choice. Lastly, still working in groups, participants identify switch points in difficult cases from their own clinical experience and discuss the choices made. This step also rehearses an open, non-judgemental approach to error. The approach can then be transposed into individual practice and used in ward rounds, clinic, theatre and any situation where clinical decision making occurs. If incorporated by a department, it has the potential to dramatically improve the quality of clinical decisions and reduce the damaging effect of medical error.
© 2011 Elsevier Ltd. All rights reserved.
1. Introduction
The devastating consequences and cost of medical errors have led to increasing demands for clinicians to better understand their causes.1 Medical error has been classified into a large number of biases and faults, and approaches to error prevention are commonly based on these. Errors may also be classified as either 'person' or 'system' errors, with very different error management strategies for each.2 System errors occur when adequate safeguards are not put in place to avoid predictable human error. The best known description of this approach is Reason's "Swiss Cheese" model of system safety.2
One approach acknowledges that humans will always make
errors and therefore safety depends on creating a system to either
make them impossible (an air hose cannot be put into an oxygen
socket) or anticipate these errors with multiple checks or barriers
to avoid them causing harm (two nurses checking the preparation
of a drug).
* Corresponding author. St George’s, University of London, Department of
Paediatric Intensive Care, Cranmer Terrace, Tooting, London SW17 0RE, United
Kingdom.
E-mail address: [email protected] (J. Round).
There have been several high profile system errors, for example the intrathecal administration of vincristine and wrong site surgery.3 However, effective learning from, and elimination of, these errors has been slow to achieve.4
Other strategies for reducing clinical errors focus on ‘person
errors’, highlighting the importance of cognitive factors, especially
those associated with failures in perception, in heuristics and in
biases.5 The use of a metacognitive, reflective approach to problem
solving allows clinicians to step back from the immediate problem
and reflect on the thinking process, in addition to other cognitive
debiasing strategies.6 However, the large number and range of cognitive errors described makes them difficult to teach, recall or use in clinical practice.
Although medical error is a major cause of morbidity and mortality, comparatively little time is devoted to error avoidance in undergraduate or postgraduate curricula.7 Most published strategies have concentrated on curriculum development for undergraduates, including lectures, video based teaching and standardised patients.8–11 Perhaps more so for medical error than for other topics, a longitudinal approach is needed, introducing key concepts to undergraduates and then integrating them into personal practice after graduation.
We devised a simple intervention designed to reduce medical
errors. Its ease of use allows a facilitator with no specific training to
run a session.
It simplifies the recognition of errors and enables participants to review their own cases critically and see when errors might occur during their work.
2. Teaching medical error avoidance
We run sessions either as a workshop or as a series of standalone short meetings. For each, the setting is important. Participants need to be able to speak freely and listen to each other, without interruption and without concern that their comments may be logged or used outside the session.

Ground rules (Box 1) are set out at the beginning for clarity of purpose and so that all understand the format.

The facilitator outlines these first before embarking on a three step process (Fig. 1). Steps one and two are more theoretical, while step three relates to individual patients.
3. Identifying and understanding errors – the deadly ten
3.1. Understanding the errors
To make the causes of error easier to understand and remember, participants are introduced to 'The deadly ten' (Box 2). In a workshop this can be done with a short talk. These errors are described briefly below.
3.1.1. Sloth
This is defined as not doing what you should be doing because of the effort required or the (perceived) inadequate reward. It includes not checking results, not asking for adequate information and not reviewing a patient. It leads to poor documentation and inadequate attention to detail. System problems such as poor staffing or onerous rotas often augment individual sloth.
3.1.2. Fixation
This error occurs when a particular diagnosis or analysis is
firmly held onto, despite evidence against it. ‘Anchoring’ is
a cognitive bias where features in the initial presentation are
overvalued above subsequent information.6 Similarly ‘Confirmation
bias’ describes a preference for evidence supporting our beliefs over
that opposing them. These short cut biases may save time and
effort12 but lead to decisions being taken before all the available
evidence is considered.
3.1.3. Communication breakdown
Often, within a team, there is sufficient information to manage
the patient successfully. However if this information is not released
or does not reach its destination at the right time, there is
communication breakdown. Unclear instructions, poor listening and failure to consider other opinions all lead to poor communication and increase the risk of error.
Fig. 1. The three step process.
3.1.4. Poor team working
Good team working prevents errors, allowing each team
member to operate within their abilities, with sufficient time to
discharge their duties. By contrast, if a team attempts to work with
some members out of their depth and others underutilised, or with
some having too much to do, then the team will be ineffective and
inefficient. The team leader will need to be aware of team skills and
the tasks required, to delegate tasks and monitor the function of the
team. Lessons learnt from the aviation industry and motor racing
have helped develop an engineering systems approach to health
care13 with training improving patient safety-related behaviour in
hospitals.14,15
3.1.5. Playing the odds
Clinicians typically favour benign diagnoses over more serious
ones, perhaps because they would not wish a more serious problem
on their patient. They also tend to discount rare disorders, even
when clinical features suggest that the rare disorder is present.
Failing to understand or remember the fundamental rules of
probability can also cause significant errors.6
3.1.6. Bravado
Bravado leads to error when clinicians work beyond their
competence or without adequate supervision. By contrast Timidity
is an error where a clinician fails to take on a task they are trained
for and positioned to do. Bravado is more common in groups where
there is a strong hierarchical structure and little questioning.16
3.1.7. Ignorance
This is simply a lack of knowledge; it also comprises unconscious incompetence, where one is not aware of one's own ignorance.17,18
Ignorance may not only affect our actual knowledge, but also our
consideration of factors which may affect our judgement, such as
fatigue or stress.
Box 1. Ground rules.
Acknowledge that:
We are all prone to medical error
Understanding error helps prevent future errors
Medical error is everyone's responsibility
Comments and opinions should remain confidential unless patient safety would be at risk
Be honest about what you did or would do
Listen to others to understand their viewpoints

Box 2. The deadly ten errors.
1. Sloth
2. Fixation
3. Communication breakdown
4. Poor team working
5. Playing the odds
6. Bravado
7. Ignorance
8. Mis-triage
9. Lack of skill
10. System error
3.1.8. Mis-triage
Real world medical systems typically involve multiple issues that have to be prioritised and reconsidered regularly. Over- or underestimation of the seriousness of a situation, and a lack of prioritisation, occur at many levels, from the triage nurse in the Emergency Department to the allocation of resources. Once the seriousness of a situation is wrongly estimated, this sets the pace and tone of subsequent interventions.
3.1.9. Lack of skill
A lack of skill can reflect lack of teaching or lack of practice, such
as performing an operation only rarely. Simulation and skills training offer the opportunity to teach and practise these skills in a safe environment.19
3.1.10. System error
Healthcare systems should be designed to minimise the risk of
error, conscious of how humans will always make mistakes if there
are enough choices to be made. A poorly designed system exposes
staff to unnecessary decision making steps, confuses the decision
maker with multiple distractions, allows dangerous choices to be
easily made and lacks checks and safeguards to identify errors.
System error is a complex area, covering environmental,
equipment or organisational failures.4
3.2. Identifying the errors
After an overview of these ten errors, participants learn to
identify the errors in other people’s scenarios. An effective way to
do this in a workshop setting is 'medical error bingo'. Clinicians are given bingo cards containing the ten errors, which they then have to spot while watching short video clips chosen from popular medical TV dramas (Fig. 2). An alternative is to use pre-prepared written scenarios.
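For facilitators who prepare their own materials, cards can also be generated automatically. The sketch below is illustrative only: the 2 x 5 grid, the function name and the card layout are our assumptions, not part of the published workshop pack.

```python
# Illustrative sketch: generate a 'medical error bingo' card by shuffling
# the deadly ten errors into a small grid. The 2x5 layout is an assumption
# for illustration, not part of the published workshop materials.
import random

DEADLY_TEN = [
    "Sloth", "Fixation", "Communication breakdown", "Poor team working",
    "Playing the odds", "Bravado", "Ignorance", "Mis-triage",
    "Lack of skill", "System error",
]

def make_bingo_card(rows: int = 2, cols: int = 5) -> list:
    """Return a rows x cols card containing the ten errors in random order."""
    errors = DEADLY_TEN.copy()
    random.shuffle(errors)
    return [errors[r * cols:(r + 1) * cols] for r in range(rows)]

if __name__ == "__main__":
    # Print one card, one grid row per line.
    for row in make_bingo_card():
        print(" | ".join(f"{cell:25}" for cell in row))
```

Each participant receives a card with a different ordering, so the group's attention is spread across all ten error types while watching the same clip.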
4. Switch points – understanding decision making
Once practitioners are able to categorise and understand types
of errors, the next step is to learn about the structure of a clinical
case. Decisions are not made uniformly during cases, but at specific
times, with ‘doing’ sections separating ‘deciding’ periods. We have
termed the deciding times ‘switch points’.
Decisions made at each of these points affect the clinical course,
diagnosis and outcome. The type and timing of switch points will
vary depending on the clinician, their workplace and their speciality, but common ones are shown in Fig. 3.
By identifying switch points and using a metacognitive
approach to stop and analyse the thinking process and examine for
likely cognitive biases, clinicians can detect and understand why
errors can occur.
In a workshop we use pre-written clinical scenarios with poor
outcomes. Scenarios can be obtained from our error website (www.
jround.co.uk/error). The participants are divided into groups of 4–6, each with a flip chart. Groups map out the key steps in the pre-written scenario and identify switch points (Fig. 4). Participants then discuss the cognitive processes at each switch point and explore why
people might make poor choices at these points. The groups come
together to share what they have learnt and to enable the facilitator
to correct any misunderstandings.
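Departments that want to keep an electronic record of these flip-chart maps could capture them with a small data structure. The sketch below is a hypothetical illustration: the class names, fields and the worked example are our own assumptions and do not come from the published workshop materials.

```python
# Hypothetical sketch of recording a case as 'doing' steps punctuated by
# switch points (decision moments), each annotated with the deadly-ten
# error categories the group considered. Names and example are illustrative.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class SwitchPoint:
    decision: str                 # the choice actually made
    alternatives: List[str]       # other options discussed by the group
    candidate_errors: List[str]   # deadly-ten categories that could apply

@dataclass
class CaseStep:
    description: str              # a 'doing' section of the case
    switch_point: Optional[SwitchPoint] = None

@dataclass
class CaseMap:
    title: str
    steps: List[CaseStep] = field(default_factory=list)

    def switch_points(self) -> List[SwitchPoint]:
        return [s.switch_point for s in self.steps if s.switch_point]

# Hypothetical example of a group's flip-chart output:
case = CaseMap(
    title="Breathless child, evening admission",
    steps=[
        CaseStep("Triage and initial observations",
                 SwitchPoint("Allocated to low-acuity stream",
                             ["Senior review requested"],
                             ["Mis-triage", "Playing the odds"])),
        CaseStep("Overnight ward care",
                 SwitchPoint("Deterioration attributed to original diagnosis",
                             ["Re-examine and broaden the differential"],
                             ["Fixation", "Sloth"])),
    ],
)

for sp in case.switch_points():
    print(sp.decision, "->", ", ".join(sp.candidate_errors))
```

Kept over time, such a record would also let a department see which of the deadly ten recur most often across its cases.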
Fig. 2. Bingo card for medical error bingo.

Fig. 3. Timing of common switch points.

5. Integrating into clinical practice

Having learnt how to identify errors and switch points in scenarios, the next step is to relate this to individual practice. Ultimately the aim is for clinicians to be able to do this on their own as they see their own patients prospectively. However, an effective way to start is to analyse their own cases retrospectively in a group. This step makes learning more personal. It also encourages collective ownership of safety and error, working to reduce the stigma of making mistakes.

With a facilitator supporting the group, a participant chooses and presents a case from his or her own practice which did not go well. Again using a flip chart, the clinical course is outlined and switch points are identified. The group discusses the case and is encouraged to consider different decisions that might have been made at each switch point. Effort is made to understand the reasons behind each of these decisions. Where errors were made, the facilitator invites the group to consider how they could be prevented in future.

At the end of this step, participants should be able to see how, within a real case, there are just a few key moments in which decisions are made, and that these choices can have a profound impact on the outcome. They will also understand more of why particular decisions are made, the types of errors that occur and how practice can be improved.
Fig. 4. Example of the key steps in a case, taken from a workshop.
Groups then feed back to the workshop and the facilitator draws out key messages that all can take into their own clinical practice.
5.1. Prospective integration
The steps above can all be completed in a series of teaching
sessions or a workshop, focussing on errors that have already
happened. However, the key purpose of these sessions is to avoid
errors occurring prospectively. This teaching strategy is designed to
help practitioners understand how errors can happen, that errors
happen at specific points in clinical work, and that there is a limited
range of errors that can occur. With the knowledge and techniques
learnt in the sessions, clinicians are better equipped to identify
switch points as they are occurring, and are better able to stop and
consider what errors might develop in a specific situation.
The ways in which this approach is integrated into individual practice depend on the specifics of each person's workplace and speciality. It is important that participants consider how they will do this. Some examples are shown in Box 3.
6. Discussion
Despite being aware of the dangers inherent in medical
processes, healthcare practitioners continue to make decisions that
cause harm to patients. There is now a substantial and developing
literature on medical error that highlights both the cognitive
reasons behind poor decisions and the defences that healthcare
systems can develop to minimise error. One of the remaining
challenges for medical teachers is to instill protective cognitive
strategies and error awareness into healthcare workers so that
fewer poor decisions are made.20
In this paper we have presented an approach for teaching the
basics of medical error. Either in a workshop or series of seminars,
participants learn how to identify errors in scenarios, and to
recognise them during an ongoing clinical case.
Our approach uses adult learning theory,21 which emphasises
that learners need to see the immediate relevance of the subject,
that they should use their experience as a basis for learning and
that teaching should focus on problem solving rather than content.
Box 3. Integration strategies.
Weekly ‘difficult patient’ case review sessions
Extended handover meeting
Academic ward round
Clinical governance meetings
Simulation sessions
This makes our workshops more suited to practitioners than to students.

A major challenge facing medical team leaders is the integration of error prevention into regular departmental practice. Many hospitals use a reactive approach: exhaustively examining errors that have already occurred in order to develop 'systems' to prevent recurrence. Others put error prevention into a clinical governance box, where it fails to have a real impact on day to day practice.
For an area as important as error prevention, a multifaceted,
integrated and continuous approach is needed. Following a workshop introduction, weekly meetings can be arranged. These would
look at difficult cases in a non-judgemental way, examining the
cognitive processes, and extrapolating ‘take home messages’.
This approach can be integrated into individual decision making
and be used on rounds, in clinic, in theatre, and wherever decision
making occurs. Once incorporated by a department it can
dramatically improve the quality of clinical decisions and reduce
the devastating effect of medical error.
Conflict of interest statement
None.
Acknowledgements
The authors would like to thank the trainees at St. George’s
Hospital for their contribution to the development of the Error
Workshops and Jane Round for her valuable comments on this
article.
References
1. Raine JE. An analysis of successful litigation claims in children in England. Arch Dis Child 2011 Sep;96(9):838–40.
2. Reason J. Human error: models and management. BMJ 2000 Mar 18;320(7237):768–70.
3. Kwaan MR, Studdert DM, Zinner MJ, Gawande AA. Incidence, patterns, and prevention of wrong-site surgery. Arch Surg 2006 Apr;141(4):353–7 [discussion 357–8].
4. Noble DJ, Donaldson LJ. The quest to eliminate intrathecal vincristine errors: a 40-year journey. Qual Saf Health Care 2010 Aug;19(4):323–6.
5. Croskerry P. Achieving quality in clinical decision making: cognitive strategies and detection of bias. Acad Emerg Med 2002 Nov;9(11):1184–204.
6. Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med 2003 Aug;78(8):775–80.
7. Kohn LT, Corrigan J, Donaldson MS. To err is human: building a safer health system, vol. 6. National Academies Press; 2000.
8. Gunderson AJ, Smith KM, Mayer DB, McDonald T, Centomani N. Teaching medical students the art of medical error full disclosure: evaluation of a new curriculum. Teach Learn Med 2009 Jul;21(3):229–32.
9. Halbach JL, Sullivan LL. Teaching medical students about medical errors and patient safety: evaluation of a required curriculum. Acad Med 2005 Jun;80(6):600–6.
10. Patey R, Flin R, Cuthbertson BH, MacDonald L, Mearns K, Cleland J, et al. Patient safety: helping medical students understand error in healthcare. Qual Saf Health Care 2007 Aug;16(4):256–9.
11. Leung GK, Patil NG, Ip MS. Introducing patient safety to undergraduate medical students – a pilot program delivered by health care administrators. Med Teach 2010;32(12):e547–51.
12. Pines JM. Profiles in patient safety: confirmation bias in emergency medicine. Acad Emerg Med 2006 Jan;13(1):90–4.
13. Powell SM. Creating a systems approach to patient safety through better teamwork. Biomed Instrum Technol 2006 May–Jun;40(3):205–7.
14. Sax HC, Browne P, Mayewski RJ, Panzer RJ, Hittner KC, Burke RL, et al. Can aviation-based team training elicit sustainable behavioral change? Arch Surg 2009 Dec;144(12):1133–7.
15. Varkey P, Karlapudi SP, Bennet KE. Teaching quality improvement: a collaboration project between medicine and engineering. Am J Med Qual 2008 Jul–Aug;23(4):296–301.
16. Learning from Bristol: the report of the public inquiry into children's heart surgery at the Bristol Royal Infirmary 1984–1995. Bristol Royal Infirmary Inquiry; July 2001.
17. Dunning D, Johnson K, Ehrlinger J, Kruger J. Why people fail to recognize their own incompetence. Curr Dir Psychol Sci 2003 Jun;12(3):83–7.
18. Higginson I, Hicks A. Unconscious incompetence and the foundation years. Emerg Med J 2006 Nov;23(11):887.
19. Roberts KE, Bell RL, Duffy AJ. Evolution of surgical skills training. World J Gastroenterol 2006 May 28;12(20):3219–24.
20. Croskerry P. A universal model of diagnostic reasoning. Acad Med 2009 Aug;84(8):1022–8.
21. Knowles M. The modern practice of adult education: from pedagogy to andragogy. Wilton, Connecticut: Association Press; 1980.

Further information and resources are available to download at: www.jround.co.uk/error.