Debriefing is a vital practice across healthcare simulation, giving learners the opportunity to re-examine their work. Because clinical simulation lets learners gain first-hand experience without posing risk to real patients, the ability to learn from mistakes is integral to successful outcomes. To help learners better identify gaps in their performance and strategies for improvement, Imperial College London’s NIHR Patient Safety Translational Research Centre developed the Objective Structured Assessment of Debriefing (OSAD) tool. This HealthySimulation.com article shares more about the tool and the importance of healthcare simulation debriefing, especially in surgery.

According to Imperial College London, empirical evidence on debriefing in surgery is surprisingly sparse. Despite debriefing being “extensively employed by other safety-critical industries,” surgical training has come to rely more on expert opinion than on primary evidence. The institution adds that this inability to measure debriefing quality has also “hindered the provision of formative feedback to trainers (i.e. the debriefers) on how to improve their debriefing practices.”

To improve feedback and debriefing practices across surgical education, OSAD gives healthcare simulation educators a structured way to assess surgical debriefing after learning experiences. The tool was built from a synthesis of the evidence base and expert opinion, and comprises eight core elements of a high-quality performance debrief: Approach, Learning Environment, Learner Engagement, Reaction, Reflection, Analysis, Diagnosis, and Application.

Each element is rated on a 5-point behaviourally anchored scale, so a debriefing can be evaluated across all eight parameters and the debriefer can receive feedback aimed at improving the quality of the debrief.

OSAD Instructions for Use

  1. You are observing and rating the facilitator’s ability to conduct a debrief (NOT the learner).
  2. Please read the entire rating form before the debriefing session starts to ensure you observe the facilitator behaviors that you are scoring.
  3. There are 8 categories (see definitions below), for which you score the facilitator on a scale of 1 (done very poorly) to 5 (done very well).
  4. To help you score, there are descriptions of the observable behaviors for scores 1, 3, and 5. If you decide a facilitator rates in between these, score them 2 or 4 accordingly.
  5. Definitions and examples of some of these behaviors are given below to guide your scoring.
  6. For the purposes of a simulation course, category 2 may only be addressed at the start of the first debrief, and therefore it is not appropriate to score this category again for all subsequent debriefs.
  7. For group debriefings, the facilitator must involve all participants in order to score 5; these descriptions therefore refer to “learner(s).”
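The scoring procedure above can be sketched in code. This is a minimal illustration, not part of the OSAD tool itself: the category list follows the eight elements named in this article, but the validation function and the simple sum used as a total are assumptions for demonstration (OSAD specifies only per-category 1–5 ratings).

```python
# Eight OSAD categories, as listed in the article.
OSAD_CATEGORIES = [
    "Approach",
    "Learning Environment",
    "Learner Engagement",
    "Reaction",
    "Reflection",
    "Analysis",
    "Diagnosis",
    "Application",
]

def validate_osad_scores(scores: dict) -> int:
    """Check that every category is rated 1-5 and return a simple total.

    The total is illustrative only; OSAD defines per-category ratings,
    not an official aggregate score.
    """
    for category in OSAD_CATEGORIES:
        rating = scores.get(category)
        if rating is None:
            raise ValueError(f"Missing rating for '{category}'")
        if rating not in (1, 2, 3, 4, 5):
            raise ValueError(f"Rating for '{category}' must be 1-5, got {rating}")
    return sum(scores[c] for c in OSAD_CATEGORIES)

# Example: a debrief rated 3 in every category except a 5 for Approach.
example = {c: 3 for c in OSAD_CATEGORIES}
example["Approach"] = 5
print(validate_osad_scores(example))  # prints 26
```

Anchoring descriptions only at 1, 3, and 5 (with raters interpolating 2 or 4) keeps the form short while still covering the full scale, which is why the code accepts all five integer values.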

Definitions and Exemplar Behaviors

Approach: Manner in which the facilitator conducts the debriefing session, their level of enthusiasm and positivity when appropriate, showing interest in the learners by establishing and maintaining rapport and finishing the session on an upbeat note.

Establishes Learning Environment: Introduction of the simulation/learning session to the learner(s) by clarifying what is expected of them during the debriefing, emphasizing ground rules of confidentiality and respect for others, and encouraging the learners to identify their own learning objectives.

Engagement of the Learners: Active involvement of all learners in the debriefing discussions, by asking open questions to explore their thinking and using silence to encourage their input, without the facilitator talking for most of the debriefing, to ensure deep rather than surface learning.

Descriptive Reflection: Self-reflection of events that occurred in the simulation/learning session in a step-by-step factual manner, clarifying any technical clinical issues at the start, to allow ongoing reflection from all learners throughout the analysis and application phases, linking to previous experiences.

Reaction: Establishing how the simulation/learning session impacted emotionally on the learners.

Analysis: Eliciting the thought processes that drove a learner’s actions, using specific examples of observable behaviors, to allow the learner to make sense of the simulation/learning session events.

Diagnosis: Enabling the learner to identify their performance gaps and strategies for improvement, targeting only behaviors that can be changed, and thus providing structured and objective feedback on the simulation/learning session.

Application: Summary of the learning points and strategies for improvement that have been identified by the learner(s) during the debrief and how these could be applied to change their future clinical practice.

More About Imperial College London

The Imperial College London Faculty of Medicine was established in August 1997 through the merger of several major west London medical schools. The oldest of these, the Westminster, dates back to the eighteenth century, when medical training was far less formalized than it is today.

Today, the institution’s vision for medicine at Imperial is to attract the world’s brightest academics and teach the world’s most able learners in modern, technically advanced environments. Medical science enjoys a distinctive home at Imperial: the College reaps extraordinary benefits from being embedded within a wider ecosystem of excellent fundamental science and from working translationally with an AHSC network of NHS partners.

The Faculty of Medicine aims to foster the most effective academic, clinical research, and education environments possible, across its campuses. Alongside its partners, the faculty will enable and support the continued evolution of Imperial as an international powerhouse of medical education, research, and innovation.

More About Debriefing

Healthcare simulation debriefing is a period of time following an experiential learning activity during which learners/teams reflect, review and discuss the activity with the goal of improving individual and team clinical skills and judgment. Following each scenario, a debriefing is conducted by one or more people such as a healthcare simulation facilitator (considered to be a content expert on the scenario subject matter). These content experts should also be skilled in debriefing, as many would argue that debriefing is the most important component of a simulation experience.

Several factors affect the nature of a healthcare simulation debriefing: the objectives of the medical simulation, the complexity of the scenario, the experience level of the learners, learners’ familiarity with the sim environment, the time available for the session, audiovisual recording systems, and the individual personalities of participants. Creating a safe learning space is a critical consideration, since participating in simulations can have a significant emotional impact on learners; establishing that space should begin with orientation and continue in the prebriefing.

Opening debriefing questions often include basic “what” or “how” questions. These questions are open-ended and should always be non-judgmental. Participants should be encouraged and made to feel that their contributions are valued. Faculty often reflect learner statements back to reiterate points or to open up a discussion. Debriefing should occur immediately following the clinical simulation. Note that debriefing is all about the learners, who should do most of the talking, not about the educator. Healthcare simulation and debriefing are used extensively to improve team communication, dynamics, and efficiency.

Learn More About the OSAD Tool

References:

Arora S, Ahmed M, Paige J, Nestel D, Runnacles J, Hull L, Darzi A, Sevdalis N. Objective Structured Assessment of Debriefing (OSAD): Bringing science to the art of debriefing in surgery. Annals of Surgery 2011 (in press)

Ahmed M, Sevdalis N, Paige J, Paragi-Gururaja R, Nestel D, Arora S. Identifying best practice guidelines for debriefing in surgery: A tri-continental study. American Journal of Surgery 2011 (in press)

Lance Baily
BA, EMT-B
Founder / CEO
Lance Baily, BA, EMT-B, is the Founder / CEO of HealthySimulation.com, which he started in 2010 while serving as the Director of the Nevada System of Higher Education’s Clinical Simulation Center of Las Vegas. Lance also founded SimGHOSTS.org, the world’s only non-profit organization dedicated to supporting professionals operating healthcare simulation technologies. His co-edited Book: “Comprehensive Healthcare Simulation: Operations, Technology, and Innovative Practice” is cited as a key source for professional certification in the industry. Lance’s background also includes serving as a Simulation Technology Specialist for the LA Community College District, EMS fire fighting, Hollywood movie production, rescue diving, and global travel. He and his wife live with their two brilliant daughters and one crazy dachshund in Las Vegas, Nevada.