Simulation-Based Learning: Bring It Home with Active Experimentation

Simulation educators understand the important role experience plays in the learning process. They rely on the experiential learning inherent in simulation-based activities to help participants master specific content and gain insight into their own learning styles and processes. Kolb’s Experiential Learning Theory (ELT) provides the conceptual framework for many healthcare simulation programs. ELT describes a four-step cycle that includes a concrete experience (CE), reflective observation (RO), abstract conceptualization (AC) and active experimentation (AE). The model is cyclical in nature, with no fixed beginning or end; learning occurs throughout all four stages, and the theory holds that learners take away something different from each one (Kolb et al., 2014).

Unfortunately, the active experimentation step is often omitted from healthcare simulation. This omission leaves learning opportunities on the table. If active experimentation is not part of the simulation session, participants are left to their own devices as they explore the learning that occurred in the other three phases. In practice, this means students test their new knowledge in the clinical setting, often without the benefit of instructor feedback.

Experiential learning requires an intention to learn. It is defined as the process of “learning by doing” and includes key elements of action and reflection. It is an iterative process that continues to build knowledge with further experience and reflection. Two phases of the cycle – concrete experience and abstract conceptualization – involve grasping an experience or taking in information.

The other two phases – reflective observation and active experimentation – focus on transforming the experience or extending the learning beyond the simulation lab. The reflective phase of the cycle is an important part of the experiential learning process. Skilled facilitators help learners explore their responses and reactions to the activity, thereby identifying intrinsic motivation and self-knowledge (Kolb et al., 2014).

Providing feedback during the reflection process allows learners to explore changes in judgment, better understand the material presented and achieve greater retention of the information. The concrete experience and reflective observation are central components of healthcare simulation. Reflection is a process of problem solving and reconstruction of meaning that leads to new knowledge through conceptual change.

During the debriefing session, abstract conceptualization leads to new theories that need to be tested. If the active experimentation phase of Kolb’s cycle does not occur in simulation, the knowledge acquired in the other three phases must be applied in another arena or risk being lost (Chmil et al., 2015). Relegating active experimentation to the clinical environment is unfair to learners and the patients they serve.

Kolb’s theory of experiential learning is optimized when participants touch all four bases: they experience the simulation as active participants or observers; they reflect on the experience both individually and during the group debrief; they continue to think about the experience and generate new ideas from it; and ultimately, they actively test these newly formulated hypotheses in future situations.

Kolb (2015) maintains that learners can enter the cycle at any phase and that effective learning is best achieved when the learner participates in the entire cycle. Forfeiting the active experimentation phase denies learners the opportunity to apply their new knowledge in a similar situation. The insights gained from the simulation scenario and debriefing session languish until another occasion for application presents itself.

Active experimentation can take several forms in healthcare simulation, and recent research articles describe various approaches. Following the debriefing session, Maguire and White (2021) had participants immediately repeat the same simulation scenario. Learners could apply the knowledge gained from their earlier experience to improve performance and correct errors. Results from this study identified growth in all three learning domains – cognitive, affective and psychomotor. Students expressed increased confidence in their abilities after the second run of the scenario. The immediate, repetitive experiential practice helped reinforce learning and gave students an opportunity “for redemption.” Participants reported greater satisfaction when they were able to amend their behaviors and perform at a higher level.

Another team of researchers examined learners assuming different roles in the repeated scenario. Participants in the observer role moved into an active role and applied the knowledge they acquired during the first run of the scenario and the subsequent debriefing session. Zulkosky and colleagues (2021) found that nursing students demonstrated higher levels of clinical knowledge when they repeated the same scenario, and this increase in learning occurred regardless of whether the student served as an observer or an active participant.

The authors recommend that simulation educators consider presenting fewer distinct scenarios and letting students repeat scenarios for greater gains in clinical competency. Active experimentation can also be achieved by offering similar rather than identical simulation scenarios: learners apply the knowledge gained from the first scenario and debriefing session to a comparable patient profile and clinical situation. Having this happen in simulation adds another layer of protection for patients in the clinical setting.

Rather than repeating the simulation scenario, a nursing school in Australia provided an unstaffed “redo station” for students to visit after a simulation scenario and debriefing session. Student requests for “another go” after debriefing served as the catalyst for this intervention. This approach to active experimentation mirrors deliberate practice, a planned and purposeful method of practice requiring focused attention. Deliberate practice is conducted with the specific goal of improving performance and is commonly associated with psychomotor skill attainment. Sivertsen and her team (2016) brought this targeted practice approach to scenario-based simulation.

The redo station allowed active experimentation within the safety of simulation without a complete rerun of the scenario. An environment that mimicked the scenario setting was created, and students focused only on the parts of the scenario where they had struggled. Practicing from the point of error, rather than repeating the entire scenario, saved time and helped students identify their areas of greatest need. Students received the intervention positively: they found it solidified their knowledge and improved skill proficiency, which in turn increased their self-confidence.

Whether rerunning an entire scenario or providing targeted practice, active experimentation involves more than simply repeating the same activity. Not only are actions being redone; thoughts must be reconsidered to ensure that the same mistakes are not made again. For this reason, a facilitated debriefing session between the two simulation activities is of paramount importance.

Reflecting on the thoughts that led to actions displayed in the first run of the scenario helps students uncover inaccurate or incomplete thinking and discover areas for improvement. Learners form new ideas and begin thinking about future experiences. Active experimentation acknowledges the students’ need to act out what they gleaned from the previous stages and consolidate their learning (Secheresse et al., 2020).

The value of simulation-based education is attributed to the experiential learning process it provides. To optimize this pedagogy, educators should strive to include all four steps of Kolb’s cycle: a concrete experience; reflection; abstract conceptualization, or thinking about future applications; and active experimentation, trying out what was learned. Simulation educators should consider providing a means for participants to put their learning into action in the simulation lab.

As discussed, this can be accomplished by repeating the simulation scenario, allowing deliberate practice of newly acquired skills, or offering other innovative simulation-based activities. No matter the approach, active experimentation in simulation maximizes the acquisition of both theoretical knowledge and psychomotor skill, creating more confident, self-aware clinicians. Making time for active experimentation in simulation decreases the likelihood that unapplied learning will be lost. It is time well spent.

References:

  • Chmil, J. V., Turk, M., Adamson, K., & Larew, C. (2015). Effects of an Experiential Learning Simulation Design on Clinical Nursing Judgment Development. Nurse Educator, 40(5), 228–232.
  • Kolb, A. Y., Kolb, D. A., Passarelli, A., & Sharma, G. (2014). On Becoming an Experiential Educator. Simulation & Gaming, 45(2), 204–234. https://doi.org/10.1177/1046878114534383
  • Kolb, D. A. (2015). Experiential Learning: Experience as the Source of Learning and Development. Pearson Education.
  • Maguire, M. B., & White, A. (2021). Immediate repeat of a septic shock simulation: Nursing students’ lived experience. Journal of Nursing Education and Practice, 11(9), 9. https://doi.org/10.5430/jnep.v11n9p9
  • Secheresse, T., Pansu, P., & Lima, L. (2020). The Impact of Full-Scale Simulation Training Based on Kolb’s Learning Cycle on Medical Prehospital Emergency Teams. Simulation in Healthcare: The Journal of the Society for Simulation in Healthcare, 15(5), 335–340. https://doi.org/10.1097/sih.0000000000000461
  • Sivertsen, N., McNeill, L., & Müller, A. (2016). A Redo Station After Debrief Improves Learning in Undergraduate Nursing Simulation. Clinical Simulation in Nursing, 12(11), 469–472. https://doi.org/10.1016/j.ecns.2016.07.007
  • Zulkosky, K., Minchhoff, D., Dommel, L., Price, A., & Handzlik, B. M. (2021). Effect of Repeating Simulation Scenarios on Student Knowledge, Performance, Satisfaction and Self-Confidence. Clinical Simulation in Nursing, 55, 27–36. https://doi.org/10.1016/j.ecns.2021.03.004


Jeanne Carey
MEd, RN, CHSE-A
Instructional Design Specialist for Simulation
Jeanne Carey, MEd, RN, CHSE-A, is the Instructional Design Specialist for Simulation at UT Southwestern Medical Center in Dallas, Texas. She is a certified simulation educator with over 13 years of experience in all aspects of simulation, including the development and implementation of new simulation-based learning activities, professional development of simulation educators and operators, training of simulation facilitators, and recruitment and management of standardized patients. Prior to joining the simulation team at UTSW, Jeanne served as the Director of Simulation at Baylor University School of Nursing, where she and her team created the Two-Heads-Are-Better-Than-One (2HeadsR>1) strategy for role assignment in simulation. Jeanne is active in several simulation and higher education organizations.