Research Article
Issue Date: September 01, 2014
Published Online: October 23, 2014
Updated: January 01, 2019
Use of Simulated Patients and Reflective Video Analysis to Assess Occupational Therapy Students’ Preparedness for Fieldwork
Author Affiliations
  • Amanda K. Giles, MS, OTR/L, is Instructor, Occupational Therapy, Department of Health Professions, Medical University of South Carolina, 151 Rutledge Avenue, Building B, MSC 962, Charleston, SC 29425; gilesak@musc.edu
  • Nancy E. Carson, PhD, is Assistant Professor; Hazel L. Breland, PhD, is Assistant Professor; Patty Coker-Bolt, PhD, is Associate Professor; and Peter J. Bowman, OTD, is Assistant Professor, Department of Health Professions, Medical University of South Carolina, Charleston
Education of OTs and OTAs / Conference Proceedings
American Journal of Occupational Therapy, September/October 2014, Vol. 68, S57-S66. https://doi.org/10.5014/ajot.2014.685S03
Abstract

Educators must determine whether occupational therapy students are adequately prepared for Level II fieldwork once they have successfully completed the didactic portion of their coursework. Although studies have shown that students regard the use of video cameras and simulated patient encounters as useful tools for assessing professional and clinical behaviors, little has been published in the occupational therapy literature regarding the practical application of simulated patients or reflective video analysis. We describe a model for a final Comprehensive Practical Exam that uses both simulated patients and reflective video analysis to assess student preparedness for Level II fieldwork, and we report on student perceptions of these instructional modalities. We provide recommendations for designing, implementing, and evaluating simulated patient experiences in light of existing educational theory.

Educators must determine whether occupational therapy students are adequately prepared for Level II fieldwork once they have successfully completed the didactic portion of their coursework. Simulated clinical encounters offer a useful way to assess student preparedness for fieldwork in that they provide a glimpse of how the student might react in a clinical setting with actual patients (Rutherford-Hemming & Jennrich, 2013). If an educator does not feel that the student is ready for fieldwork, he or she can provide specific and direct feedback to facilitate the student’s insight into his or her own abilities and motivate the student to develop a self-improvement plan. The use of video for self-reflection offers advantages that traditional faculty feedback alone cannot provide; video offers a realistic and nonthreatening means for the student to reflect on his or her own performance and make changes regarding future performance.
This article describes a model for a final comprehensive practical examination that uses both simulated patients and reflective video analysis to assess student preparedness for Level II fieldwork. We report on student perceptions of these instructional modalities and provide recommendations for designing, implementing, and evaluating simulated patient experiences in light of existing educational theory.
Educational Theory
The Occupational Therapy Curriculum Design Framework at the Medical University of South Carolina, Division of Occupational Therapy (2012), states,

We believe that the occupational nature of humans drives the framework for the educational program by creating and promoting opportunities for the following: (1) active participation and quiet reflection, (2) exploration and mastery of the human and nonhuman environments, and (3) respect for all peoples of the world and their occupational needs. (p. 3)

This belief is underpinned by the principles of constructivist learning theory, which postulate that the learner is responsible for constructing his or her own knowledge from experience. Methods of constructivism can be found in transformational learning, experiential learning, situated learning, and reflective practice (Merriam, Caffarella, & Baumgartner, 2007).
Transformational learning is about “change—dramatic, fundamental change in the way we see ourselves and the world in which we live” (Merriam et al., 2007, p. 130). Graduate occupational therapy students enter their coursework with significant prior knowledge about the world; consequently, educators must not only facilitate new knowledge (informational learning) but also transform old knowledge (transformational learning). Students accomplish this change by reflecting on new experiences to organize and expand their own sense of inner meaning. Through engagement in progressive simulated clinical experiences throughout the curriculum, the student’s view of self may transform from inexperienced student to future practitioner.
Similarly, models of experiential learning state that people learn from experience, as described by John Dewey (1938); David Kolb (1983; Kolb & Kolb, 2009); and Jean Piaget, Carl Jung, and others (see, e.g., Merriam et al., 2007; Qayumi, 2001). A key concept of experiential learning is that “learning is rooted in the situation in which the person participates, not in the head of that person as intellectual concepts” (Fenwick, 2003, p. 25). In other words, educators must connect subject matter to experience to provide students with meaning and understanding. For example, measuring range of motion in a client with spinal cord injury adds meaning to what the student read in a goniometry textbook. Students need rich lab experiences, such as simulated clinical encounters, to apply concepts to clinical situations.
From a situated learning perspective, the type of experience is critical. The experience must provide the just-right challenge for the student at the just-right time in his or her learning process. For example, a fieldwork educator would not expect the same level of clinical reasoning from a student in a Level I fieldwork experience as he or she would of a student in a Level II fieldwork experience. The role of the educator in the situated learning framework is to offer experiential opportunities that require real-life problem solving and also to provide “‘just-in-time’ assistance to enable confident action in situations where confident competence is lacking” (Fenwick, 2003, p. 121). Educational opportunities that require students to synthesize and apply content from across the curriculum offer a just-right challenge to students who are about to embark on Level II fieldwork.
Finally, educators must require that the student stop to reflect on experience in order to make lasting changes in behavior, skills, attitudes, and perspectives. Reflective practice is a “deliberate pause to assume an open perspective, to allow for higher-level thinking processes” (York-Barr, Sommers, Ghere, & Montie, 2001, p. 6). As opposed to reflection (thinking about thoughts), reflective practice is critically linked to experience (thinking about experience) and to efforts to improve that experience. Engaging in reflective practice after educational experiences, particularly interactions with simulated patients, facilitates professional growth.
Transformational learning, experiential learning, situated learning, and reflective practice are interrelated and support a learner-centered approach to teaching in which the student is an active participant. The development of the Comprehensive Practical Exam (CPE) discussed in this article was guided by the principles of constructivism using simulated patient encounters and reflective video analysis as active learning modalities.
Active Learning Modalities
Simulated Patient Encounters
A simulated patient is a healthy person who is trained to play the part of a patient in a standardized way for educational purposes. Encounters with simulated patients are similar to encounters with actual patients in regard to realism and accuracy but have the added advantage of presenting physical, psychological, and emotional aspects of a disease in a textbook manner (Rutherford-Hemming & Jennrich, 2013; Tai & Chung, 2008): “The use of simulated clients provides a bridge between traditional didactic learning, case-based learning, and actual practice” (Velde, Lane, & Clay, 2009, p. e-17).
Simulation is increasingly being used by various medical professions to evaluate student performance before beginning clinical practice (Chang, Boscardin, Chou, Loeser, & Hauer, 2009; Parekh & Thorpe, 2012; Rutherford-Hemming & Jennrich, 2013; Velde et al., 2009; Weaver & Erby, 2012). Assessments using simulated patients are often designed in one of two ways: (1) as brief encounters to assess a single clinical skill, otherwise known as an objective structured clinical exam (OSCE), or (2) as complex encounters to attain a more holistic view of the student’s ability in multiple areas of evaluation and treatment, otherwise known as a simulated clinical encounter (Prislin et al., 1998). Studies have shown that students value feedback from simulated patients as much as feedback from educators (McLaughlin, Gregor, Jones, & Coderre, 2006). The use of simulated patients may require more organization, financial resources, and time than traditional lab activities but offers invaluable opportunities for experiential learning, feedback, and reflection.
Reflective Video Analysis
Ability to reflect on one’s performance in an objective, constructive manner is a professional skill that students should practice to promote lifelong learning (Maloney, Storr, Morgan, & Ilic, 2013). Reflective video analysis includes video recording of student performance followed by reflection or critique by the student and educator regarding that performance. Deeper learning outcomes have been shown when student reflective video analysis is paired with feedback from the educator (Mayer, 2003).
The use of reflective video analysis is a beneficial adjunct to encounters with simulated patients. A variety of disciplines have documented the benefits of reflective video analysis for improving clinical skills, OSCE scores, nonverbal and verbal communication, interview skills, self-assessment, and skill in addressing patient concerns (Lane & Gottlieb, 2004; Maloney et al., 2013; Perera, Mohamadou, & Kaur, 2010; Seif & Brown, 2013; Yoo, Son, Kim, & Park, 2009).
Comprehensive Practical Exam
Setting
The Comprehensive Practical Exam was conducted in the Division of Occupational Therapy at the Medical University of South Carolina (MUSC) as part of the occupational therapy education curriculum. The MUSC Institutional Review Board confirmed the exempt status of our study.
The MUSC occupational therapy program follows a seven-semester integrated curriculum. During Semesters 1–5, students complete the didactic coursework and Level I fieldwork and community experiences. During Semesters 6 and 7, students complete two 12-wk Level II fieldwork experiences. The CPE takes place in the final semester of didactic coursework before Level II fieldwork and has been conducted annually since 2010.
Participants
Two classes of 2nd-yr occupational therapy students, the classes of 2013 and 2014, individually participated in the formative CPE in the year before graduation (N = 88). For each class of students, 6 occupational therapy faculty members participated as primary evaluators and feedback communicators regarding student performance. Additionally, 6 nonfaculty occupational therapists were recruited and trained to play the part of simulated patients and secondary evaluators.
Procedure
Student Preparation for the Exam.
Each student was provided with a study guide 8 wk before taking the CPE. Learning objectives were clearly outlined and aligned with grading criteria. Successful completion of the CPE was noted as a requirement in the syllabus of a core clinical lab course. The CPE was scored on a pass–no pass basis. CPE content included all prior clinical coursework, including knowledge of evaluation and treatment of various musculoskeletal, neurological, and psychological diagnoses. In contrast to prior lab practical assessments, which focused on specific individual skills, the CPE required students to synthesize information from all clinical coursework and apply it to a medically complex patient. Students signed a release form to give consent to record course-related activities and an honor code form to acknowledge that they were not allowed to share with others information observed during the CPE.
Students were encouraged to prepare for the CPE by organizing prior course notes, reviewing the material as a whole, and thinking critically about how course content overlapped and interrelated. For a given diagnosis, students were prompted to think about possible precautions, the impact of the diagnosis on occupational performance, commonly recommended adaptive equipment, and anticipated rehabilitation potential and treatment goals. In addition, students were challenged to consider how to approach a client with multiple diagnoses and how to build therapeutic rapport. Before taking the CPE, students participated in a practice practical that allowed them to rehearse all steps in the CPE using 1st-yr occupational therapy students as simulated patients.
Training of Simulated Patients.
We trained simulated patients using a CPE training manual and two training videos that included information on role expectations for the nonfaculty therapists as simulated patients and expectations for student performance. For example, the nonfaculty therapists were asked to display one instance of unsafe behavior from a given list of choices (e.g., increase fall risk by leaning outside their base of support) to allow us to assess the student’s ability to respond to a hazardous situation. Each nonfaculty therapist was randomly paired with one faculty member from the Division of Occupational Therapy to assess student performance during the CPE. The faculty members met with the nonfaculty therapists before and after the CPE to ensure the therapists understood the expectations and to discuss student performance.
Elements of the Comprehensive Practical Exam
Grading criteria centered on five key elements: (1) evaluation, (2) treatment, (3) documentation, (4) safety and judgment, and (5) overall flow (Figure 1). Of importance was each student’s ability to demonstrate safety and clear judgment during evaluation, treatment, and documentation while maintaining a confident and well-organized approach throughout the entire patient encounter.
Figure 1.
Five key elements of the Comprehensive Practical Exam.
Evaluation.
The student was initially provided with a case diagnosis and very limited information, similar to what would be found in a quick chart review. The student was given 5 min to write down and prioritize the major areas of occupation, client factors, and performance skills that might need to be evaluated. The purpose of this activity was to ensure student understanding of the Occupational Therapy Practice Framework: Domain and Process (2nd ed.; American Occupational Therapy Association [AOTA], 2008) as it relates to occupational therapy evaluation and to determine the student’s ability to anticipate an evaluation strategy before meeting a client. We used the 2nd edition of the Framework in creating this exam; the current, 3rd edition (AOTA, 2014) was published after this project was completed.
The student was asked to evaluate the client in two of four possible areas (occupation, range of motion, strength, and sensation) and then document the results of evaluation. The purpose of this activity was to determine the student’s ability to accurately evaluate and document areas within the occupational therapy practice domain.
Treatment.
Next, the student was provided with a one-page evaluation summary that represented what an occupational therapist might document in the subjective and objective section of an evaluation note. The student was given 5 min to review the evaluation results and write an appropriate short-term goal. The purpose of this activity was to determine the student’s ability to create an appropriate, measurable, functional, and meaningful short-term goal related to the evaluation results in a timely manner.
The student was then provided with three short-term goals for which he or she was required to design and implement an appropriate client-centered treatment. The student was given 5 min to prepare for treatment using the equipment available in the exam room and 15 min to implement the treatment. The goals were purposely organized in a way that would allow the student to address all three goals within the given time period (e.g., goals for a functional transfer, attention to task, and functional activity). On the basis of the client’s response to treatment, the student was expected to grade the task approach and environment to maximize client performance. The purpose of this activity was to determine the student’s ability to prepare, implement, and modify a client-centered treatment in a safe, organized, and timely manner.
Documentation.
The student was required to document the evaluation results and an appropriate short-term goal. At the end, the student was required to document the treatment session using a SOAP note format within 15 min in a separate exam room (Borcherding & Gately, 2011). The purpose of this activity was to determine the student’s ability to provide clear, concise, and accurate documentation in a timely manner.
Safety and Judgment.
Throughout evaluation and treatment, the student was expected to adhere consistently to safety regulations, including known medical and fall prevention precautions. The student was expected to anticipate potentially hazardous situations and use sound judgment regarding the safety of client and self.
Overall Flow.
Throughout the evaluation and treatment, the student was expected to communicate with the client in a respectful manner, exhibit confidence and professionalism consistently, and educate the client regarding the purpose of treatment. The student was also expected to display empathy and compassion for the client. The student was expected to respond in an appropriate manner to two or more occurrences of unexpected behavior or verbalizations during the treatment session, such as the patient’s expression of pain, anger, sadness, or grief; to inappropriate comments or self-disclosures; or to demonstration of flat or inappropriate affect.
Use of Video and Reflective Video Analysis
The entire evaluation and treatment process was video recorded for each student using overhead camera systems containing standard-definition cameras (Sony EVI-D70, Sony Corporation, Tokyo) that were monitored in real time by technical support staff in a master control room. Within 2 days after the CPE, recordings were transferred to individual DVDs with the aid of technical support staff.
Before receiving any feedback, each student viewed his or her individual DVD on a personal laptop computer during a 45-min period. Students in the class of 2013 made individual appointments to view their video 1 wk after the CPE. To expedite the reflection and feedback process, students in the class of 2014 viewed their DVDs during regular class time 2 days after the CPE. During this time, students completed a reflective video analysis worksheet to critically analyze their performance related to evaluation, treatment, documentation, safety and judgment, and overall flow using the same grading criteria as those identified on the score sheet. Students were also encouraged to independently identify a self-improvement plan to address areas for improvement as necessary.
Grading Criteria for the Comprehensive Practical Exam
Grading criteria were developed in alignment with the AOTA Fieldwork Performance Evaluation for the Occupational Therapy Student (AOTA, 2002) and were reflected in the learning objectives in the CPE study guide. Using the CPE score sheet, the faculty member and therapist discussed observations and scored students on a pass–no pass basis in each of the five key areas of the CPE: evaluation, treatment, documentation, safety and judgment, and overall flow. A pass score indicated that the student performed at or above a satisfactory level for transitioning to Level II fieldwork. A no-pass score indicated that a remediation plan was required before beginning Level II fieldwork. Our expectations for a satisfactory level of performance were similar to what a fieldwork educator would use based on the AOTA Fieldwork Performance Evaluation for the Occupational Therapy Student (AOTA, 2002).
Faculty and Therapist Feedback Process
Feedback was withheld until after students completed their independent reflective video analysis (Figure 2). An individual faculty meeting was scheduled in which the student shared self-reflections related to personal strengths and areas for improvement and a detailed self-improvement plan if necessary. The faculty member then provided faculty and therapist feedback regarding the student’s performance on the CPE. The faculty member also commented on the accuracy of the student’s own perception of his or her performance as revealed by the reflective video analysis worksheet.
Figure 2.
Order of events for the Comprehensive Practical Exam.
Together the faculty and student finalized the student’s plan for improvement as needed. If the student received a no-pass score in any area, a formal remediation plan was established for achievement of competency. Remediation plans were individualized and highly related to the student’s level of awareness of the deficit area expressed before receiving faculty feedback. The student was typically provided mentorship regarding the deficit areas followed by a retest of performance as needed. For example, a student who displayed deficits in documentation was asked to rewrite the CPE SOAP note and then to write a separate SOAP note based on an unrelated community-based treatment activity in which the student was already involved. In some cases, students underwent the entire CPE process again with a different case study that was selected before the initial CPE was administered.
Results
Assessment of Student Perceptions
We analyzed post-CPE feedback from 23 of 42 students in the class of 2013 (55% response rate) and 42 of 46 students in the class of 2014 (91% response rate). Student ages ranged from 24 to 39 yr (mean = 26 yr). Nearly all students were female (99%), and 100% of students were U.S. residents. Of the 88 graduate occupational therapy students who took the CPE, 70 (80%) passed successfully on the first attempt, and the remaining 18 (20%) passed successfully after remediation in at least one area (evaluation, treatment, documentation, safety and judgment, or overall flow).
We collected post-CPE feedback after students received faculty feedback on performance. For students in the class of 2013, 2 open-ended survey questions were distributed and collected using SurveyMonkey (SurveyMonkey, Palo Alto, CA), a free online questionnaire and survey tool. The 2 open-ended questions asked students to identify the benefits of the CPE and offer suggestions for improvement. For students in the class of 2014, 8 additional quantitative questions were added to the open-ended questions for a total of 10 questions that were distributed and collected using REDCap™ (Research Electronic Data Capture, Vanderbilt University, Nashville, TN). REDCap is a secure, Web-based application designed exclusively to support data capture for research studies (Harris et al., 2009). Seven of the 8 quantitative questions had students rate the value of specific features of the CPE using a 5-point Likert scale (ranging from 1 = strongly agree to 5 = strongly disagree; Table 1). The final quantitative question asked students to report how they had prepared for the CPE (Figure 3).
Table 1.
Post–CPE Student Feedback (N = 42)
Statements with which students strongly agreed or agreed:
  • I valued using a simulated patient.a (n = 37, 88%)
  • I valued the participation of the nonfaculty occupational therapist.a,b (n = 39, 95%)
  • I valued the use of video for self-reflection.b (n = 31, 75%)
  • Participation in a timed practical increased my awareness of the importance of clinical time management skills.a,b (n = 35, 85%)
  • Participation in the CPE helped me prepare for Level II fieldwork. (n = 39, 93%)
  • Participation in the CPE increased my confidence level as an occupational therapy student.a (n = 30, 71%)
  • I would recommend continuation of the CPE in future.a (n = 34, 83%)
Note. CPE = comprehensive practical exam.
aQuestions asked of the class of 2014 only.
bN = 41.
Figure 3.
Results of the post–Comprehensive Practical Exam feedback: Student responses to the final quantitative question, “How did you prepare for this experience?”
Using thematic content analysis, we analyzed the students’ perspectives reflected in their responses to the open-ended questions to identify primary codes and to generate a summary statement for each code. The qualitative analytic process of thematic content analysis allowed us to identify the significant elements of students’ perspectives and is a useful approach to answering questions about the issues of most importance to a particular group of participants. The process is comparative; the various accounts of the individual participants are compared with each other to classify themes that reappear or are common in the data set (Green & Thorogood, 2004). We repeated this process for each student’s response to the questions. Recurring codes were identified and consolidated through the development of themes. Coding was validated by consensus by two faculty members (authors Breland and Giles).
Themes in Student Responses
Student responses to the quantitative questions are summarized in Table 1 and Figure 3. In response to the qualitative questions, five major themes emerged regarding the benefits of the CPE. First, participation in the CPE was a “good review for upcoming fieldwork”:
  • It made you review classes you hadn’t had in a while and practice skills that may have been a little rusty.

  • I now feel very prepared for Level II experiences.

  • The practical exam forced me to review material that I had intended on reviewing, but likely would have not reviewed due to lack of time.

Second, simulated patient encounters were “helpful” in providing a “real-life therapy session”:
  • It was beneficial to me to practice being under pressure with the stopwatch and the videotape.

  • It tests what we will actually be doing as a future occupational therapist rather than just a written examination.

Third, video self-reflection “was the best part because I could see how I actually looked”:
  • It was a great way to see my actual performance and how it related to the preconceived opinion I had of myself as a student therapist.

  • You may think that you act one way and then you watch the video and realize how you are actually coming across.

  • I was able to clearly see the mistakes I made on the video, which helped me realize areas that I could improve in prior to my fieldwork.

Fourth, students appreciated constructive feedback from both faculty and the nonfaculty clinicians as “active therapists”:
  • Really great way for professors to see how well you are doing and give you valuable feedback.

  • Faculty gave very constructive criticism and didn’t make me feel like the mistakes I made were things that would keep me from becoming a great occupational therapist.

Fifth, the CPE was a “confidence booster”:
  • It allowed me to see how capable I really am!

  • I feel like it gave me a chance to work out my clinical butterflies before fieldwork began.

  • It made me feel better knowing that I can think of activities on the spot and handle situations that might arise.

Thematic content analysis also revealed several shared recommendations for changes in the CPE. One recommendation was to find ways to lower student test anxiety:
  • My anxiety levels were definitely higher than they needed to be!

  • I think it’s important to stress that this is not a “do or die” situation and it is OK to remediate a portion.

A second recommendation was to improve the quality of the videos:

Video was helpful but I know personally I would have liked to see my face/body language from the front.

Finally, students recommended the use of simulated encounters and reflective video analysis more often in the curriculum:

Excellent way to prepare for fieldwork and wish we would have done it more.

Discussion
OSCEs and simulated clinical encounters are routinely used in many health-related education programs as a way to measure student competency at the end of their didactic curriculum (Chang et al., 2009; Parekh & Thorpe, 2012; Rutherford-Hemming & Jennrich, 2013; Tai & Chung, 2008; Velde et al., 2009; Weaver & Erby, 2012). This article highlights a model for a final comprehensive practical examination that used both simulated patients and reflective video analysis to assess occupational therapy student preparedness for Level II fieldwork.
Student feedback offered insight into how the CPE aligns with current educational theory. For example, students reported a mental shift from student to therapist (transformational learning) on successful completion of the CPE. Students referred to themselves as “a student therapist,” “future occupational therapist,” and “becoming a great OT [occupational therapist].” By requiring students to apply clinical coursework from across the curriculum to a complex clinical scenario, the CPE provided the springboard for transitioning from formal didactic work to integrated clinical work.
The nature of the student feedback also reflected the need for clinical preparation in a simulated setting (experiential learning), with expectations increasing as the student progressed through the coursework (situated learning). Students reported that acting like a therapist helped them identify what it takes to become an occupational therapist. Although the exam was designed to provide a higher level of challenge to students who had completed their didactic coursework, students expressed a desire for simulated clinical encounters earlier in the curriculum.
Finally, student feedback supported the use of reflective video analysis (reflective practice) as an organized way to identify strengths and areas for improvement before starting Level II fieldwork. Student reflections are more effective when combined with feedback from perceived experts in a way that builds confidence and a desire to improve (Merriam et al., 2007).
Implications for Occupational Therapy Educators
The role of the educator is to prepare occupational therapy students to be successful during Level II fieldwork and future clinical practice. The burden on the educator is to develop effective activities and assessment tools that nurture competent and compassionate future clinicians. Success can be measured by the student’s ability to demonstrate safety and clear judgment during evaluation, treatment, and documentation while maintaining a confident and well-organized approach throughout the entire patient encounter. The CPE is a useful tool to promote and evaluate learning in all of these areas.
We offer the following recommendations to educators interested in introducing a similar CPE within their curricula; these recommendations are based on student responses, faculty and therapist discussion, and principles of current educational theory:
  • To add valuable feedback for students, educators can recruit experienced nonfaculty occupational therapists to perform as simulated patients. Students reported high appreciation for the feedback provided by experienced nonfaculty clinicians. Experienced clinicians may require less training time and are often willing to volunteer for this role, which can eliminate the cost of a paid actor.

  • To nurture a compassionate approach to future clients, the score sheet should reflect the student’s ability to respond to the patient’s emotional and psychosocial needs (Teherani, Hauer, & O’Sullivan, 2008).

  • To preserve the meaningfulness of feedback, feedback and videos for reflective video analysis should be provided as soon as possible, perhaps by scheduling video-viewing and faculty meeting appointments ahead of time: “Capitalizing on a teaching moment shortly following a meaningful interaction with faculty could provide a more optimal opportunity to edify the teaching and learning points” (Garner, Gusberg, & Kim, 2014, p. 391).

  • To maximize learning, educators should provide descriptive feedback and opportunities for remediation so that the CPE is viewed as an assessment for learning rather than an assessment of learning. Effective feedback motivates students and provides information to help them correct performance (Crooks, 1988; Kluger & DeNisi, 1996).

  • To promote independence in self-reflection as a strategy for lifelong learning, all students should be required to perform a reflective video analysis, regardless of CPE performance. Student reflections should be combined with faculty and therapist feedback. Research has shown that when students are in control of and use feedback to modify their own performance, they are more self-regulated, resourceful, confident, and higher achieving (Chappuis & Chappuis, 2007).

  • To increase interrater reliability, scoring mechanisms should allow for clarity of student expectations (Tai & Chung, 2008). Checklists should have objective, measurable criteria that meet minimum satisfactory standards aligned with the AOTA Fieldwork Performance Evaluation for the Occupational Therapy Student (AOTA, 2002).

  • To support the significance of the CPE, the CPE can either be integrated into an existing course or constitute its own one-credit course (McWilliam & Botwinski, 2010).

  • To provide meaning and understanding to didactic course content, additional simulation and reflection experiences should be staggered throughout the curriculum in increasing levels of complexity (Fenwick, 2003).

  • To ensure the overall success of the CPE, adequate time should be allowed for the development of objectives, case scenarios, and grading criteria and for the training of simulated patients, reflection time, and faculty feedback (McWilliam & Botwinski, 2010).

Limitations
Several limitations of this study should be noted before attempting to generalize the results to other occupational therapy educational programs. This CPE was designed specifically to meet the needs of the occupational therapy educational program at MUSC and was embedded in a specific course and program curriculum. The survey used to collect student feedback was part of routine evaluation of course activities and is not a validated educational assessment. All students enrolled in the program participated in the CPE as part of a course requirement; therefore, it was not possible to compare their perceptions with the perceptions of students who participated in a different but complementary educational activity. Although the faculty involved in the CPE collaborated on the development of the student assessment form and procedures for the exam, variation may be found in the type and extent of faculty feedback provided to individual students.
Directions for Future Research
Future education research on student learning outcomes related to this CPE could look at the relationship between student scores on the CPE and scores on Level II fieldwork forms and between student CPE scores and Level II fieldwork supervisor feedback regarding student readiness. In addition, future studies could explore the relationships among faculty feedback, therapist feedback, and student self-reflections.
Acknowledgments
We thank the community occupational therapists who participated as simulated patients, and we thank the MUSC College of Health Professions’ production manager, Gary Melancon, and Professor and Assistant Dean Maralynne D. Mitcham for their editing support. This article is based on a paper presented at the American Occupational Therapy Association Education Summit, October 2013, Atlanta, GA. This article is presented in partial fulfillment of the requirements for the degree of Transitional Doctor of Occupational Therapy at the University of St. Augustine for Health Sciences.
References
American Occupational Therapy Association. (2002). Fieldwork performance evaluation for the occupational therapy student. Bethesda, MD: Author.
American Occupational Therapy Association. (2008). Occupational therapy practice framework: Domain and process (2nd ed.). American Journal of Occupational Therapy, 62, 625–683. http://dx.doi.org/10.5014/ajot.62.6.625
American Occupational Therapy Association. (2014). Occupational therapy practice framework: Domain and process (3rd ed.). American Journal of Occupational Therapy, 68(Suppl. 1), S1–S48. http://dx.doi.org/10.5014/ajot.2014.682006
Borcherding, S., & Gateley, C. A. (2011). Documentation manual for occupational therapy: Writing SOAP notes. Thorofare, NJ: Slack.
Chang, A., Boscardin, C., Chou, C. L., Loeser, H., & Hauer, K. E. (2009). Predicting failing performance on a standardized patient clinical performance examination: The importance of communication and professionalism skills deficits. Academic Medicine, 84(Suppl.), S101–S104.
Chappuis, S., & Chappuis, J. (2007). The best value in formative assessment. Educational Leadership, 65, 14–19.
Crooks, T. J. (1988). The impact of classroom evaluation practices on students. Review of Educational Research, 58, 438–481.
Dewey, J. (1938). Experience and education. New York: Simon & Schuster.
Fenwick, T. (2003). Learning through experience: Troubling orthodoxies and intersecting questions. Malabar, FL: Krieger.
Garner, M. S., Gusberg, R. J., & Kim, A. W. (2014). The positive effect of immediate feedback on medical student education during the surgical clerkship. Journal of Surgical Education, 71, 391–397. http://dx.doi.org/10.1016/j.jsurg.2013.10.009
Green, J., & Thorogood, N. (2004). Qualitative methods for health research. London: Sage.
Harris, P. A., Taylor, R., Thielke, R., Payne, J., Gonzalez, N., & Conde, J. G. (2009). Research Electronic Data Capture (REDCap)—A metadata-driven methodology and workflow process for providing translational research informatics support. Journal of Biomedical Informatics, 42, 377–381.
Kluger, A. N., & DeNisi, A. (1996). The effects of feedback interventions on performance: A historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychological Bulletin, 119, 254–284. http://dx.doi.org/10.1037/0033-2909.119.2.254
Kolb, D. A. (1983). Experiential learning: Experience as the source of learning and development. Englewood Cliffs, NJ: Prentice-Hall.
Kolb, A. Y., & Kolb, D. A. (2009). Experiential learning theory: A dynamic, holistic approach to management learning, education and development. In S. J. Armstrong & C. V. Fukami (Eds.), The SAGE handbook of management learning, education and development (pp. 42–68). London: Sage.
Lane, J. L., & Gottlieb, R. P. (2004). Improving the interviewing and self-assessment skills of medical students: Is it time to readopt videotaping as an educational tool? Ambulatory Pediatrics, 4, 244–248. http://dx.doi.org/10.1367/A03-122R1.1
Maloney, S., Storr, M., Morgan, P., & Ilic, D. (2013). The effect of student self-video of performance on clinical skill competency: A randomised controlled trial. Advances in Health Science Education, 18, 81–89. http://dx.doi.org/10.1007/s10459-012-9356-1
Mayer, R. E. (2003). The promise of multimedia learning: Using the same instructional design methods across media. Learning and Instruction, 13, 125–139. http://dx.doi.org/10.1016/S0959-4752(02)00016-6
McLaughlin, K., Gregor, L., Jones, A., & Coderre, S. (2006). Can standardized patients replace physicians as OSCE examiners? BMC Medical Education, 6, 12. http://dx.doi.org/10.1186/1472-6920-6-12
McWilliam, P., & Botwinski, C. (2010). Developing a successful nursing objective structured clinical examination. Journal of Nursing Education, 49, 36–41. http://dx.doi.org/10.3928/01484834-20090915-01
Medical University of South Carolina, Division of Occupational Therapy. (2012). Curriculum design framework. Retrieved from http://academicdepartments.musc.edu/chp/clined/ot/manual/framework.htm
Merriam, S. B., Caffarella, R. S., & Baumgartner, L. M. (2007). Learning in adulthood: A comprehensive guide. San Francisco: Wiley.
Parekh, A., & Thorpe, T. (2012). How should we teach undergraduates in simulation scenarios? Clinical Teacher, 9, 280–284. http://dx.doi.org/10.1111/j.1743-498X.2012.00552.x
Perera, J., Mohamadou, G., & Kaur, S. (2010). The use of objective structured self-assessment and peer-feedback (OSSP) for learning communication skills: Evaluation using a controlled trial. Advances in Health Sciences Education: Theory and Practice, 15, 185–193. http://dx.doi.org/10.1007/s10459-009-9191-1
Prislin, M. D., Fitzpatrick, C. F., Lie, D., Giglio, M., Radecki, S., & Lewis, E. (1998). Use of an objective structured clinical examination in evaluating student performance. Family Medicine, 30, 338–344.
Qayumi, S. (2001). Piaget and his role in problem-based learning. Journal of Investigative Surgery, 14, 63–65. http://dx.doi.org/10.1080/08941930152024165
Rutherford-Hemming, T., & Jennrich, J. A. (2013). Using standardized patients to strengthen nurse practitioner competency in the clinical setting. Nursing Education Perspectives, 34, 118–121. http://dx.doi.org/10.5480/1536-5026-34.2.118
Seif, G. A., & Brown, D. (2013). Video-recorded simulated patient interactions: Can they help develop clinical and communication skills in today’s learning environment? Journal of Allied Health, 42, e37–e44.
Tai, C. Y., & Chung, U. L. (2008). The development of a competency-based group health teaching performance examination model for BSN graduates. Journal of Nursing Research, 16, 275–285.
Teherani, A., Hauer, K. E., & O’Sullivan, P. (2008). Can simulations measure empathy? Considerations on how to assess behavioral empathy via simulations. Patient Education and Counseling, 71, 148–152. http://dx.doi.org/10.1016/j.pec.2008.01.003
Velde, B. P., Lane, H., & Clay, M. (2009). Hands on learning: The use of simulated clients in intervention cases. Journal of Allied Health, 38, e17–e21.
Weaver, M., & Erby, L. (2012). Standardized patients: A promising tool for health education and health promotion. Health Promotion Practice, 13, 169–174. http://dx.doi.org/10.1177/1524839911432006
Yoo, M. S., Son, Y. J., Kim, Y. S., & Park, J. H. (2009). Video-based self-assessment: Implementation and evaluation in an undergraduate nursing course. Nurse Education Today, 29, 585–589. http://dx.doi.org/10.1016/j.nedt.2008.12.008
York-Barr, J., Sommers, W. A., Ghere, G. S., & Montie, J. (2001). Reflective practice to improve schools: An action guide for educators. Thousand Oaks, CA: Corwin Press.
Figure 1.
Five key elements of the Comprehensive Practical Exam.
Figure 2.
Order of events for the Comprehensive Practical Exam.
Figure 3.
Results of the post–Comprehensive Practical Exam feedback: Student responses to the final quantitative question, “How did you prepare for this experience?”
Table 1.
Post–CPE Student Feedback (N = 42)

Statement | Students Who Strongly Agreed or Agreed, n (%)
I valued using a simulated patient.a | 37 (88)
I valued the participation of the nonfaculty occupational therapist.a,b | 39 (95)
I valued the use of video for self-reflection.b | 31 (75)
Participation in a timed practical increased my awareness of the importance of clinical time management skills.a,b | 35 (85)
Participation in the CPE helped me prepare for Level II fieldwork. | 39 (93)
Participation in the CPE increased my confidence level as an occupational therapy student.a | 30 (71)
I would recommend continuation of the CPE in future.a | 34 (83)

Note. CPE = comprehensive practical exam.
a. Questions asked of the class of 2014 only.
b. N = 41.