Introduction
Traditional procedural training, with its heavier focus on factual knowledge and lesser attention to skill training, can produce graduates with poor procedural competence.1 The use of simulation-based training strengthens students' clinical skills and practices and results in a more meaningful learning experience. Competence-based assessments measure what doctors do in testing or controlled situations, while performance-based assessments measure what doctors do in real practice.2 Assessment of a student's actual performance in the labour room poses a real challenge for teachers, and assessment should balance the issues of validity and reliability. In 1998, the Accreditation Council for Graduate Medical Education (ACGME) began an initiative, called the Outcome Project, which fostered residency training focused on the development and assessment of six competencies: medical knowledge, patient care, interpersonal and communication skills, systems-based practice, professionalism, and practice-based learning and improvement.1 Among the assessment tools targeting these competencies that have evolved over the years, direct observation in the workplace has played an important role in these educational reforms.3 Many studies have shown that providing feedback to students is one of the most influential factors in their learning and achievement.
The key features of DOPS include assessment of procedural skills, evaluation of a specific patient encounter, performance of the procedure on an actual patient, and immediate feedback on performance. The data and feedback enable learners to assess themselves against important criteria as they learn to perform specific procedures. DOPS is generally trainee-led, i.e., the trainee chooses the procedure, timing, and supervisor. In the USA, the assessment of residents, and increasingly of students as well, is largely based on a model developed by the ACGME.4
Rationale of the Study
Most medical students start their careers as qualified doctors after successfully completing their final high-stakes examinations. Traditionally, such doctors have been regarded as competent enough to start working with patients immediately. However, the relationship between demonstrated competency in examinations and behaviour in actual practice appears, at the least, to be problematic. It is now known that merely undertaking postgraduate courses throughout a professional career, even on personal initiative, is not enough to remain a 'competent' practising doctor.4 The perspectives of patients and society demand that doctors meet assessment standards under their actual working conditions in any given situation. In the future, the emphasis should lie on the assessment of performance.
We planned this study with the hypothesis that simulation-based training of obstetrics postgraduates in conducting vaginal delivery leads to gains not only in competence but also in performance.
Materials and Methods
This study was carried out on fourteen postgraduates of the Department of Obstetrics and Gynecology over a period of 6 months, from March 2016 to September 2016. Approval was obtained from the institutional review board (IRB) before starting the study, and informed consent was taken from the residents before inclusion. The study used a post-test-only control group design, with participants recruited by convenience sampling from among the residents. All fourteen residents who consented were included in the study: seven were randomly assigned to group 1, the simulation-based training group, and the other seven to group 2, which underwent conventional training in the labour room.
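For illustration only, the simple random allocation described above could be reproduced as in the following minimal sketch; the resident identifiers are hypothetical placeholders, not the study participants.

```python
# A minimal sketch of simple random allocation of 14 residents into two
# equal groups of 7. Resident identifiers are hypothetical placeholders.
import random

residents = [f"R{i:02d}" for i in range(1, 15)]  # 14 consenting residents
random.shuffle(residents)                        # shuffle into a random order

group_1 = residents[:7]   # simulation-based training group
group_2 = residents[7:]   # conventional labour-room training group

print("Group 1 (simulation):", sorted(group_1))
print("Group 2 (conventional):", sorted(group_2))
```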
Study group
The postgraduates in the study group were trained in the ideal method of normal vaginal delivery on a birthing simulator. The birthing simulators used were the MamaNatalie, manufactured by Laerdal, and the S550 NOELLE Maternal and Neonatal Simulation System. The facilitator coached them on the ideal method of delivery, and normal delivery was taught using a learning guide. All aspects of normal labour were subdivided into component parts, and the management of the second and third stages, care of the newborn, counselling, and disinfection were taught. This coaching took place over 5 days, with sessions of 1 hour each, and the residents were allowed to practise and refine their skills on the models. The control group postgraduates were not coached on a simulator; they learned the skill of conducting vaginal delivery by observation in the labour room, as per the traditional method. After a gap of 15 days, the competence of both groups was assessed through an objective structured clinical examination (OSCE) in which they demonstrated the procedure of normal delivery on a mannequin. They were marked using a checklist by a competent faculty member other than the investigator.
Data collection
Five OSCE stations were set up:
Station 1: getting ready for a normal delivery
Station 2: conducting a normal delivery
Station 3: essential newborn care
Station 4: active management of the third stage
Station 5: infection prevention
Following this, the individual performance of both groups was assessed after 1 month by DOPS (direct observation of procedural skills) on patients actually delivering in the labour room, using the standardized DOPS assessment proforma for vaginal delivery. This assessment was done by another independent observer.
This structured assessment sheet was prepared after discussion with senior, experienced faculty of the department. The weightage of each component of the assessment was decided and marks were allotted accordingly. The structured assessment sheet was validated by the nodal centre before use.
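As a purely illustrative sketch of such weighted scoring (the components and weights below are hypothetical placeholders, not those of the validated proforma), a total score could be computed as follows:

```python
# A minimal sketch of weighted checklist scoring. The components and
# weights below are hypothetical, NOT the validated assessment proforma.
components = {
    # component: (marks obtained, maximum marks)
    "preparation":            (4, 5),
    "conduct of delivery":    (8, 10),
    "newborn care":           (4, 5),
    "third-stage management": (7, 10),
    "infection prevention":   (5, 5),
}

obtained = sum(score for score, _ in components.values())
maximum = sum(out_of for _, out_of in components.values())

print(f"Total: {obtained}/{maximum} ({100 * obtained / maximum:.1f}%)")
```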
For ethical reasons, the control group (group 2) was subsequently crossed over and exposed to simulation-based training.
Results
A total of 14 residents participated in the study: seven in the study arm and seven in the control arm.
Scores at the 5 OSCE stations were compared between the study group, which was exposed to simulation-based teaching, and the control group, which was not; the comparison is shown in Table 1. The mean OSCE scores of the study group were higher than those of the control group at all stations. Student's unpaired t test was used to compare the OSCE scores of the two groups.
The mean differences between the scores of the study group and the control group are shown in Table 2. The difference was statistically significant for all 5 OSCE stations (p < 0.05), indicating a significant effect of simulation-based training.
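As an illustration of this analysis, the sketch below shows how an unpaired t test between two groups of seven scores can be computed with scipy.stats.ttest_ind; the score values are hypothetical placeholders, not the data reported in Tables 1 and 2.

```python
# A minimal sketch of Student's unpaired (independent-samples) t test, as
# used to compare group scores in this study. The scores below are
# hypothetical placeholders, NOT the data reported in this study.
from scipy import stats

study_group   = [18, 17, 19, 16, 18, 17, 19]  # hypothetical station scores, n = 7
control_group = [13, 14, 12, 15, 13, 14, 12]  # hypothetical station scores, n = 7

t_statistic, p_value = stats.ttest_ind(study_group, control_group)

print(f"t = {t_statistic:.2f}, p = {p_value:.4f}")
# p < 0.05 is interpreted as a statistically significant difference in means
```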
Table 3 compares the scores of the study group and the control group on the 11 subskills assessed by DOPS. The overall score of the study group was higher than that of the control group for all subskills. Student's unpaired t test was used to compare the DOPS scores of the two groups.
Table 4 shows that the mean difference in DOPS score between the study group and the control group was statistically significant for all 11 subskills (p < 0.05), with overall DOPS performance in the study group being significantly better than in the control group.
Table 1
Table 2
Table 3
Table 4
Discussion
Recent advances in technology have positioned simulation as a powerful tool for creating more realistic, experiential learning environments, thereby helping organizations meet emerging training challenges.5 Several studies of simulation-based training have revealed its potential to serve as an alternative to real clinical practice for students and medical professionals (Bell and Kozlowski, 2007).6 Simulation techniques can be employed to enhance the learning of healthcare professionals in safe environments, without compromising patient safety, while maintaining a high degree of realism (Khan et al., 2011).7 In obstetrics in particular, simulation training may hold significant benefits for medical students and residents, who face not only strict work-hour limitations but also the emotionally charged labour and delivery ward, where it is difficult and often awkward to teach during labour with expectant parents awaiting the birth of their child (Macedonia et al., 2003).8 Obstetric simulators have been used to teach rare and catastrophic events, to improve patient safety, and to improve the competency of learners (Holmström et al., 2011).9
In a study performed by Omer and Muhammad in 2017,10 the mean OSCE score was significantly higher in the study group, which had undergone clinical teaching, than in controls (62.36 vs. 47.94, p < 0.001), which is similar to our findings.
Dumont and Hakim performed a similar study in 2015, in which residents completed a simulation curriculum; the mean OSCE score rose from 54.6% (20.5 of 37) before the curriculum to 78.1% (28.9 of 37) after it (p < .001), a statistically significant improvement.11
In a simulation-based medical education study carried out by Shah and Baig to assess the effectiveness of a medium-fidelity simulator in teaching normal vaginal delivery to medical students, as compared with the traditional method, simulation-based skill learning showed significantly better results (mean score of 8.9, compared with a mean of 5.67 in group A, p < 0.01).12
Based on such findings, the Clinical Simulation Skill Center (CSSC) at KAU obtained a vaginal delivery simulator (NOELLE) to enhance the teaching experience for both undergraduates and postgraduates. It also aimed to provide an alternative, active, and safe method of learning in place of the passive one that depends on merely observing real labour in the labour room. This simulator was designed to provide a complete birthing experience before, during, and after delivery. In our study too, the postgraduates reported that simulation-based teaching helped them gain confidence and self-esteem when they had to perform on actual patients.
Although studies describing the efficacy of these models are limited, available evidence suggests that training novices with them results in better overall performance (Deering et al., 2006).13 In a recent study conducted in Saudi Arabia, simulation was found to be effective in teaching procedural, diagnostic, and communication skills and in developing self-confidence, providing a safe and effective platform for practice without real harm (Nuzhat et al., 2014).14
Evaluation refers to the judgment or interpretation of assessment data as they relate to the utility of a curriculum. Although simulation will likely play an increasingly important role in competency assessment over time, the direct observation of learners providing care will remain a cornerstone of the assessment and evaluation process. As Carraccio and colleagues (2002)15 have noted, competency-based education and training require greater involvement by faculty because of the need for direct observation and an increased frequency and quality of formative assessment.
Assessment of a student’s actual performance in the wards or in the consulting rooms poses a real challenge for teachers. Increasing attention is being placed on this type of assessment (the highest level of Miller’s pyramid) because of its possible high consequential and predictive validity. Attempts at performance assessment have to balance issues of validity and reliability.
DOPS is a newer workplace-based assessment tool, structured to provide useful feedback to trainees and trainers. This trainee-led assessment encompasses knowledge, attitudes, behaviour, and learned skills during day-to-day surgical practice, and DOPS is the most commonly used workplace assessment instrument. DOPS was formally introduced in 2005, when it was piloted by the United Kingdom Foundation Programme.16 The Intercollegiate Surgical Curriculum Programme (ISCP) has encouraged the use of surgical DOPS, along with other assessment tools, for the evaluation of surgical trainees because of its clear, user-friendly format and its applicability to clinical, patient-based situations.
A study from the University of Toronto, Canada, found that direct observation and evaluation of competence in clinical procedures is not routinely undertaken by educational supervisors.17 This void in training evaluation can be filled by using surgical DOPS as an assessment instrument.
DOPS is a highly structured tool, most applicable to assessing the mechanistic technicalities of procedural skills. An alternative to DOPS, focused on assessing history taking and patient interaction skills, may be the global rating scale.18 A structured form of evaluation is preferable to cruder measures of assessment, as structured evaluations yield more reliable outcomes and more effective assessments.19, 20
Several studies have found a lack of rigorous testing of procedural skills.21 To address this deficiency, DOPS is designed to assess the procedural skills of surgical, medical or general practice trainees at all levels.
A drawback of DOPS is that it evaluates a specific encounter, which may not be representative of a trainee's overall performance, rather than providing a rating based on assessment over a longer period of time.22
The two cardinal components of workplace-based assessment (WPBA) are direct observation and being conducted in the workplace. In our study, we found OSCE to be one of the best assessment tools for competence achieved, and DOPS for the translation of that competence into actual performance in real-life settings. The feedback given by the assessor to the students is the great strength of this tool. The feasibility of DOPS may even be better than that of traditional assessment methods, as it is carried out during the course of routine work. Although it requires initial faculty training, some extra time, and student sensitization, there is hardly any requirement for additional infrastructure. In India, where clinical work is abundant and most trainees are overburdened with work, this may be the most appropriate developmental learning modality. Moreover, the educational impact of WPBA is high because it is based on developmental and contextual feedback. Studies of WPBA tools such as the mini-CEX and DOPS, including those by Singh T et al., Kapoor H et al., Butterworth K et al. and Ravishankar L et al.,23, 24, 25 have reported encouraging results in terms of acceptance by faculty and students, as well as feasibility.
Conclusion
In summary, simulation-based training helped the residents achieve not only competence but also performance in real-life situations: they performed well not only in controlled settings but also on actual patients. The goal of bridging the gap between the classroom and the clinical environment was fulfilled by using birthing simulators to teach vaginal delivery to postgraduates. It proved genuinely beneficial for their learning and satisfaction.
Recommendation
It is recommended that simulation-based teaching be extended to the acquisition of other basic skills in the postgraduate training curriculum. Simulation-based teaching should also be introduced into the undergraduate curriculum. Further, OSCE should be used as the assessment tool for competence and DOPS for performance.
Strengths of the Study
The direct observation of procedural skills exercise was introduced in the Department of Obstetrics and Gynecology after proper training of the assessors to score the residents on the checklist and to give appropriate, constructive feedback. The residents were also trained, through a workshop, to use the feedback, engage in reflective practice, and learn clinical skills constructively. The entire faculty of the department was involved in the project, and the residents participated enthusiastically in the assessments. Thus, this educational research was able to bring about a positive change in the learning environment of the residency program.