
Article History

Received : 31-03-2022

Accepted : 29-04-2022





Jain, Jagzape, Ganorkar, Ujwal, and Jain: Assessment of performance over competence after simulation-based training among postgraduates of obstetrics


Introduction

Traditional procedural training, with its heavier focus on factual knowledge and less attention to skill training, can produce graduates with poor procedural competence.1 Simulation-based training strengthens students' clinical skills and practices and results in a more meaningful learning experience. Competence-based assessments measure what doctors do in testing or controlled situations, while performance-based assessments measure what doctors do in real practice.2 Assessing a student's actual performance in the labour room poses a real challenge for teachers, and such assessment must balance validity against reliability. In 1998, the Accreditation Council for Graduate Medical Education (ACGME) began an initiative, the Outcome Project, which fostered residency training focused on the development and assessment of six competencies: medical knowledge, patient care, interpersonal and communication skills, systems-based practice, professionalism, and practice-based learning and improvement.1 Among the assessment tools targeting these competencies that have evolved over the years, direct observation at the workplace has played an important role in these educational reforms.3 Many studies have shown that providing feedback to students is one of the most influential factors in their learning and achievement.

The key features of DOPS are assessment of procedural skills, evaluation of a specific patient encounter, performance of the procedure on an actual patient, and immediate feedback on performance. The data and feedback enable learners to assess themselves against important criteria as they learn to perform specific procedures. DOPS is generally led by the trainee: the trainee chooses the procedure, the timing, and the supervisor. In the USA, the assessment of residents, and increasingly of students as well, is largely based on a model developed by the Accreditation Council for Graduate Medical Education (ACGME).4

Rationale of the Study

Most medical students start their careers as qualified doctors after successfully completing the final high-stakes examinations. Traditionally, doctors have been regarded as competent enough to start working with patients immediately. Moreover, the relationship between demonstrating competency in examinations and behaviour in actual practice appears, at the least, to be problematic. It is now known that merely undertaking postgraduate courses throughout a professional career, even on personal initiative, is not enough to remain a 'competent' doctor.4 However, the perspectives of patients and society demand that doctors meet assessment standards under their working conditions in any given situation. In the future, the emphasis should therefore lie on the assessment of performance.

We planned this study with the hypothesis that, after simulation-based training of postgraduates of obstetrics in conducting vaginal delivery, performance is achieved in addition to competence.

Materials and Methods

This study was carried out on fourteen postgraduates of the Department of Obstetrics and Gynecology over a period of 6 months, from March 2016 to September 2016. Approval was sought from the institutional review board (IRB) before starting the study, and informed consent was taken from the residents before inclusion. The study used a post-test-only control group design. The study population was selected by convenience sampling from among the residents. Of the fourteen residents who consented to participate, seven were randomly assigned to group 1 and the other seven to group 2. Group 1 was the simulation-based training group, and group 2 underwent conventional training in the labour room. All residents who gave consent were included in the study.
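For illustration only (the paper does not describe the allocation mechanism beyond "randomly assigned"), such a 7/7 random split could be generated as in the sketch below; the resident IDs and the seed are hypothetical.

```python
# Hypothetical sketch of the 7/7 random allocation described above.
# Resident IDs and the seed are illustrative; the paper does not report
# the actual randomization procedure.
import random

residents = [f"R{i:02d}" for i in range(1, 15)]  # the 14 consenting residents
rng = random.Random(2016)                        # fixed seed for reproducibility
rng.shuffle(residents)

group_1 = sorted(residents[:7])  # simulation-based training group
group_2 = sorted(residents[7:])  # conventional labour-room training (control)
print("Group 1 (simulation):", group_1)
print("Group 2 (conventional):", group_2)
```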

Figure 1

Study process and flow chart

Study group

The postgraduates were exposed to the ideal method of normal vaginal delivery on a birthing simulator. The birthing simulators used were the MamaNatalie, manufactured by Laerdal, and the S550 Noelle Maternal and Neonatal Simulation System. The facilitator coached them on the ideal method of delivery. Normal delivery was taught using a learning guide. All aspects of normal labour were subdivided into various parts, and management of the second and third stages, care of the newborn, counselling, and disinfection were taught. This coaching took place over 5 days, with sessions of 1 hour each. The residents were allowed to practise and refine their skills on the models. The control group postgraduates were not coached on a simulator; they learned the skill of conducting vaginal delivery by observation, by the traditional method, in the labour room. The competence of both groups was assessed through an OSCE, by demonstration of the procedure of normal delivery on a mannequin, after a gap of 15 days. They were marked against a checklist by a competent faculty member other than the investigator.

Data collection

Five OSCE stations were set up:

  1. Station 1: getting ready for a normal delivery

  2. Station 2: conducting a normal delivery

  3. Station 3: essential newborn care

  4. Station 4: active management of the third stage

  5. Station 5: infection prevention

Following this, the individual performance of both groups was assessed after 1 month by DOPS (direct observation of procedural skills) on patients actually delivering in the labour room, using the standardized DOPS assessment proforma for vaginal delivery. This assessment was done by yet another independent observer.

This structured assessment sheet was prepared after discussion with senior, experienced faculty of the department. The weightage of each component of the case presentation was decided, and marks were allotted accordingly. The structured assessment sheet was validated by the nodal centre before use.

For ethical reasons, the control group (group 2) was subsequently crossed over and exposed to simulation-based training.

Data analysis

Statistical analysis was done using descriptive and inferential statistics. Groups were compared with Student's unpaired t test; the software used was SPSS version 17.0, and p < 0.05 was considered significant.
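For readers who wish to reproduce this analysis without SPSS, a minimal sketch using Python's scipy is shown below. The per-resident scores are hypothetical placeholders (chosen so that the group means roughly match the OSCE totals in Table 1), since only group-level summaries are reported in the paper.

```python
# Minimal sketch of the reported analysis (Student's unpaired t test,
# p < 0.05 as the significance threshold), using scipy in place of SPSS 17.0.
# The raw per-resident scores are hypothetical: the paper reports only
# group-level means and standard deviations.
from scipy import stats

study_totals   = [48, 45, 37, 40, 46, 39, 47]  # hypothetical OSCE totals, study group (n = 7)
control_totals = [35, 28, 24, 33, 36, 27, 29]  # hypothetical OSCE totals, control group (n = 7)

# equal_var=True selects the classical Student's (pooled-variance) unpaired t test.
t_stat, p_value = stats.ttest_ind(study_totals, control_totals, equal_var=True)
print(f"t = {t_stat:.2f}, df = {len(study_totals) + len(control_totals) - 2}, p = {p_value:.4f}")
print("significant" if p_value < 0.05 else "not significant")
```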

Results

A total of 14 residents participated in the study: seven in the study arm and seven in the control arm.

The comparison of the postgraduates' scores at the 5 OSCE stations between the study group, which was exposed to simulation-based teaching, and the control group, which was not, is shown in Table 1. The mean OSCE score of the study group was higher than that of the control group at every station. Student's unpaired t test was used to compare the OSCE scores of the two groups.

The mean difference between the scores of the study group and the control group is shown in Table 2. It was statistically significant for all 5 OSCE stations (p < 0.05), suggesting a significant improvement after simulation-based training.

Table 3 compares the scores of the study group and the control group on the 11 subskills assessed by DOPS. The study group's score was higher than the control group's for every subskill. Student's unpaired t test was used to compare the DOPS scores of the two groups.

Table 4 shows that the mean difference in DOPS scores between the study group and the control group was statistically significant for all 11 skills (p < 0.05), with the overall DOPS performance of the study group significantly better than that of the control group (p < 0.05).

Table 1

Comparison of OSCE scores in the two groups

| Station     | Group   | N | Mean  | Std. Deviation | Std. Error Mean |
|-------------|---------|---|-------|----------------|-----------------|
| 1st Station | Study   | 7 | 10.85 | 1.34           | 0.50            |
|             | Control | 7 | 7.14  | 0.69           | 0.26            |
| 2nd Station | Study   | 7 | 9.42  | 0.97           | 0.36            |
|             | Control | 7 | 7.00  | 0.81           | 0.30            |
| 3rd Station | Study   | 7 | 7.71  | 1.11           | 0.42            |
|             | Control | 7 | 5.28  | 1.38           | 0.52            |
| 4th Station | Study   | 7 | 7.42  | 0.78           | 0.29            |
|             | Control | 7 | 5.57  | 1.13           | 0.42            |
| 5th Station | Study   | 7 | 7.71  | 1.11           | 0.42            |
|             | Control | 7 | 5.28  | 1.11           | 0.42            |
| Total       | Study   | 7 | 43.14 | 4.84           | 1.83            |
|             | Control | 7 | 30.28 | 4.57           | 1.72            |

Table 2

Student's unpaired t test comparing OSCE scores of the study group and control group

| Station     | t    | df | p-value    | Mean Difference | Std. Error Difference | 95% CI (Lower, Upper) |
|-------------|------|----|------------|-----------------|-----------------------|-----------------------|
| 1st Station | 6.50 | 12 | 0.0001 (S) | 3.71            | 0.57                  | 2.46, 4.95            |
| 2nd Station | 5.05 | 12 | 0.0001 (S) | 2.42            | 0.48                  | 1.38, 3.47            |
| 3rd Station | 3.62 | 12 | 0.003 (S)  | 2.42            | 0.67                  | 0.96, 3.88            |
| 4th Station | 3.56 | 12 | 0.004 (S)  | 1.85            | 0.52                  | 0.72, 2.99            |
| 5th Station | 4.08 | 12 | 0.002 (S)  | 2.42            | 0.59                  | 1.13, 3.72            |
| Total       | 5.10 | 12 | 0.0001 (S) | 12.85           | 2.51                  | 7.37, 18.34           |

S: significant (p < 0.05); 95% CI: 95% confidence interval of the difference.
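As an editorial consistency check (not part of the original analysis), the first row of Table 2 can be reproduced from the summary statistics in Table 1 using the pooled-variance form of Student's unpaired t test:

\[
s_p = \sqrt{\frac{(n_1-1)s_1^2 + (n_2-1)s_2^2}{n_1+n_2-2}} = \sqrt{\frac{6(1.34)^2 + 6(0.69)^2}{12}} \approx 1.07,
\]

\[
t = \frac{\bar{x}_1 - \bar{x}_2}{s_p\sqrt{1/n_1 + 1/n_2}} = \frac{10.85 - 7.14}{1.07\sqrt{2/7}} \approx 6.5,
\]

which agrees with the tabulated t = 6.50 on df = n1 + n2 - 2 = 12; likewise, the 95% confidence interval 3.71 ± 2.179 × 0.57 ≈ (2.47, 4.95) matches the tabulated (2.46, 4.95) to rounding.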

Table 3

Comparison of DOPS scores in the study group and control group

| Skill    | Group   | N | Mean  | Std. Deviation | Std. Error Mean |
|----------|---------|---|-------|----------------|-----------------|
| Skill 1  | Study   | 7 | 7.14  | 0.69           | 0.26            |
|          | Control | 6 | 4.00  | 0.63           | 0.25            |
| Skill 2  | Study   | 7 | 7.00  | 0.81           | 0.30            |
|          | Control | 6 | 4.83  | 0.40           | 0.16            |
| Skill 3  | Study   | 7 | 7.14  | 0.69           | 0.26            |
|          | Control | 6 | 4.16  | 0.75           | 0.30            |
| Skill 4  | Study   | 7 | 7.42  | 0.53           | 0.20            |
|          | Control | 6 | 3.66  | 0.81           | 0.33            |
| Skill 5  | Study   | 7 | 7.14  | 0.69           | 0.26            |
|          | Control | 6 | 4.16  | 0.75           | 0.30            |
| Skill 6  | Study   | 7 | 7.14  | 0.69           | 0.26            |
|          | Control | 6 | 4.50  | 0.83           | 0.34            |
| Skill 7  | Study   | 7 | 7.42  | 0.53           | 0.20            |
|          | Control | 6 | 5.00  | 0.63           | 0.25            |
| Skill 8  | Study   | 7 | 7.14  | 0.69           | 0.26            |
|          | Control | 6 | 5.33  | 0.81           | 0.33            |
| Skill 9  | Study   | 7 | 7.71  | 0.75           | 0.28            |
|          | Control | 6 | 5.66  | 0.51           | 0.21            |
| Skill 10 | Study   | 7 | 7.00  | 0.81           | 0.30            |
|          | Control | 6 | 5.83  | 0.40           | 0.16            |
| Skill 11 | Study   | 7 | 7.71  | 0.75           | 0.28            |
|          | Control | 6 | 6.33  | 0.81           | 0.33            |
| Total    | Study   | 7 | 80.00 | 5.41           | 2.04            |
|          | Control | 6 | 53.50 | 5.68           | 2.32            |

Table 4

Student's unpaired t test comparing DOPS scores of the study group and control group

| Skill    | t     | df | p-value    | Mean Difference | Std. Error Difference | 95% CI (Lower, Upper) |
|----------|-------|----|------------|-----------------|-----------------------|-----------------------|
| Skill 1  | 8.501 | 11 | 0.0001 (S) | 3.14            | 0.36                  | 2.32, 3.95            |
| Skill 2  | 5.875 | 11 | 0.0001 (S) | 2.16            | 0.36                  | 1.35, 2.95            |
| Skill 3  | 7.438 | 11 | 0.0001 (S) | 2.97            | 0.40                  | 2.09, 3.85            |
| Skill 4  | 9.982 | 11 | 0.0001 (S) | 3.76            | 0.37                  | 2.93, 4.59            |
| Skill 5  | 7.438 | 11 | 0.0001 (S) | 2.97            | 0.40                  | 2.09, 3.85            |
| Skill 6  | 6.249 | 11 | 0.0001 (S) | 2.64            | 0.42                  | 1.71, 3.57            |
| Skill 7  | 7.512 | 11 | 0.0001 (S) | 2.42            | 0.32                  | 1.71, 3.14            |
| Skill 8  | 4.336 | 11 | 0.001 (S)  | 1.80            | 0.41                  | 0.89, 2.72            |
| Skill 9  | 5.594 | 11 | <0.001 (S) | 2.04            | 0.36                  | 1.24, 2.85            |
| Skill 10 | 3.164 | 11 | 0.009 (S)  | 1.16            | 0.36                  | 0.35, 1.97            |
| Skill 11 | 3.166 | 11 | 0.009 (S)  | 1.38            | 0.43                  | 0.42, 2.34            |
| Total    | 8.599 | 11 | 0.0001 (S) | 26.50           | 3.08                  | 19.71, 33.28          |

S: significant (p < 0.05); 95% CI: 95% confidence interval of the difference.

Discussion

Recent advances in technology have positioned simulation as a powerful tool for creating more realistic, experiential learning environments, thereby helping organizations meet emerging training challenges.5 Several studies of simulation-based training have revealed its potential to serve as an alternative to real clinical practice for students and medical professionals.6 Simulation techniques can be employed to enhance the learning of healthcare professionals in safe environments, without compromising patient safety, while maintaining a high degree of realism.7 In obstetrics in particular, simulation training may hold significant benefits for the training of medical students and residents, who face not only strict work-hour limitations but also the emotionally charged labour and delivery ward, where it is difficult and often awkward to teach during labour with expectant parents awaiting the birth of their child.8 Obstetric simulators have been used to teach rare and catastrophic events, to improve patient safety, and to improve the competency of learners.9

In a study performed by Omer and Muhammad in 2017,10 the mean OSCE score was statistically significantly higher in the study group, which had undergone clinical teaching, than in the controls (62.36 vs. 47.94, p < 0.001), similar to our findings.

Dumont and Hakim performed a similar study in 2015 in which residents completed a simulation curriculum; the mean OSCE score was 54.6% (20.5 of 37) before the curriculum and 78.1% (28.9 of 37) after it (p < .001), a statistically significant improvement.11

In a simulation-based medical education study carried out by Shah and Baig to assess the effectiveness of a medium-fidelity simulator in teaching normal vaginal delivery to medical students, compared with the traditional method, simulation-based skill learning showed significantly better results (mean score 8.9, compared with 5.67 in group A; p < 0.01).12

Based on this, the Clinical Simulation Skill Center (CSSC) at KAU obtained a vaginal delivery simulator (NOELLE) to enhance the teaching experience for both undergraduates and postgraduates, and to provide an alternative, active, and safe method of learning in place of passive observation of real labour in the labour room. This simulator was designed to provide a complete birthing experience before, during, and after delivery. In our study, too, the postgraduates reported that simulation-based teaching helped them gain confidence and self-esteem when they had to perform on actual patients.

Although studies describing the efficacy of these models are limited, the available evidence suggests that training novices with them results in better overall performance.13 In a recent study conducted in Saudi Arabia, simulation was found to be effective in teaching procedural skills, diagnostic skills, and communication skills, in developing self-confidence, and in providing a safe and effective platform for practice without real harm.14

Evaluation refers to the judgment or interpretation of such data as they relate to the utility of a curriculum. Although simulation will likely play an increasingly important role in competency assessment over time, the direct observation of learners providing care will remain a cornerstone of the assessment and evaluation process. As Carraccio and colleagues have noted, competency-based education and training requires greater involvement by faculty because of the need for direct observation and for increased frequency and quality of formative assessment.15

Assessment of a student's actual performance in the wards or consulting rooms poses a real challenge for teachers. Increasing attention is being paid to this type of assessment (the highest level of Miller's pyramid) because of its potentially high consequential and predictive validity. Attempts at performance assessment have to balance issues of validity and reliability.

DOPS is one of the newer workplace-based assessment tools, structured to provide useful feedback to trainees and trainers. These trainee-led programmes encompass the assessment of knowledge, attitudes, behaviour, and learned skills during day-to-day surgical practice. Direct observation of procedural skills (DOPS) is the most commonly used workplace assessment instrument; it was formally introduced in 2005, when it was piloted by the United Kingdom Foundation Programme.16 The Intercollegiate Surgical Curriculum Programme (ISCP) has encouraged the use of surgical DOPS, along with other assessment tools, for the evaluation of surgical trainees, owing to its clear, user-friendly format and its applicability to clinical, patient-based situations.

A study from the University of Toronto, Canada, found that direct observation and evaluation of competence in clinical procedures is not routinely undertaken by educational supervisors.17 This void in training evaluation can be filled by using surgical DOPS as an assessment instrument.

DOPS is a highly structured tool and is most applicable to assessing the mechanistic technicalities of procedural skills. An alternative to DOPS that focuses on assessing history-taking and patient-interaction skills may be the global rating scale.18 A structured form of evaluation is preferable to cruder measures of assessment, as structured evaluations yield more reliable outcomes and more effective assessments.19, 20

Several studies have found a lack of rigorous testing of procedural skills.21 DOPS is designed to address this deficiency by assessing the procedural skills of surgical, medical, or general practice trainees at all levels.

A drawback of DOPS is that it evaluates a specific encounter, which may not be representative of a trainee's overall performance, rather than providing a rating based on observation over a longer period of time.22

The two cardinal components of WPBA are direct observation and conduct in the workplace. In our study we found OSCE to be one of the best assessment tools for the competence achieved, and DOPS for the translation of that competence into actual performance in real-life settings. The feedback given by the assessor to the students is a great strength of this tool. The feasibility of DOPS may even be better than that of traditional assessment methods, as it is carried out during the course of routine work. Although it requires initial faculty training, some extra time, and student sensitization, there is hardly any requirement for additional infrastructure. In India, where clinical work is abundant and most trainees are actually overburdened with work, this may be the most appropriate developmental learning modality. Moreover, the educational impact of WPBA is high because it is based on developmental and contextual feedback. Studies of WPBA tools such as the mini-CEX and DOPS, and of similar tools described by Singh T et al., Kapoor H et al., Butterworth K et al., and Ravishankar L et al.,23, 24, 25 have reported encouraging results in terms of acceptance by faculty and students, as well as feasibility.

Conclusion

In summary, simulation-based training of residents helped them achieve performance, beyond competence, in real-life situations: they not only performed well in controlled settings but also performed well on patients. The goal of bridging the gap between the classroom and the clinical environment was met by using birthing simulators to teach vaginal delivery to postgraduates. It has truly been beneficial for their learning and satisfaction.

Recommendation

It is recommended that simulation-based teaching be extended to the acquisition of other basic skills in the postgraduate training curriculum, and that it also be introduced in the undergraduate curriculum. OSCE should be used as the assessment tool for competence, and DOPS for performance.

Strengths of Study

The direct observation of procedural skills exercise was introduced in the Department of Obstetrics and Gynecology after proper training of the assessors to score the residents on the checklist and to give appropriate, constructive feedback. The residents were also trained, via a workshop, to use the feedback, engage in reflective practice, and learn clinical skills constructively. The entire faculty of the department was involved in the project, and the residents participated enthusiastically in the assessment. This educational research was thus able to bring about a positive change in the learning environment of the residency programme.

Limitations

Several limitations of our study should be noted. First, the sample size was small. Second, the assessors were not exposed to any feedback-training programmes, although they were sensitized.

Source of Funding

None.

Conflicts of Interest

The authors declare no conflict of interest.

Acknowledgments

My heartfelt thanks to Dr Tripti Srivastava and all the faculty of the MCI Nodal Centre, JNMC, Sawangi, for their constant support. I thank Prof. P V Shivkumar for allowing me to conduct the research in the Department of Obstetrics and Gynecology. My heartfelt thanks to all the faculty and residents of Obstetrics and Gynecology. I also thank the principal of the nursing school, Ms Arti Wasnik, and Ms Kale for their help and for allowing me to use their skill lab and simulators for the simulation training.

References

1. Iobst WF, Sherbino J, Ten Cate O, Richardson DL, Dath D, Swing SR, et al. Competency-based medical education in postgraduate medical education. Med Teach. 2010;32(8):651-6.
2. Samsudeen A, Sulphey MM. Does performance management of medical practitioners stand apart? Int J Pharm Res. 2018;10(2):101-6.
3. Pelgrim EAM, Kramer AWM, Mokkink HGA, van den Elsen L, Grol R, van der Vleuten CPM. In-training assessment using direct observation of single-patient encounters: a literature review. Adv Health Sci Educ. 2011;16(1):131-42.
4. Kerfoot BP, Mitchell ME, Novick AC. Grappling with the evaluation of clinical competencies: a view from the Residency Review Committee for Urology. Urology. 2002;60(2):223-4.
5. Swing SR. The ACGME outcome project: retrospective and prospective. Med Teach. 2007;29(7):648-54.
6. Bell BS, Kozlowski SWJ. Advances in technology-based training. In: Managing human resources in North America. Routledge; 2012. p. 27-43.
7. Khan K, Pattison T, Sherwood M. Simulation in medical education. Med Teach. 2011;33(1):1-3.
8. Macedonia CR, Gherman RB, Satin AJ. Simulation laboratories for training in obstetrics and gynecology. Obstet Gynecol. 2003;102(2):388-92.
9. Holmström SW, Downes K, Mayer JC, Learman LA. Simulation training in an obstetric clerkship: a randomized controlled trial. Obstet Gynecol. 2011;118(3):649-54.
10. Hassan BA, Elfaki OA, Khan MA. The impact of outpatient clinical teaching on students' academic performance in obstetrics and gynecology. J Family Community Med. 2017;24(3):196-9.
11. Dumont T, Hakim J, Black A, Fleming N. Does an advanced pelvic simulation curriculum improve resident performance on a pediatric and adolescent gynecology focused objective structured clinical examination? A cohort study. J Pediatr Adolesc Gynecol. 2016;29(3):276-9.
12. Shah N, Baig L, Shah N, Hussain RP, Shah SM. Simulation based medical education; teaching normal delivery on intermediate fidelity simulator to medical students. J Pak Med Assoc. 2017;67(10):1476-81.
13. Deering SH, Hodor JG, Wylen M, Poggi S, Nielsen PE, Satin AJ. Additional training with an obstetric simulator improves medical student comfort with basic procedures. Simul Healthc. 2006;1(1):32-4.
14. Nuzhat A, Salem RO, Al Shehri FN, Al Hamdan N. Role and challenges of simulation in undergraduate curriculum. Med Teach. 2014;36(Suppl 1):S69-73.
15. Carraccio C, Wolfsthal SD, Englander R, Ferentz K, Martin C. Shifting paradigms: from Flexner to competencies. Acad Med. 2002;77(5):361-7.
16. Beard J, Strachan A, Davies H, Patterson F, Stark P, Ball S. Developing an education and assessment framework for the Foundation Programme. Med Educ. 2005;39(8):841-51.
17. Martin JA, Regehr G, Reznick R, MacRae H, Murnaghan J, Hutchison C. Objective structured assessment of technical skill (OSATS) for surgical residents. Br J Surg. 1997;84(2):273-8.
18. Essers G, Kramer A, Andriesse B, van Weel C, van der Vleuten C, van Dulmen S. Context factors in general practitioner-patient encounters and their impact on assessing communication skills: an exploratory study. BMC Fam Pract. 2013;14:65.
19. Morris A, Hewitt J, Roberts CM. Practical experience of using directly observed procedures, mini clinical evaluation examinations, and peer observation in pre-registration house officer (FY1) trainees. Postgrad Med J. 2006;82(966):285-8.
20. Carr S. The Foundation Programme assessment tools: an opportunity to enhance feedback to trainees? Postgrad Med J. 2006;82(966):576-88.
21. Sidhu RS, Grober ED, Musselman LJ, Reznick RK. Assessing competency in surgery: where to begin? Surgery. 2004;135(1):6-20.
22. Naeem N. Validity, reliability, feasibility, acceptability and educational impact of direct observation of procedural skills (DOPS). J Coll Physicians Surg Pak. 2013;23(1):77-82.
23. Singh T, Sharma M. Mini-clinical examination (CEX) as a tool for formative assessment. Natl Med J India. 2010;23(2):100-2.
24. Kapoor H, Tekian A, Mennin S. Structuring an internship programme for enhanced learning. Med Educ. 2010;44(5):501-2.
25. Singh T, Modi JN. Workplace-based assessment: a step to promote competency based postgraduate training. Indian Pediatr. 2013;50:553-9.





This is an Open Access (OA) journal, and articles are distributed under the terms of the Creative Commons Attribution 4.0 International License, which allows others to remix, and build upon the work, the licensor cannot revoke these freedoms as long as you follow the license terms.