Year: 2013 | Volume: 45 | Issue: 6 | Page: 587-592
Development of a teaching module for parenteral drug administration and objective structured practical examination stations in pharmacology
Vasudha Devi1, Prachitee Upadhye1, Pradhum Ram2, Ritesh G Menezes3
1 Department of Pharmacology, Melaka Manipal Medical College, Manipal Campus, Manipal University, Manipal, India
2 Undergraduate Medical Student, Kasturba Medical College, Mangalore, India
3 Department of Forensic Medicine and Toxicology, ESIC Medical College and PGIMSR, Bangalore, Karnataka, India
Date of Submission: 19-May-2013
Date of Decision: 11-Jul-2013
Date of Acceptance: 14-Aug-2013
Date of Web Publication: 14-Nov-2013
Source of Support: None. Conflict of Interest: None.
Objectives: Safe parenteral drug administration begins with safe preparation of the medication, and training medical students is crucial to minimizing medication administration errors. This study aimed to develop a module to teach drug preparation skills and objective structured practical examination (OSPE) stations to assess these skills. Students' perceptions of the module were also assessed.
Materials and Methods: A module was developed to teach the following skills to 2nd-year medical students: aspiration of a drug from an ampule, aspiration of a drug from a vial, aspiration of a drug in powdered form from a vial (reconstitution), and setting up an intravenous (IV) infusion. A randomized case-control design was used to establish the validity of the OSPE stations. Student volunteers were allocated to test (n = 20) and control (n = 20) groups by simple randomization. The test group watched videos of the skills and received a demonstration and a practice session before the OSPE, whereas the control group watched the videos before the OSPE and received the demonstration and practice session only afterward. Each student was assessed during the OSPE by two faculty members using a validated checklist. Mean OSPE scores of the control and test groups were compared using the independent samples t-test. Interrater reliability and concurrent validity of the stations were analyzed using the intraclass correlation coefficient (ICC) and Pearson correlation, respectively. Students' responses were expressed as median and interquartile range.
Results: The questionnaire response rate was 100%. A significant difference between the mean scores (P < 0.05) of the test and control groups confirmed the construct validity of the OSPE stations. Interrater reliability (ICC > 0.7) and concurrent validity (r > 0.7) of all the stations were high. Students' perceptions revealed that the module and OSPE stations were acceptable (median 4 on a 1-5 scale).
Conclusions: A module to teach drug preparation skills was developed, along with valid and reliable OSPE stations that were acceptable to students. The study demonstrated that students acquire better skills through teaching than by merely watching these skills in videos.
Keywords: Drug preparation skills, feedback, medical student, module, objective structured practical examination
How to cite this article:
Devi V, Upadhye P, Ram P, Menezes RG. Development of a teaching module for parenteral drug administration and objective structured practical examination stations in pharmacology. Indian J Pharmacol 2013;45:587-92
How to cite this URL:
Devi V, Upadhye P, Ram P, Menezes RG. Development of a teaching module for parenteral drug administration and objective structured practical examination stations in pharmacology. Indian J Pharmacol [serial online] 2013 [cited 2019 Nov 13];45:587-92. Available from: http://www.ijp-online.com/text.asp?2013/45/6/587/121369
 » Introduction
A medication error is a failure in the course of treatment that causes, or has the potential to cause, harm to the patient. These errors can happen at any stage of the treatment process: prescribing, transcribing, compounding or manufacturing, dispensing, administration, and monitoring of therapy. Medication error rates rise with the inexperience of the healthcare professional administering the medication.
It comes as no surprise that many such events involve recently graduated doctors. Lack of knowledge of how errors occur, failure to adhere to policy and procedure documents, distractions, lack of knowledge about the medication, dosage calculation errors, and workload are among the factors that contribute to medical errors. Poor prescribing is perhaps the most common cause of preventable medication error. Outcome-based education focuses on an educational process guided by what we expect students to achieve at the end of their training, that is, the outcomes. Hence, to deal with poor prescribing, the current medical curriculum, which includes clinical pharmacology lessons, concentrates on teaching-learning strategies that impart sound knowledge of safe and rational prescribing. Though safe medication administration has traditionally been regarded as adhering to the five rights (5Rs) of right medication, dose, time, patient, and route, adding a sixth right, the right technique in medication preparation, seems appropriate, as medication errors can happen at this stage as well. Furthermore, drug administration error is a major problem causing substantial morbidity and mortality worldwide. In a study done in India, drug administration errors accounted for 15.34% of all medication errors. These errors are significant contributors to the mortality observed with overall medication errors. Intravenous (IV) drugs are most commonly associated with drug administration errors. Wrong dilution, wrong IV infusion speed, improper dose/quantity, wrong strength, wrong drug, wrong dosage form, wrong diluent, wrong duration, and wrong time are the various forms of drug administration error that occur during parenteral administration. The financial consequences of such errors are tremendous and can directly influence the cost of healthcare.
Hence, training medical students during their initial years is crucial to minimize such medication errors.
Teaching and training medical students in rational prescribing is only part of the approach to protecting patients from medication errors. Though the nurse is considered vital in the medication administration process, studies reveal that the poor competency and knowledge of doctors also contribute to medication errors. A literature survey reveals that training students in safe drug preparation using the right technique is the least addressed area in clinical pharmacology teaching. Though interns and junior doctors work under the supervision of seniors, the level of supervision they receive in their workplace is uncertain. Training medical students in these skills and imparting knowledge about how errors occur before qualification may reduce related medication error rates after qualification.
In addition to ideal teaching strategies, an outcome-based medical curriculum should integrate identifiable and robust assessment that tests whether the identified knowledge and skills outcomes have been achieved. In addition, students' performance gives an indirect measure of teaching effectiveness. As the objective structured practical examination (OSPE) is an assessment method in which a student's competence is evaluated in a comprehensive, consistent, and structured manner, with close attention to the objectivity of the assessment, we decided to assess students using this method.
Hence, the objectives of the study were: (1) to develop a module to teach drug preparation skills, namely aspiration of a drug from an ampule, aspiration of a drug from a vial, aspiration of a drug in powdered form from a vial (reconstitution), and setting up an IV infusion; (2) to develop OSPE stations to assess students in the aforementioned skills; and (3) to assess students' perceptions regarding the module. The OSPE was also used to provide feedback to students.
 » Materials and Methods
Materials required for the study (ampules, vials, cannulas, infusion sets, water for injection, dextrose normal saline, needles (21, 23, and 18 G), syringes (2, 5, and 10 ml), sodium chloride 100 ml, and an ampule cutter) were obtained from the Kasturba Hospital Pharmacy, Manipal, India. Mannequins for demonstrating the setting up of an IV infusion were obtained from Adam Laboratory, London. Prerecorded videos demonstrating the skills we intended to teach were downloaded from the internet. Two faculty members of the pharmacology department and an external expert, a pharmacologist teaching in another medical college, reviewed the videos for face and content validity.
A questionnaire was developed to assess students' perceptions regarding the newly developed module, the OSPE stations, and the conduct of the OSPE. The items were identified through a literature search and covered the quality of instructions and organization, the quality of performance, and the authenticity and transparency of the OSPE. Two faculty members from the first author's institution, one a content expert in pharmacology and the other trained at the Foundation for Advancement of International Medical Education and Research (FAIMER) with experience in validation methods, individually checked the content validity of the questionnaire. The questionnaire was further refined using a Delphi process in a series of meetings with a panel comprising the above experts. Students were asked to respond to the questionnaire on a 5-point Likert scale after undertaking the OSPE and learning their performance from the OSPE checklist marked by the examiner.
This randomized case-control study [Figure 1] was carried out at the Department of Pharmacology, Melaka Manipal Medical College, Manipal, India during August 2011 after obtaining approval from the Ethics Committee of Kasturba Hospital, Manipal.
Figure 1: Study procedure to develop and validate a module to teach drug administration skills
Forty undergraduate medical students in the 2nd professional year of their course were recruited for the study after obtaining written informed consent.
A simple randomization procedure was used to allocate students to test (n = 20) and control (n = 20) groups [Figure 1]. To begin with, all students were shown videos of the drug preparation skills: aspirating a drug from an ampule (skill 1), aspirating a drug from a vial (skill 2), aspirating a drug from a vial containing the drug in powdered form (skill 3), and setting up an IV infusion (skill 4).
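The simple randomization described above can be sketched as follows; this is an illustrative sketch only, and the student IDs and seed are hypothetical, not taken from the study.

```python
import random

def simple_randomize(students, seed=None):
    """Simple randomization with no stratification: shuffle the roster
    and split it in half (first half -> test, second half -> control)."""
    rng = random.Random(seed)
    roster = list(students)
    rng.shuffle(roster)
    half = len(roster) // 2
    return roster[:half], roster[half:]

# 40 anonymized student IDs, mirroring the study's group sizes (IDs are illustrative)
test_group, control_group = simple_randomize(range(1, 41), seed=2011)
print(len(test_group), len(control_group))  # 20 20
```

Because every permutation of the roster is equally likely, each student has the same chance of landing in either group, which is what makes the allocation a simple (unrestricted) randomization.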
Later, students in the test group received a detailed demonstration of all the aforementioned drug preparation skills from the first author, whereas the control group received the detailed demonstration only after assessment of their skills in the OSPE. After the demonstration, students in the test group were allowed to practice under the supervision of the same faculty member (the first author), after which constructive feedback was given on their performance. Students in the test group were also given handouts of the procedures prepared using the World Health Organization (WHO) "Guide to good prescribing". Evidence for the face and content validity of the handouts was collected from the two subject experts. After 4 days of training, students were called for the OSPE, during which they were assessed using a newly developed OSPE checklist. After the OSPE, students were provided with the checklist and, having learned their performance in the OSPE stations, were asked to give their perspectives on the module and the conduct of the OSPE in the preformed questionnaire.
Students in the control group were called for the OSPE 4 days after watching the videos demonstrating the drug preparation skills. After the assessment, they received training in all four skills as described for the test group. Perspectives regarding the module and the conduct of the OSPE were obtained from the control group using the same questionnaire.
Development of OSPE Stations
OSPE checklists for the skills were prepared using the WHO "Guide to good prescribing".
Evidence was gathered for the face and content validity of the checklists. Each checklist was reviewed by a content expert in pharmacology, by a clinician, and by a pharmacologist who had completed a Foundation for Advancement of International Medical Education and Research (FAIMER) fellowship. Based on their feedback, the checklists were suitably revised and finalized by consensus.
Four OSPE stations were developed with the following instructions:
OSPE 1: Aspirate 1 ml of a drug from an ampule for intramuscular administration.
OSPE 2: Aspirate 3 ml of a drug from a vial for injection through an already placed IV line.
OSPE 3: Aspirate 2 ml of a drug from a vial containing the drug in powder form for an intramuscular injection into the gluteus maximus.
OSPE 4: Set up an IV infusion for a patient with an indwelling cannula.
At all the stations, needles of different sizes and syringes of different capacities were kept available to check students' knowledge of selecting appropriate needles and syringes. The time allotted was 4 min for the first two OSPE stations and 6 min for the next two. OSPE 1 and 2 carried a maximum of 16 marks each, whereas OSPE 3 and 4 carried 20 and 28 marks, respectively. Pilot testing of the stations was done, in which two faculty members acted as assessees and one faculty member assessed their skills. Based on the feedback of the assessees and the assessor during pilot testing, suitable changes were made to the OSPE stations and checklists.
Establishing validity of OSPE stations
Construct validity was tested by comparing the OSPE scores of students expected to have fewer skills (control group) with those of students with more skills (test group). If the OSPE measures students' skills accurately, there should be significant differences in performance between the test and control groups. Students of both groups were called for the OSPE 4 days after day 1 [Figure 1]. The 4-day delay was to avoid instant recall bias in student performance. Each student was assessed by two examiners at every station, and the student's mean score was used for comparison. The assessment was done by the first and second authors. Concurrent validity was assessed by correlating each station's OSPE score with those of the other stations. Interrater reliability was tested by comparing the scores of the two examiners at each station.
The data were analyzed using the Statistical Package for the Social Sciences (SPSS), version 16 (SPSS, Inc., Chicago, IL). OSPE scores of the control and test groups were expressed as mean ± standard deviation and compared using the independent samples t-test. A P-value of <0.05 was considered statistically significant. Interrater reliability was analyzed using the intraclass correlation coefficient (ICC). Concurrent validity of each station was assessed by correlating each OSPE station score with the other OSPE scores using Pearson correlation. Students' perception of each item in the questionnaire was first expressed as median and interquartile range. Frequency analysis of responses to each item was then done and expressed as a cumulative percentage of 'agree' and 'strongly agree' responses.
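The analysis pipeline described above (t-test for construct validity, ICC for interrater reliability, Pearson correlation for concurrent validity, median/IQR for Likert items) can be sketched in Python with NumPy and SciPy. All scores below are simulated placeholders, not the study's data, and because the paper does not report which ICC model SPSS was set to, a simple one-way random-effects, single-measure ICC(1,1) is assumed here.

```python
import numpy as np
from scipy import stats

def icc_oneway(ratings):
    """Single-measure, one-way random-effects intraclass correlation, ICC(1,1).
    ratings: (n_subjects, n_raters) array of marks for one OSPE station."""
    r = np.asarray(ratings, dtype=float)
    n, k = r.shape
    subject_means = r.mean(axis=1)
    ms_between = k * np.sum((subject_means - r.mean()) ** 2) / (n - 1)
    ms_within = np.sum((r - subject_means[:, None]) ** 2) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

rng = np.random.default_rng(0)                       # simulated scores only
test_scores = rng.normal(13, 2, 20)                  # test group, one station
control_scores = rng.normal(9, 2, 20)                # control group, same station

# Construct validity: independent samples t-test, significance at P < 0.05
t_stat, p_val = stats.ttest_ind(test_scores, control_scores)

# Interrater reliability: ICC between the two examiners' marks
ability = rng.normal(12, 2, 20)                      # latent performance quality
examiner1 = ability + rng.normal(0, 0.5, 20)
examiner2 = ability + rng.normal(0, 0.5, 20)
icc = icc_oneway(np.column_stack([examiner1, examiner2]))

# Concurrent validity: Pearson correlation between two stations' scores
station1 = ability + rng.normal(0, 1, 20)
station2 = ability + rng.normal(0, 1, 20)
r_val, _ = stats.pearsonr(station1, station2)

# Likert item: median and interquartile range, then % agree/strongly agree
likert_item = np.array([5, 4, 4, 3, 4, 5, 4, 2, 4, 4])
median = np.median(likert_item)
q1, q3 = np.percentile(likert_item, [25, 75])
pct_agree = 100 * np.mean(likert_item >= 4)
```

With six pairings of four stations, the concurrent-validity step in the study amounts to running the Pearson correlation above for each station pair, which is what Figure 2 reports.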
 » Results
The questionnaire response rate was 100% (40/40). [Table 1] shows the students' perceptions regarding the teaching module and OSPE. The median score of the majority of perception items was four, reflecting students' satisfaction with the training and assessment. However, only 55% of the students felt confident of performing these skills in the future. Items 3, 4, and 12 were negatively worded and had a median score of 3. Seventy percent of the students opined that the OSPE was more stressful than other methods of assessment (item 13).
Table 1: Students' perception (median and interquartile range (IQR)) regarding the module and objective structured practical examination (OSPE) for drug administration skills
There was a statistically significant difference between the scores of the test and control groups (P ≤ 0.001) in all four skills [Table 2], reflecting the construct validity of the OSPE stations. Interrater reliability of all the stations was high (ICC > 0.7), denoting the reliability of all the stations [Table 3]. The concurrent validity of the stations was also high, denoting that the marks obtained by each student at one station were highly correlated with the marks obtained at the other stations [Figure 2].
Figure 2: Correlation (r value) of each skill station score with other station scores. *r value > 0.7 between each pair, highly correlated.
Table 2: Comparison of mean objective structured practical examination (OSPE) scores of test and control groups
Table 3: Correlation of marks allotted by examiners 1 and 2 at each OSPE station to assess drug administration skills
 » Discussion
According to the experts, the new module was worth teaching, and the module and OSPE stations included material considered important to teach and assess, respectively; this establishes face validity. Content validity refers to how well the OSPE covers the areas of competency. In our study, the experts felt that the OSPE stations and checklist were aligned with the intended learning outcomes of the module; thus, the content validity of the OSPE stations and checklist was established. The significant difference between the scores of the control and test groups implies that the criteria for construct validity were fulfilled. In establishing construct validity, students with no practical skills would be expected to score significantly lower than those with considerable experience. In our study, before the OSPE, the control group was exposed only to videos demonstrating the skills. In the absence of training in these skills in medical school, we assumed that, in the current scenario of advanced technology, students may acquire these skills by watching freely available videos on the internet. This study demonstrated that students acquire better skills through teaching than by merely watching these skills on videos.
As expected, each individual station's score showed a high correlation with the other station scores, establishing concurrent validity. Students who performed better at one station performed similarly at the rest of the stations, and vice versa. Ideally, concurrent validity compares the assessment of performance of a task by the OSPE with the assessment of the same task by the best existing external measure available. We did not follow this, as no better alternative method was available to assess the students in the skills taught. The high interrater reliability indicates that all four stations were a reliable measure across different examiners.
The students' perceptions regarding the module and OSPE were positive, reflecting that the majority of the students were satisfied with the module and that the OSPE was acceptable to them. However, the students did not agree that the OSPE was less stressful than other types of assessment. Similar findings were noted by other studies that assessed students' perceptions of a similar examination. In our study, the students probably perceived the OSPE as stressful because they were facing it for the first time, or because of inadequate preparation. Only a few students were confident of performing these skills in the future, while the rest were not. More practice and mastery of these skills may make them more confident. A mixed-methods design, with focus group discussions involving faculty and students, could have given more insight into the students' perceptions generated through the questionnaire.
In our study, the students felt that they had been given the score they deserved and that the checklist highlighted their areas of strength and weakness. This shows that OSPE checklists, when designed systematically and provided to students, act as a tool for feedback.
In our study, concurrent validity of the OSPE could also have been established by comparing students' scores with scores from other types of assessment. However, as this study was completely anonymous, it was impossible to correlate scores with other examinations. Further evidence for reliability could have been generated from repeated scoring by each examiner using videotapes of the OSPE.
Two approaches, person-centered and system-centered, may minimize drug administration errors. Person-centered approaches focus on the individuals who make the errors: doctors, nurses, and pharmacists. Training medical students in drug preparation skills and imparting knowledge of how medication errors occur is one person-centered approach that may minimize medication errors by future doctors.
This study led to the development of a module to teach drug preparation skills that was acceptable to the students. It also contributed valid and reliable OSPE stations to test the same skills. The checklist developed to assess students in the OSPE also served as a means of providing feedback to students on their performance.
 » Acknowledgement
We thank the faculty members of the Department of Pharmacology and Ms. Sheetal Mohan of the Department of Biochemistry, MMMC, Manipal Campus, India for helping us during the development of the OSPE stations. Special thanks to the students who took part in the study. We also acknowledge the support of the nonteaching staff of our clinical skills lab.
 » References
1. Aronson JK. Medication errors: What they are, how they happen, and how to avoid them. QJM 2009;102:513-21.
2. McDowell SE, Ferner HS, Ferner RE. The pathophysiology of medication errors: How and where they arise. Br J Clin Pharmacol 2009;67:605-13.
3. Likic R, Maxwell SR. Prevention of medication errors: Teaching and training. Br J Clin Pharmacol 2009;67:656-61.
4. McBride-Henry K, Foureur M. Medication administration errors: Understanding the issues. Aust J Adv Nurs 2006;23:33-41.
5. Lai NM, Ramesh JC. The product of outcome-based undergraduate medical education: Competencies and readiness for internship. Singapore Med J 2006;47:1053-62.
6. Ross S, Loke Y. Do educational interventions improve prescribing by medical students and junior doctors? A systematic review. Br J Clin Pharmacol 2009;67:662-70.
7. Devi V. Teaching P-drug selection: Experiences from a medical school in India. Int J Pharmacol Clin Sci 2012;1:9-14.
8. Stein HG. Glass ampules and filter needles: An example of implementing the sixth 'r' in medication administration. Medsurg Nurs 2006;15:290-4.
9. Gaur S, Sinha AK, Srivastava B. Medication errors in medicine wards in a tertiary care teaching hospital of a hill state in India. Asian J Pharm Life Sci 2012;2:56-63.
10. Barker KN, Flynn EA, Pepper GA, Bates DW, Mikeal RL. Medication errors observed in 36 health care facilities. Arch Intern Med 2002;162:1897-903.
11. Rothschild JM, Keohane CA, Cook EF, Orav EJ, Burdick E, Thompson S, et al. A controlled trial of smart infusion pumps to improve medication safety in critically ill patients. Crit Care Med 2005;33:533-40.
12. Chua SS, Tea MH, Rahman MH. An observational study of drug administration errors in a Malaysian hospital (study of drug administration errors). J Clin Pharm Ther 2009;34:215-23.
13. Kumar KS, Venkateswarlu K, Ramesh A. A study of medication administration errors in a tertiary care hospital. Indian J Pharm Pract 2011;4:37-42.
14. Kohn LT, Corrigan JM, Donaldson MS. To err is human: Building a safer health system. Washington DC: National Academy Press; 1999. p. 18-20.
15. McDowell SE, Ferner HS, Ferner RE. The pathophysiology of medication errors: How and where they arise. Br J Clin Pharmacol 2009;67:605-13.
16. Rodriguez-Gonzalez CG, Herranz-Alonso A, Martin-Barbero ML, Duran-Garcia E, Durango-Limarquez MI, Hernández-Sampelayo P, et al. Prevalence of medication administration errors in two medical units with automated prescription and dispensing. J Am Med Inform Assoc 2012;19:72-8.
17. Taneja WCN, Wiegmann DA. The role of perception in medication errors: Implications for non-technological interventions. Med J Armed Forces India 2004;60:172-6.
18. Harden RM, Cairncross RG. The assessment of practical skills: The objective structured practical examination (OSPE). Stud High Educ 1980;5:187-96.
19. El-Nemer AM, Kandeel N. Using OSCE as an assessment tool for clinical skills: Nursing students' feedback. Med J Cairo Univ 2009;77:457-64.
20. Hsu CC, Sandford BA. The Delphi technique: Making sense of consensus. Pract Assess Res Eval 2007;12(10). Available from: http://pareonline.net/pdf/v12n10.pdf [Last cited on 2012 Dec 07].
21. De Vries TP, Henning RH, Hogerzeil HV, Fresle DF. Guide to good prescribing: A practical manual. Geneva: World Health Organization, Action Programme on Essential Drugs; 1994. p. 70-5.
22. Barman A. Critiques on the objective structured clinical examination. Ann Acad Med Singapore 2005;34:478-82.
23. Shehmar M, Cruikshank M, Finn C, Redman C, Fraser I, Peilee E. A validity study of the national UK colposcopy objective structured clinical examination: Is it a test fit for purpose? BJOG 2009;116:1796-9.
24. Raj N, Badcock JL, Brown GA, Deighton CM, O'Reilly SC. Design and validation of 2 objective structured clinical examination stations to assess core undergraduate examination skills of the hand and knee. J Rheumatol 2007;34:421-4.
25. Pierre RB, Wierenga A, Barton M, Branday JM, Christie CD. Student evaluation of an OSCE in paediatrics at the University of the West Indies, Jamaica. BMC Med Educ 2004;4:22. Available from: http://www.biomedcentral.com/content/pdf/1472-6920-4-22.pdf [Last cited on 2012 Sep].
This article has been cited by:
1. Bhagat OL, Bhandari B, Mehta B, Sircar S. Objective structured practical examination and conventional practical examination: A comparison of scores. Med Sci Educ 2014.