Indian Journal of Pharmacology
RESEARCH ARTICLE
Year : 2015  |  Volume : 47  |  Issue : 5  |  Page : 546-550
 

Structured oral examination in pharmacology for undergraduate medical students: Factors influencing its implementation


1 Department of ENT, GMERS Medical College and Hospital, Dharpur, Patan, Gujarat, India
2 Department of Pharmacology, GMERS Medical College and Hospital, Dharpur, Patan, Gujarat, India
3 Department of Pediatrics, GMERS Medical College and Hospital, Dharpur, Patan, Gujarat, India

Date of Submission: 09-Apr-2015
Date of Decision: 25-Jun-2015
Date of Acceptance: 23-Aug-2015
Date of Web Publication: 15-Sep-2015

Correspondence Address:
Dr. Ajeet Kumar Khilnani
Department of ENT, GMERS Medical College and Hospital, Dharpur, Patan, Gujarat
India

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/0253-7613.165182


Abstract

Objectives: The study aims to understand the process of, and factors influencing, the implementation of structured oral examination (SOE) for undergraduate medical students in comparison with conventional oral examination (COE) in pharmacology.
Methods: In a randomized, parallel-group study, 123 students of pharmacology were divided into two groups, SOE (n = 63) and COE (n = 60). Students of each group were subdivided into two, and four examiners conducted the viva voce individually. Three sets of questionnaires on the autonomic nervous system were prepared, each having 15 items of increasing difficulty; these were validated by subject experts and pretested. Ten minutes were allotted to each student for each viva. Feedback about the novel method was obtained from students and faculty.
Results: SOE yielded significantly lower marks than COE. There were significant inter-examiner variations in the marks awarded in both SOE and COE. Other factors influencing implementation were difficulty in structuring the viva, rigid time limits, lack of flexibility in knowledge content, monotony, and fatigue. The students did not perceive this format as different from COE but felt that it required in-depth preparation of the topic. Faculty opined that SOE led to less drift from the main topic and provided uniform coverage of topics in the given time.
Conclusion: Conducting SOE is a resource-intensive exercise. Despite structuring, inter-examiner variability was not completely eliminated. The students' performance depended on examiner-related factors such as teaching experience, vernacular language used, and lack of training. Orientation and training of examiners in assessment strategies are necessary, and the questionnaire must be standardized before SOE is implemented for summative assessment.


Keywords: Objectivity, pharmacology, reliability, structured oral examination, validity, viva voce


How to cite this article:
Khilnani AK, Charan J, Thaddanee R, Pathak RR, Makwana S, Khilnani G. Structured oral examination in pharmacology for undergraduate medical students: Factors influencing its implementation. Indian J Pharmacol 2015;47:546-50




Introduction


The conventional oral examination (COE) or viva voce is an important assessment format that allows probing of the breadth and depth of knowledge. Carefully constructed questions can test students in all cognitive domains, and the viva is also capable of assessing borderline or exceptional students.[1] Currently, this method is an integral part of formative and summative examinations in various medical universities in India.

COE is criticized for being too subjective and for being influenced by academic and nonacademic factors related to teachers and students.[2] It may largely depend on the knowledge, attitude (offering verbal/nonverbal clues and prompting), and mood of examiners, and scores also correlate with personality scores.[3] Process-related factors are leniency, central tendency, the "halo effect," and the error of contrast.[4] Studies have shown that scores are directly proportional to the number of words spoken by the examiner and the time taken.[5] Another pitfall of viva voce is the unequal distribution of time: students appearing early may be asked a greater number of questions, but as time passes fatigue sets in among examiners, so students examined last get much less time. The outcome is that marks are awarded on the basis of just 2–3 questions, which adds an element of uncertainty and chance, because different examiners use different sets of questions with varying difficulty levels.

Student-related factors include gender, accent and vocabulary used, and the ability to pick up nonverbal cues. A candidate's level of anxiety and the test environment also determine scores.[6]

These factors make COE a less reliable and valid assessment tool for a criterion-referenced system, where the intention is to ascertain the achievement of a predetermined level of knowledge. Overall, COE is also much less cost-effective and more time-consuming.[7],[8]

Despite the above limitations, the oral examination has a heuristic perspective.[9] It is flexible, driven by the student's responses, and tests several aspects of clinical competence, including the ability to defend a decision in a given clinical situation, that cannot be tested by written examinations.[10] It can evaluate depth of knowledge, application and analytical capability, ethics, and professionalism, and it can also examine students' communication skills. The oral examination gives immediate feedback to the student.[11] A correlation has been found between the viva and written components of the final summative examination in pharmacology.[12] Similarly, a longitudinal study of oral practical examinations within a medical program demonstrated internal consistency and reliability of the oral examination, with a positive correlation to in-training examination and faculty evaluation scores.[6]

To overcome the limitations of this useful tool, a few modifications have been suggested, such as chart-stimulated reviews, models, problem-based learning, objective structuring of clinical and practical examinations, the targeted viva,[9] and the structured oral examination (SOE).[13] The SOE is relatively new, and a number of studies conducted on small groups have shown it to be reasonably reliable and valid, with both faculty and students perceiving this examination tool positively.[1],[9] At present, little is known about the implementation of SOE in large batches of 150–250 students. Because SOE is a resource-intensive and time-consuming exercise, it is of utmost importance to understand its feasibility, the process of implementation, and the factors that determine whether it can be conducted in such large groups for pharmacology examinations on a regular basis.

The present study was designed with the primary objective to understand the process of implementation of structured viva voce examination and compare this method with the COE. Other objectives were to find the suitability of SOE in its present form, compare marks obtained and perceptions of students, and to obtain the feedback of faculty regarding this format.


Materials and Methods


This study was conducted on 140 second-professional-year students studying at a tertiary care hospital and medical college of North Gujarat. The topic of the oral examination was the autonomic nervous system (ANS). The students were informed of the date of the examination 15 days in advance. The marks obtained in this study were not counted toward their formative or summative assessment, and the students' participation was informed and voluntary. Written informed consent of participation and the approval of the Institutional Ethics Committee were obtained. The students were randomly divided into four groups (A–D), each of 35 students.
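The random division into four equal groups can be sketched as follows. The function name, the fixed seed, and the use of roll numbers are illustrative assumptions, not details reported in the study:

```python
import random

def allocate_groups(roll_numbers, n_groups=4, seed=2015):
    """Randomly shuffle students and deal them into equal-sized groups A-D.

    The seed is fixed only so the allocation is reproducible; the study
    does not describe its randomization mechanism (hypothetical sketch).
    """
    rng = random.Random(seed)
    shuffled = roll_numbers[:]        # copy so the caller's list is untouched
    rng.shuffle(shuffled)
    labels = "ABCD"[:n_groups]
    # Deal shuffled students round-robin into the groups
    return {labels[i]: shuffled[i::n_groups] for i in range(n_groups)}

# 140 students, as in the study, giving four groups of 35
groups = allocate_groups(list(range(1, 141)))
```

Dealing the shuffled list round-robin guarantees equal group sizes whenever the cohort divides evenly by the number of groups.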

Fifteen cards, each having three questions of the same difficulty level, were prepared by the two examiners who later conducted the structured viva (Examiners B and D). The cards were numbered from 1 to 15 in increasing level of difficulty. These cards were mailed to five subject experts for comments on the language, order, and relevance of the questions, coverage of the topic, and any other specific points; any discrepancy was resolved by consensus, and the final questionnaire was prepared on the basis of the comments received. The final questionnaire was pretested on 10 students and modified again based on their feedback.

On the day of the examination, arrangements were made so that students did not interact with each other before completion of the examination. Groups A and C gave the unstructured (traditional) viva to two different examiners, Examiner A (Professor) and Examiner C (Assistant Professor). Similarly, Groups B and D gave the structured viva to two other examiners, Examiners B and D (both Associate Professors). Ten minutes were allotted for the viva of each student. After completing the viva, each student was asked to fill in a four-item questionnaire to record their perception. The perception of the faculty was elicited through an in-depth interview.

Descriptive statistics reported were mean, SD, frequency, and percentages. The marks obtained by the students in the different viva groups were compared using the unpaired t-test, one-way ANOVA, and the post-hoc Tukey test. The perception of the students was compared using the Chi-square test. Analysis was performed using SPSS Statistics for Windows, Version 17.0 (SPSS Inc., Chicago).
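As an illustration of the group comparison, the sketch below computes Welch's unpaired t statistic from scratch. The marks are hypothetical and the Welch variant is an assumption for illustration; the study itself ran its unpaired t-test in SPSS:

```python
import math
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's unpaired t statistic and approximate degrees of freedom
    for two independent groups (does not assume equal variances)."""
    na, nb = len(a), len(b)
    va, vb = stdev(a) ** 2, stdev(b) ** 2   # sample variances
    se2 = va / na + vb / nb                 # squared standard error of the difference
    t = (mean(a) - mean(b)) / math.sqrt(se2)
    # Welch-Satterthwaite approximation for df
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Hypothetical marks out of 30 -- NOT the study's raw data
coe_marks = [19, 21, 17, 22, 18, 20, 23, 16, 19, 21]
soe_marks = [12, 10, 14, 11, 13, 9, 12, 15, 10, 11]

t, df = welch_t(coe_marks, soe_marks)
print(f"t = {t:.2f}, df = {df:.1f}")   # a large |t| indicates a significant difference
```

A large t against the appropriate t distribution corresponds to a small p value, mirroring the significant COE-versus-SOE difference reported in the Results.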


Results


Twenty-nine (Group A) and 31 (Group C) students appeared in the unstructured COE, and 31 (Group B) and 32 (Group D) students appeared in the SOE. The mean marks obtained in COE were significantly higher than those obtained in SOE [Table 1]. The marks allotted in COE varied from examiner to examiner: the marks allotted by Examiner A (mean = 9.43) were significantly lower than those allotted by Examiner C (mean = 19.00). Similarly, SOE showed marked inter-examiner variability (mean marks of Examiners B and D were 15.74 and 8.12, respectively). Each set of the questionnaire had 15 questions in order of increasing difficulty, and the difficulty levels were similar for both examiners (B and D), indicating the internal consistency of the structured viva [Table 2] and [Figure 1].
Table 1: Comparison of marks obtained by students in SOE and COE

Table 2: Comparative frequency distribution of responders according to increasing difficulty levels of questions in SOE

Figure 1: Relationship between total marks obtained by students and difficulty level of questions in structured oral examination by Examiners B and D


The students did not find it difficult to respond to the structured format, and over 90% of students understood the questions clearly. Furthermore, the students did not feel any difference between this format and their previous viva examinations [Table 3].
Table 3: Students' feedback on SOE and COE (unstructured)


The opinions and views expressed by the teachers who conducted the viva voce examination are shown in [Table 4]. They reported that this format led to less drift from the main topic and provided equal coverage of topics in the limited time allotted for students' assessment. The limitations reported were difficulty in structuring the viva from the entire syllabus, rigidity of time limits and knowledge content, and monotony for the teachers.
Table 4: Opinion of examiners (n=4) regarding SOE and COE (unstructured)



Discussion


An assessment tool must be valid, reliable, and objective.[4] Most authors agree that structuring and preplanning the viva voce lead to better validity and reliability of the viva as an assessment tool for undergraduates [14],[15] and postgraduates.[16] The results of this study show that the SOE format is acceptable to students and teachers and has internal consistency (reliability). A structured questionnaire allows allotment of marks according to a predetermined scale; the marks awarded are thus objective and evidence-based, as against the overall (subjective) award of marks in COE. There was greater variation in the mean marks allotted by the two examiners in COE (9.4–19.0) than in SOE (9.1–15.7), which shows that structuring the viva voce contents reduces the effect of contrasting examiner behaviors (stringent vs. lenient) on the marks allotted.[17] However, structuring the contents significantly reduced the marks obtained, as reported elsewhere also.[18] This reduction is not surprising, because structuring exposes all students to all types of questions (from easy to tough), whereas in the traditional viva the examiners' preferences and chance play a role (some students are asked too many easy questions). This is corroborated by the perception of the teachers, who clearly stated that SOE covers a wider breadth of the syllabus than the conventional format. It could also in part be due to the fact that not all students respond to all questions of increasing difficulty.

We found that structuring in the present form does not eliminate inter-rater variability, as has also been reported elsewhere.[1] Conversely, another study [13] found perfect agreement between the marks given by two examiners in an objective structured viva voce (OSVV), while only fair agreement was found between the marks given in OSVV and the conventional viva; however, in that study each student was subjected to both viva formats, unlike the parallel groups used here. There can be different explanations for this discrepancy. During the posttest interview of the teachers taking the structured viva (Groups B and D), it was found that one of them frequently used the regional language, while the other strictly adhered to the structured questionnaire; hence the inter-rater variability arose. The examiners of the four groups also had varied teaching experience (from 3 years to 33 years), which determined their depth and experience in evaluating students' performance. However, this is the realistic, practical situation prevailing in most Departments of Pharmacology. No examiner was formally trained in this method of assessment. Training, by organizing workshops and developing orientation manuals, is important for increasing the effectiveness of examinations.[19],[20] Developing examiners' ability to ask relevant questions in unambiguous words, so that nearly the same answer is elicited from all students, increases validity further; this aspect of faculty development is now being recognized as important.[21] Inter-rater reliability can be further enhanced by the use of a grading or scoring system.[22],[23]

Students' perception of this form of viva has been found to be encouraging.[15],[24] In this study, students did not perceive any threat from the new format and considered the structured format similar to the conventional viva with respect to understanding and responding to the questions. However, most students felt that this format required greater preparation than the conventional viva.

Availability of time and human resources are important determinants of the feasibility of an effective evaluation tool. At least four examiners are needed to conduct the university practical examination and viva voce for 100 undergraduates, and one more for every additional 50 students. It is customary in the conventional format to divide the subject into two parts according to theory papers I and II; one examiner takes the viva on part I and the other on part II. However, time often becomes a major constraint [Table 5]. What should be the appropriate duration of a structured viva voce in pharmacology? This is important because, ultimately, 150–250 students would have to be examined. In one study,[18] the authors fixed a duration of 10–15 min for a 10-item questionnaire in physiology with items of increasing difficulty; six examiners were involved, each assessing only 6–7 students. In another study, 8 min were assigned for an 8-item structured questionnaire during formative assessment in biochemistry,[13] with two examiners sitting together to conduct the structured viva. Increasing the number of examiners may not be a practical proposition because of the professional time needed, although it reduces inter-rater variability and improves reliability (agreement in allotted marks between two examiners). In a targeted viva, Rangachari [9] engaged each student for about 25 min, and thus only 5–6 students were examined in a day. We found that the arbitrary limit of 10 min is not sufficient, because students could not reach questions 14 and 15. One remedy is to reduce the number of items in the questionnaire to 10; the other is to increase the time allotted to the student. Conversely, increasing the number of questions increases inter-examiner reliability.[22] We feel that assessment of students' competence for certification (pass/fail) would require a questionnaire that includes items from various systems; therefore, reducing the number of items may become counterproductive. As shown in [Table 5], increasing the duration from 10 to 15 min would result in continued work for 8–12 h a day. If one examiner examines half of the students (say 15) by the structured viva format and the other the remaining half, the duration can be increased from 10 to 15 min, but the entire exercise could be tiring for the examiners and stressful for the examinees. At present, little is known about the impact of these modifications on the scores obtained.
Table 5: Relationship between duration of viva, number of examiners required, and total duration required for proper conduct of SOE
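The time arithmetic behind the feasibility discussion can be made explicit with a small helper. The function name and the 8-hour working day are assumptions for illustration; the student counts and viva durations are the figures discussed in the text:

```python
def exam_workload(students, minutes_per_viva, examiners, hours_per_day=8):
    """Return total viva hours, hours per examiner (examiners working in
    parallel), and the number of examination days at the given daily load."""
    total_hours = students * minutes_per_viva / 60
    hours_per_examiner = total_hours / examiners
    days = hours_per_examiner / hours_per_day
    return total_hours, hours_per_examiner, days

# 150 students, 15-min structured viva, 4 examiners (figures from the text)
total, per_examiner, days = exam_workload(150, 15, 4)
print(f"{total} h of viva in all; {per_examiner} h per examiner; {days:.2f} days")
```

With 150 students at 15 min each split across four parallel examiners, each examiner faces roughly 9.4 h of continuous viva, consistent with the text's estimate of 8–12 h of work in a day.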


A limitation of the present study is that it involved a single batch of students, examined on the ANS only. More work on several batches of students is required to ascertain the number of questions needed from the entire syllabus and the exact time duration, before the structured viva format can be implemented in final university examinations.


Conclusion


The present work suggests that the structured viva voce examination is a feasible method of assessment, and students have no difficulty with this format. Teachers opine that SOE prevents deviations from the main topic. However, its conduct is a resource-intensive exercise and requires preplanning. Factors that influence the performance of students and introduce an element of inter-rater variability are the length of teaching experience, the vernacular used, and the lack of training of teachers. Training of examiners, development of a scoring system, and ascertainment of the appropriate duration of the viva voce examination are needed before SOE can be implemented as part of the university requirement for the summative assessment of students' performance.

Financial Support and Sponsorship

Nil.

Conflicts of Interest

There are no conflicts of interest.

 
References

1. Torke S, Abraham RR, Ramnarayan K, Asha K. The impact of viva-voce examination on students' performance in theory component of the final summative examination in physiology. J Physiol Pathophysiol 2010;1:10-2.
2. Thomas CS, Mellsop G, Callender K, Crawshaw J, Ellis PM, Hall A, et al. The oral examination: A study of academic and non-academic factors. Med Educ 1993;27:433-9.
3. Memon MA, Joughin GR, Memon B. Oral assessment and postgraduate medical examinations: Establishing conditions for validity, reliability and fairness. Adv Health Sci Educ Theory Pract 2010;15:277-89.
4. Evans LR, Ingersoll RW, Smith EJ. The reliability, validity, and taxonomic structure of the oral examination. J Med Educ 1966;41:651-7.
5. Holloway PJ, Hardwick JL, Morris J, Start KB. The validity of essay and viva-voce examining techniques. Br Dent J 1967;123:227-32.
6. Iqbal IZ, Naqvi S, Abeysundara L, Narula AA. The value of oral assessments: A review. Bull R Coll Surg Engl 2010;92:1-6.
7. Oakley B, Hencken C. Oral examination assessment practices: Effectiveness and change with a first year undergraduate cohort. J Hosp Leis Sport Tourism Educ 2005;4:3-14.
8. Schuwirth LW, van der Vleuten CP. Changing education, changing assessment, changing research? Med Educ 2004;38:805-12.
9. Rangachari PK. The targeted viva. Adv Physiol Educ 2004;28:213-4.
10. Vu NV, Johnson R, Mertz SA. Oral examination: A model for its use within a clinical clerkship. J Med Educ 1981;56:665-7.
11. Pernar LI, Sullivan AM, Corso K, Breen E. Oral examination in undergraduate medical education - What is the value added in the assessment? Available from: http://www.hms.harvard.edu/sites/default/files/assets/Sites/Academy/files/Abstract%20Book%202012.pdf. [Last accessed on 2015 Apr 06].
12. Ghosh A, Mandal A, Das N, Tripathi SK, Biswas A, Bera T. Students' performance in written and viva-voce components of final summative pharmacology examination in MBBS curriculum: A critical insight. Indian J Pharmacol 2012;44:274-5.
13. Puppalwar PV, Rawekar A, Chalak A, Dhok A, Khapre M. Introduction of objectively structured viva-voce in formative assessment of medical and dental undergraduates in biochemistry. J Res Med Educ Ethics 2014;4:321-5.
14. Patel BS, Kubavat A, Piparva K. Correlation of student's performance in theory and practical of final summative pharmacology examination in MBBS curriculum: A critical insight. Natl J Physiol Pharm Pharmacol 2013;3:171-5.
15. Shenwai MR, Patil KB. Introduction of structured oral examination as a novel assessment tool to first year medical students in physiology. J Clin Diagn Res 2013;7:2544-7.
16. Schubert A, Tetzlaff JE, Tan M, Ryckman JV, Mascha E. Consistency, inter-rater reliability, and validity of 441 consecutive mock oral examinations in anesthesiology: Implications for use as a tool for assessment of residents. Anesthesiology 1999;91:288-98.
17. Anshu. The oral examination. In: Singh T, Anshu, editors. Principles of Assessment in Medical Education. Ch. 15. New Delhi: Jaypee Brothers; 2012. p. 167-79.
18. Verma A, Mahajan N, Jasani K, Patel J. Evaluation and comparison of result: Conventional viva vs structured viva. Glob Res Anal 2013;2:188-9.
19. Daelmans HE, Scherpbier AJ, Van Der Vleuten CP, Donker AJ. Reliability of clinical oral examinations re-examined. Med Teach 2001;23:422-4.
20. Des Marchais JE, Jean P, Delome P. Training in the art of asking questions at oral examinations. Ann R Coll Phys Surg Can 1989;22:213-6.
21. Rahman G. Appropriateness of using oral examination as an assessment method in medical or dental education. J Educ Ethics Dent 2011;1:46-51.
22. Pearce G, Lee G. Viva voce (oral examination) as an assessment method: Insights from marketing students. J Mark Educ 2009;31:120-30.
23. Muzzin LJ, Hart L. Oral examinations. In: Neufeld RV, editor. Assessing Clinical Competence. New York: Springer Publishing Co.; 1985.
24. Kshirsagar SV, Fulari SP. Structured oral examination - student's perspective. Anat Karnataka 2011;5:28-31.