RESEARCH ARTICLE

Year: 2016 | Volume: 48 | Issue: 7 | Page: 52-56
Introducing structured viva voce examination in medical undergraduate pharmacology: A pilot study
DC Dhasmana, Suman Bala, Rajendra Sharma, Taruna Sharma, Saurabh Kohli, Neeraj Aggarwal, Juhi Kalra
Department of Pharmacology, Himalayan Institute of Medical Sciences, Swami Rama Himalayan University, Jolly Grant, Dehradun, Uttarakhand, India
Date of Submission: 09-Jan-2016
Date of Acceptance: 05-Jun-2016
Date of Web Publication: 02-Nov-2016
Correspondence Address: D C Dhasmana, Department of Pharmacology, Himalayan Institute of Medical Sciences, Swami Rama Himalayan University, Jolly Grant, Dehradun, Uttarakhand, India
Source of Support: None, Conflict of Interest: None
DOI: 10.4103/0253-7613.193308
Objective: Viva voce examination is an important evaluation tool in medical examinations but is marred by high subjectivity. This gross subjectivity can be reduced by structuring the examination. Materials and Methods: The marks obtained in theory and viva voce (traditional viva voce examination [TVVE]) by II MBBS students in the I sessional examination were compared, and a huge disparity was identified. A structured viva voce examination (SVVE) was then proposed and tried as an objective, standardized alternative. Sets of equitable question cards for SVVE were prepared, each having eight two-part questions arranged in order of increasing difficulty, covering different domains of learning, with marks assigned accordingly. The percentage variation between viva voce and theory marks was calculated for both TVVE and SVVE, and students were grouped by variation as Group I (+100 to +51%), Group II (+50 to −50%), Group III (−51 to −100%), Group IV (−101 to −150%), Group V (−151 to −200%), and Group VI (< −200%), labeled inappropriate, appropriate, inappropriate, erroneous, more erroneous, and most erroneous, respectively. Students' feedback on the SVVE was also obtained. Results: In TVVE (n = 128), the students were distributed as none, 17.2%, 23.4%, 22.7%, 11.7%, and 25% across Groups I-VI, in contrast to SVVE (n = 107) with 7.5%, 57.9%, 19.6%, 6.5%, 5.6%, and 2.8%, respectively. The marked disparity of TVVE was annulled with SVVE. Students' feedback was quite encouraging, with 83% overall acceptability, and almost 66% preferred SVVE. Conclusion: SVVE was more realistic than TVVE, and most students favored this approach. Key message: Structured viva voce examination (SVVE) is better and more realistic than traditional viva voce examination (TVVE). SVVE reduces the subjectivity of the viva, adds uniformity to assessment, and assesses the higher domains of learning and communication.
Keywords: Learning and assessment, structured viva, theory and practical examinations
How to cite this article: Dhasmana D C, Bala S, Sharma R, Sharma T, Kohli S, Aggarwal N, Kalra J. Introducing structured viva voce examination in medical undergraduate pharmacology: A pilot study. Indian J Pharmacol 2016;48, Suppl S1:52-6
How to cite this URL: Dhasmana D C, Bala S, Sharma R, Sharma T, Kohli S, Aggarwal N, Kalra J. Introducing structured viva voce examination in medical undergraduate pharmacology: A pilot study. Indian J Pharmacol [serial online] 2016 [cited 2023 Sep 21];48, Suppl S1:52-6. Available from: https://www.ijp-online.com/text.asp?2016/48/7/52/193308
Viva voce examination is an integral part of the assessment system in undergraduate medical practical examinations in India.[1],[2],[3] In ancient times, in the Gurukuls, it was the most trusted way to judge the higher domains of learning. Over time, other tools were added to the assessment, and the importance of the viva voce gradually faded. In the undergraduate medical education system in India, the curricular guidelines of the Medical Council of India lay general emphasis on various methods of assessing knowledge, skills, attitude, and communication.[4] In fact, in the university examination in pharmacology, 10% of the total marks are assigned to the traditional oral examination (traditional viva voce examination [TVVE]). It consists of a grand viva voce examination focused primarily on the theory course.
Viva voce examination per se focuses on comprehension and application of basic knowledge and concepts and their implicit logical confirmation. It is aimed precisely at analyzing, creating, and evaluating the real depth of knowledge at the higher levels of the cognitive domain.[3] Further, it effectively assesses the attitude, communication, and convincing power of the affective domain.[1] It has high face validity and assesses what cannot be assessed by a written examination.[3],[5] This is possible only when the tool is used thoughtfully, rationally, objectively, and relevantly.
There appears to be variation in what constitutes the "traditional" oral examination (TVVE).[1],[3],[6],[7] Ideally, the viva voce assessment should match the theory assessment to a large extent, with a good correlation. Since the viva voce specifically assesses the higher domains of cognition and affect,[1],[3] scoring in viva voce is expected to be more difficult than in theory; viva scores should therefore track theory scores from below rather than from above (with the rider that theory is assessed as per standard norms).
It is commonly observed that TVVE is usually held toward the end of the practical examination, when the student is tired of practicals. It is taken casually and is marred by the high subjectivity of the examiners.[1],[5],[7],[8],[9] Examiners are confounded by their whims and fancies, their way of understanding the subject, their preferred content areas, and numerous other momentary environmental factors.[6],[7],[9] Moreover, the viva is oriented entirely toward theory, while the examinee's concentration is mainly on practicals-related theory. Ironically, in pharmacology, practicals replicate the skills evaluation for only a very limited segment of the theory course. The student's anxiety is primarily over the practicals, with a tendency to ignore the viva voce examination; students are in fact ill-prepared for a theory viva during practical examinations. The conventional viva voce is traditionally taken by separate examiners, depending on their availability on the day of examination, adding further variability to the assessment.[3] The ego of the examiner and the haphazard difficulty level and sequencing of the questions mean that, at times, the student comes out humiliated and dissatisfied.[7],[8] The viva is often frightening, intimidating, and threatening, and gives way to confrontation rather than discussion.[7],[8],[9] A student may know an answer yet be unable to give it for lack of recall time, or because of scolding during the viva voce examination, and is hence stressed.[9] Even examiners at times feel tired or uninterested, or are short of time to give the viva adequate importance, and they may at times prompt or help some candidates.[7],[8] Examiners may discriminate on the basis of ethnicity, minority status, economic status, or sex, or be influenced by the dress, personality, and verbal style of the candidate.[7] Further, the weightage for the viva is fixed at a bare minimum of ~10%, probably because of the inherent subjectivity of the assessment.
In the traditional oral examination, more emphasis is given to comparison between students than to individual achievement.[10]
At times, by not interrupting appropriately, the examiner leaves the student clueless as to whether he or she is navigating in the right direction, wasting the time of both. Thus, the whole purpose of the viva voce is defeated, and subjectivity arising from the examiner, the student, and other situational factors creeps heavily into the assessment.[5],[6] This highly biased state of the viva voce needs to be sorted out rationally, as the viva voce uniquely remains a strong pillar of assessment.[3],[9]
Thus, the primary fault of TVVE is unreliability[7],[11] due to confounding situational factors, the anxiety of the examinee, and the inconsistency of the examiners, and it needs revamping.[2],[9] To look for such discrepancies in assessment as presently conducted, we analyzed semester examination results and tried to make the viva a more viable, uniform, transparent, and effective tool of assessment through a structured viva voce examination (SVVE). The purpose was to translate this tool into a robust method of assessment in which the student is made more comfortable and his or her actual learning is assessed in the desired domains.
» Materials and Methods
The marks obtained in theory and viva voce (hereby designated TVVE) by II professional MBBS students who appeared in the I sessional examination, part of formative and summative assessment, were analyzed. The course content was general pharmacology, autonomic pharmacology, diuretics, and part of cardiovascular pharmacology (antianginal and antihypertensive agents). In TVVE, the students were assessed in viva voce as per the routine in use for years, without any precondition. Twenty marks (20% of the total) were allotted to viva voce in the I sessional examination, without any control over time. The percentage marks obtained in theory and viva voce were compared, and the disparities between the two tools of assessment were identified as described vide infra. Based on the huge disparity observed in TVVE, a more refined, scripted, and structured SVVE was then proposed and tried on the same students as an alternative, to minimize as far as possible the student–teacher, teacher–situation, teacher–teacher, and student–topic biases in viva voce.
SVVE was readministered on the same course content after notification, giving the students 2 weeks' time for preparation, after obtaining Institutional Ethics Committee approval, explaining the purpose in advance, and obtaining their verbal consent.[1] A total time of 10–12 min was fixed for the viva, spread over 2 days, with separate sets of question cards of similar format mutually agreed by all the faculty members [Table 1]. Different sets of question cards for SVVE were prepared from the same syllabus, each having eight questions, with each question split into two parts, the second being a leading question that explored the level of understanding shown in the first. The second, leading question was asked only if the student had answered the first correctly, but not vice versa. The questions were arranged in order of increasing difficulty, assessing the higher cognitive domains of learning. Every effort was made to frame the questions from the core content of the syllabus and to keep them clinically relevant [Table 2]. All efforts were made to avoid overlaps with the theory questions of the I sessional examination. The marks distribution was also built successively on the difficulty index, so that the initial five questions (5 × 2 = 10 marks) were labeled easy, two questions (2 × 3 = 6 marks) were moderate in difficulty, and one question (1 × 4 = 4 marks) was really difficult [Table 1]. Each student was asked to select a question card at random and answer the questions at one of the viva voce stations, each manned by a single assessor. The students were divided into batches and were not allowed to communicate with each other while undergoing this exercise. A total of 129 students of 2nd professional MBBS appeared in the I semester examinations (TVVE), while 107 students from the same batch appeared for SVVE.

Table 1: Framework of the SVVE card: distribution of 8 questions (20 marks) in two parts each, with a time duration of 10-12 minutes for each viva voce examination
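The marking framework of the card described above can be sketched as a quick consistency check. This is a hypothetical illustration only; the variable and function names are mine, not from the study:

```python
# Hypothetical sketch of the SVVE card framework (Table 1); names are
# illustrative. Each card holds 8 two-part questions in increasing
# difficulty: 5 easy (2 marks each), 2 moderate (3 marks), 1 difficult (4 marks).
CARD_FRAMEWORK = [("easy", 2)] * 5 + [("moderate", 3)] * 2 + [("difficult", 4)]

def total_marks(framework):
    """Sum the marks over all questions on a card."""
    return sum(marks for _difficulty, marks in framework)

print(len(CARD_FRAMEWORK), total_marks(CARD_FRAMEWORK))  # 8 questions, 20 marks
```

The check confirms that the stated split (5 × 2 + 2 × 3 + 1 × 4) does add up to the 20 marks allotted per card.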
Table 2: Distribution and categorization of appropriateness of the TVVE (n=128) and SVVE (n=107)
All the assessors agreed on the content, marking, and difficulty index of the question cards developed. An answer key was also prepared and agreed upon by all examiners (assistant professors and above). Care was taken to make the cards equitable in all respects, so that any card would yield a similar assessment of a student. The percentage theory and percentage viva voce marks were calculated, and the percentage variation in scoring between the two tools, keeping the theory marks of TVVE as the denominator (presumed to be more accurate), was then calculated for each student as follows:

Percentage variation = [(% theory marks − % viva voce marks) / % theory marks] × 100
Based on the percentage variation in assessment (theory vs. viva voce), all the eligible students were divided into six groups: Group I (+100 to +51); Group II (+50 to −50), further segregated into IIa (+50 to +6), IIb (+5 to −5), and IIc (−6 to −50); Group III (−51 to −100); Group IV (−101 to −150); Group V (−151 to −200); and Group VI (< −200) [Table 2].
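The grouping rule can be expressed as a short classifier. This is an illustrative sketch under stated assumptions: the function name is mine, and the handling of fractional values between the published integer endpoints is my assumption:

```python
def classify_variation(pct_variation):
    """Map a student's percentage variation (theory vs. viva voce) to
    Groups I-VI, using the boundaries given in the study. Treatment of
    fractional values between integer endpoints is an assumption."""
    if pct_variation > 50:
        return "I"    # +100 to +51: viva scored far below theory
    if pct_variation >= -50:
        return "II"   # +50 to -50: the "appropriate" range
    if pct_variation >= -100:
        return "III"  # -51 to -100: inappropriate
    if pct_variation >= -150:
        return "IV"   # -101 to -150: erroneous
    if pct_variation >= -200:
        return "V"    # -151 to -200: more erroneous
    return "VI"       # below -200: most erroneous

print(classify_variation(0), classify_variation(-120), classify_variation(-250))
# II IV VI
```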
Students were asked for feedback on the SVVE at the end, to know their views, using a simple criterion-based qualitative questionnaire agreed by all faculty members involved as assessors in the evaluation. This was an exploratory study rather than a hypothesis-driven one; hence, no analytical statistics were applied, except that a Chi-square test was performed on the percentage proportions of students showing disparity between SVVE and TVVE [Figure 1], with P < 0.05 considered statistically significant.

Figure 1: Bar diagram showing the distribution and disparity of SVVE (n=107) over TVVE (n=128) in different groups (*P<0.01 and **P<0.0001 versus corresponding TVVE readings)
» Results
In all, 129 students appeared in the I sessional (pharmacology) examination [Table 2]. One student did not appear for the practical examination, including the viva voce (TVVE). The theory and viva voce (TVVE) results of the remaining 128 students were analyzed. None was found in Group I; only 17.2% of students were in Group II (Group IIa, nil; Group IIb, 1.6%; Group IIc, 15.6%), 23.4% in Group III, 22.7% in Group IV, 11.7% in Group V, and 25% in Group VI. The TVVE results clearly showed that only 2 students were assessed most appropriately, with viva scores correlating with theory scores (Group IIb). Group II, where we assumed the assessment (±50% variability) likely to be appropriate to some extent, held only 17.2% of students. Further, 23.4% were assessed inappropriately and 59.4% erroneously (Groups IV, V, and VI). Thus, an erroneous negative trend in viva voce scoring versus theory was evident in 76 of 128 students (~60%). No student was awarded fewer marks in viva voce than in theory, as evident from the nil entry in Group I with TVVE. In total, a huge disparity was evident in the present TVVE, with more liberal scoring in viva voce.
On readministering the viva voce with prior notice (SVVE) on the same course content after a while, a total of 107 students voluntarily participated [Table 2]. Of these, 7.5% were found in Group I against nil in TVVE; 57.9% in Group II against 17.2%; 19.6% in Group III against 23.4%; 6.5% in Group IV against 22.7%; 5.6% in Group V against 11.7%; and 2.8% in Group VI against 25%, respectively. The results clearly showed that with SVVE, some positive values were also obtained in Group I (7.5%) against nil in TVVE. In total, 8 students were assessed most appropriately against 2 in TVVE. Appropriate assessment (per our assumption of ±50% variability) was obtained in 58% against 17.2% in TVVE. Erroneous scoring (Groups IV, V, and VI) was reduced to 16 of 107 students (~15%) against 76 of 128 (~60%) in TVVE.
As shown in the bar diagram [Figure 1], the data obtained by SVVE were distributed more uniformly around the central tendency of the expected appropriate readings (±50%) than the TVVE data, which were skewed away from the appropriate range toward the negative groups (plotted to the right). The TVVE had more readings on the negative side of the appropriate range, implying that on most occasions students obtained better marks in viva voce than in theory. Highly significant differences (P < 0.01) were observed for all comparisons except Group III, where the two were comparable.
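The reported Chi-square comparison can be approximately reproduced from the published percentages. A minimal sketch, assuming counts back-calculated from those percentages (22 of 128 TVVE students and 62 of 107 SVVE students in the "appropriate" Group II) and Pearson's chi-square without continuity correction:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (df = 1, no continuity correction)
    for the 2x2 contingency table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Counts reconstructed from the reported percentages (an assumption):
# Group II ("appropriate") vs. all other groups combined.
chi2 = chi_square_2x2(22, 106,   # TVVE: 17.2% of n = 128 appropriate
                      62, 45)    # SVVE: 57.9% of n = 107 appropriate
print(round(chi2, 1))  # well above 10.83, the df = 1 critical value for P = 0.001
```

On these reconstructed counts the statistic comes out around 42, consistent with the highly significant differences the study reports between the two formats.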
Feedback from Students
We also inquired into the students' feedback on SVVE. All 107 students who appeared for the SVVE provided written, anonymous feedback on the SVVE methodology immediately afterward. The details of the responses are presented in [Figure 2]. A majority of students felt better with SVVE in terms of appropriate time given for study (92%), appropriate time for the test (90%), effective coverage of the topic (71%), a good learning experience (87%), satisfaction with the successive levels of difficulty (93%), preference over TVVE (66%), and overall acceptance in terms of total score (83%).

Figure 2: Bar diagram of students' feedback on SVVE assessment (n=107) as yes or no answers on 6 questions. Overall effectiveness was rated by the mean of all the scores at the end
» Discussion
There appears to be variation in scoring in the traditional oral examination, which is subject to student, topic, teacher, and situational bias, as reported earlier.[1],[7] Very few studies in India have addressed the problems of the viva voce examination in the medical course prospectively[2],[8] or retrospectively.[3] One study reported that 79% of medical students and 70% of engineering students strongly believed that viva voce could be biased on various counts, i.e., teacher preference among students, the pattern of questions, and question difficulty that changes from student to student.[1] In fact, world-leading medical institutions in the United States dropped the routine oral viva voce long ago because of its low reliability and validity[7] and instead use it only exceptionally, for borderline and exceptionally good students.[3],[9]
The methodology used in our study was somewhat similar to a method used elsewhere.[5] Our TVVE results do not correlate at all with the theory assessment; rather, viva voce scoring was more exaggerated than theory for most of the students, as observed earlier,[2],[3],[9] although contrary findings have also been reported.[8] Hardly any student in TVVE got a lower percentage in viva than in theory. Almost 60% of students were marked erroneously.
SVVE corresponded more appropriately to theory scoring, and percentage scores were better distributed on either side of the theory scores. These findings strongly support the use of SVVE over TVVE as an assessment tool for medical undergraduates during their sessional examinations, as observed to some extent in earlier studies.[1],[5] With an element of positive skewness, a perfect bell-shaped curve was still not achieved with SVVE, implying further scope for refinement of the remaining subjectivity. Further, we found that only 1.6% of students were assessed most appropriately with TVVE, which increased to 7.5% with SVVE. On the other hand, the erroneously examined proportion of 60% with TVVE was reduced to merely 15% with SVVE, confirming an earlier report.[5]
There could be multiple advantages of SVVE, such as wide coverage, better expression, disinhibition, and reduced anxiety and shyness of the students, in contrast to TVVE, which is marred by high subjectivity, lack of format and uniformity,[2] and unreliability.[7],[11] It is known that performance in viva has an inverse relationship with anxiety,[12] which SVVE can dampen to a large extent. With SVVE, the student–teacher relationship also improves; the student gets multiple chances; student–teacher and student–topic biases are removed; and the chance and luck factors are minimized, giving more uniformity, transparency, and fairness to the examination and its results. SVVE goes beyond mere regurgitation of information and appropriately assesses the higher domains of learning for formative assessment.[1] Communication skill, which the viva voce examination tests very well, has recently been emphasized as a separate domain of learning, besides knowledge, skill, and attitude, in the MCI 2015 vision and its ATCOM module.[13] Further, SVVE appears less time-consuming and more precise, with better discriminatory power among students.[8],[9]
One drawback of TVVE is that it is "norm referenced": more emphasis is given to comparison between students than to individual achievement, and the marks awarded reflect only the general performance of the candidate without evaluating individual competencies.[7],[12] These drawbacks can to some extent be obviated by SVVE, as in our case. Our SVVE was based not only on recall but also on explanation, comprehension, correlation, analysis, interpretation, and application, built successively and sequentially, as also stressed in other studies.[10],[14] The students' feedback in our study likewise supported SVVE as a far better assessment methodology than TVVE. Overall, the students felt more comfortable, relaxed, and confident with SVVE. There is a possibility that SVVE predicts the style of learning rather than overall reproducible learning, as observed in the objective structured clinical examination;[15] however, in the present format this is minimized, and SVVE promotes learning. Concerns that SVVE compromises the student's freedom of expression and a holistic approach[8] were addressed in our study by asking quasi-open questions at the moderate to difficult levels. However, some unforeseen problems with SVVE, not obvious in this pilot study, cannot be ruled out at later stages of its application to university professional examinations. The greatest burden could be the time spent preparing equitable SVVE cards that are mutually agreed among the faculty members. Further, the question cards need to be reframed or modified specifically for each examination based on the specific theory paper questions, and the allotment of marks to individual questions may be difficult to standardize. All of this is possible with a committed faculty initiative and organized use of manpower.
» Conclusion
SVVE stands far better than TVVE, which looks more like a ritual to pass the student than an objective, valid, and reliable tool. The students' perception of SVVE was encouraging, to their utmost satisfaction. Much of the bias and subjectivity of the viva can be reduced by introducing SVVE, despite the limitations of time constraints, availability of faculty, and the initiative needed to bring about such changes.[9] It adds uniformity in all respects and assesses the higher domains of learning and communication. A robust beginning to the shift from TVVE to SVVE is the need of the hour, as breaking with tradition can be really difficult.[7] This pilot study may be replicated on a larger scale and in a variety of ways, as there is immense scope for improvement in viva voce examinations.
Financial Support and Sponsorship
Nil.
Conflicts of Interest
There are no conflicts of interest.
» References
1. Ray MK, Ray S, Ray U. Technology enabled assessment of viva voce: A new challenge. J Adv Res Biol Sci 2013;5:238-42.
2. Ghosh A, Mandal A, Das N, Tripathi SK, Biswas A, Bera T. Student's performance in written and viva-voce components of final summative pharmacology examination in MBBS curriculum: A critical insight. Indian J Pharmacol 2012;44:274-5.
3. Torke S, Abraham RR, Ramnarayan K, Asha K. The impact of viva-voce examination on student's performance in theory component of the final summative examination in physiology. J Physiol Pathophysiol 2010;1:10-2.
4.
5. Verma A, Mahajan N, Patel J. Evaluation and comparison of results: Conventional viva vs. structured viva. Glob Res Anal 2013;2:188-90.
6. Singel TC, Shah C, Dixit D. Small group structured oral examination: An innovation in oral examination. Natl J Integr Res Med 2014;5:141-4.
7. Davis MH, Karunathilake I. The place of the oral examination in today's assessment systems. Med Teach 2005;27:294-7.
8. Shaikh ST. Objective structured viva examination versus traditional viva examination of medical students. Anat Physiol 2015;5:175.
9. Haque M, Yousuf R, Abu Bakar SM, Salam A. Assessment in undergraduate medical education: Bangladesh perspectives. Bangladesh J Med Sci 2013;12:357-63.
10. Sandila MP, Ahad A, Khani ZK. An objective structured practical examination to test students in experimental physiology. J Pak Med Assoc 2001;51:207-10.
11. Muzzin LR, Hart L. Oral examinations. In: Neufeld VR, Norman GR, editors. Assessing Clinical Competence. New York: Springer Publishing Company; 1985. p. 71-93.
12. Holloway PJ, Hardwick JL, Morris J, Start KB. The validity of essays and viva voce examining techniques. Br Dent J 1967;123:227-32.
13.
14. Anastakis DJ, Cohen R, Reznick RK. The structured oral examination as a method for assessing surgical residents. Am J Surg 1991;162:67-70.
15. Martin IG, Stark P, Jolly B. Benefitting from clinical experience: The influence of learning style and clinical experience on performance in an undergraduate objective structured clinical examination. Med Educ 2000;34:530-4.
This article has been cited by:
1. Abdelhamid Ibrahim Hassan Abuzied, Wisal Omer Mohamed Nabag. Structured viva validity, reliability, and acceptability as an assessment tool in health professions education: a systematic review and meta-analysis. BMC Medical Education 2023;23(1).
2. Marya Ahsan, Ayaz Khurram Mallick. A study to assess the reliability of structured viva examination over traditional viva examination among 2nd-year pharmacology students. Journal of Datta Meghe Institute of Medical Sciences University 2022;17(3):589.
3. Sean R. Alcorn, Matthew J. Cheesman. Technology-assisted viva voce exams: A novel approach aimed at addressing student anxiety and assessor burden in oral assessment. Currents in Pharmacy Teaching and Learning 2022.
4. Lifeng Wang, Ahmad Taha Khalaf, Dongyu Lei, Mengke Gale, Jing Li, Ping Jiang, Jing Du, Xuehereti Yinayeti, Mayinuer Abudureheman, Yuanyuan Wei. Structured oral examination as an effective assessment tool in lab-based physiology learning sessions. Advances in Physiology Education 2020;44(3):453.