Comparing single-best-answer and very-short-answer questions for the assessment of applied medical knowledge in 20 UK medical schools: Cross-sectional study.

Published version
Peer-reviewed

Type

Article

Authors

Westacott, Rachel 
Gurnell, Mark 
Wilson, Rebecca 
Meeran, Karim 

Abstract

OBJECTIVES: The study aimed to compare candidate performance between traditional best-of-five single-best-answer (SBA) questions and very-short-answer (VSA) questions, in which candidates must generate their own answers of between one and five words. The primary objective was to determine whether the mean positive cue rate for SBAs exceeded the null hypothesis guessing rate of 20%.

DESIGN: Cross-sectional study undertaken in 2018.

SETTING: 20 medical schools in the UK.

PARTICIPANTS: 1417 volunteer medical students preparing for their final undergraduate medicine examinations (total eligible population across all UK medical schools approximately 7500).

INTERVENTIONS: Students completed a 50-question VSA test, followed immediately by the same test in SBA format, using a novel digital exam delivery platform that also facilitated rapid marking of VSAs.

MAIN OUTCOME MEASURES: The main outcome measure was the mean positive cue rate across SBAs: the percentage of students answering the SBA format of a question correctly after answering the VSA format incorrectly. Internal consistency, item discrimination and the pass rate using Cohen standard setting were also evaluated for both formats, and a cost analysis of marking the VSAs was performed.

RESULTS: 1417 students completed the study. Mean student scores were 21 percentage points higher for SBAs. The mean positive cue rate was 42.7% (95% CI 36.8% to 48.6%); a one-sample t-test against ≤20% gave t=7.53, p<0.001. Internal consistency was higher for VSAs than for SBAs, and the median item discrimination was equivalent. The estimated marking cost was £2655 ($3500), requiring 24.5 hours of clinician time (1.25 s per student per question).

CONCLUSIONS: SBA questions can give a false impression of students' competence. VSAs appear to have greater authenticity and can provide useful information about students' cognitive errors, helping to improve learning as well as assessment. Electronic delivery and marking of VSAs is feasible and cost-effective.
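
The headline statistic is straightforward to reproduce. The following is a minimal sketch, not the authors' code: it assumes per-item marks for both formats are available as boolean matrices (students × questions), fills them with placeholder random data, and computes the per-question positive cue rate, the one-sample t-test against the 20% chance level, and the marking-time arithmetic quoted in the abstract.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_students, n_questions = 1417, 50  # cohort and paper size from the study

# Placeholder marks; in practice these would come from the exam platform.
vsa_correct = rng.random((n_students, n_questions)) < 0.35
sba_correct = vsa_correct | (rng.random((n_students, n_questions)) < 0.40)

# Positive cue rate per question: among students who answered the VSA
# incorrectly, the proportion who answered the SBA version correctly.
vsa_wrong = ~vsa_correct
cue_rate = (sba_correct & vsa_wrong).sum(axis=0) / vsa_wrong.sum(axis=0)

# One-sample t-test of the per-question cue rates against the 20% rate
# expected from blind guessing on best-of-five questions.
t, p = stats.ttest_1samp(cue_rate, 0.20, alternative="greater")
print(f"mean positive cue rate = {cue_rate.mean():.1%}, t = {t:.2f}, p = {p:.3g}")

# Marking-cost arithmetic from the abstract: 1.25 s per student per question.
hours = n_students * n_questions * 1.25 / 3600
print(f"estimated marking time ≈ {hours:.1f} clinician-hours")  # abstract reports 24.5 h
```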

Keywords

Assessment, MEDICAL EDUCATION & TRAINING, applied medical knowledge, Academic Performance, Clinical Competence, Cross-Sectional Studies, Decision Making, Education, Medical, Undergraduate, Educational Measurement, Humans, Knowledge, Learning, Schools, Medical, Students, Medical, Surveys and Questionnaires, United Kingdom

Journal Title

BMJ Open

Journal ISSN

2044-6055

Volume Title

9

Publisher

BMJ