Using very short answer errors to guide teaching.
Published version
Peer-reviewed
Abstract
Background: Student performance in examinations reflects on both teaching and student learning. Very short answer questions require students to provide a self-generated response of between one and five words, which removes the cueing effects of single-best-answer examinations while still enabling efficient machine marking. The aim of this study was to pilot a method of analysing student errors in an applied knowledge test consisting of very short answer questions, enabling identification of common problem areas that could guide future teaching.

Methods: We analysed the incorrect answers given by 1417 students from 20 UK medical schools in a formative very short answer question assessment delivered online.

Findings: The analysis identified four predominant types of error: inability to identify the most important abnormal value, over-investigation or unnecessary investigation, lack of specificity in radiology requests, and over-reliance on trigger words.

Conclusions: We provide evidence that an additional benefit of the very short answer question format is that analysis of errors is possible. Further assessment is required to determine whether altering teaching based on this error analysis leads to improvements in student performance.
Description
Funder: National Institute for Health Research (NIHR) Applied Research Collaboration (ARC) West Midlands
Funder: National Institute for Health Research Cambridge Biomedical Research Centre; Id: http://dx.doi.org/10.13039/501100018956
Funder: Medical Schools Council Assessment Alliance
Journal ISSN: 1743-498X