
Learners’ annotations and written markings when taking a digital multiple-choice test: What support is needed?

Published version
Peer-reviewed

Abstract

This research set out to enhance our understanding of the exam techniques and types of written annotations or markings that learners may wish to use to support their thinking when taking digital multiple-choice exams. Additionally, we aimed to explore further the factors that contribute to learners writing less rough work and fewer markings on scrap paper during a digital test than they write on paper-based tests, as observed in prior research. In this research, 52 learners attempted a digital economics test with access to either scrap paper or a print of the test. Some learners were observed in order to capture their interactions with the paper materials, all learners completed a questionnaire, and most learners were interviewed. The evidence collected provides insights into the types of annotations and written markings learners wished to use. Considerable variation was found in whether, and to what extent, learners used paper materials. Scrap paper worked fairly well for some types of annotations or written markings, but not for others. The findings are informing further developments to testing platform functionality.

Journal Title

Research Matters

Publisher

Research Division, Cambridge University Press & Assessment

Rights and licensing

Except where otherwise noted, this item's license is described as All rights reserved.