Item level examiner agreement

Published version
Peer-reviewed

Authors

Raikes, Nick 
Massey, Alf 

Abstract

Studies of inter-examiner reliability in GCSE and A-level examinations have been reported in the literature, but these have typically focused on paper totals rather than item marks; see, for example, Newton (1996). Advances in technology, however, mean that candidates' scripts are increasingly being split by item for marking, and the item-level marks are routinely collected. In these circumstances there is growing interest in the extent to which different examiners agree at item level, and in how this varies with the nature of the item. Here we report and comment on intraclass correlations between examiners marking sample items taken from GCE A-level and IGCSE examinations in a range of subjects. The article is based on a paper presented at the 2006 Annual Conference of the British Educational Research Association.
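
For readers less familiar with the statistic, the following minimal sketch shows one way an item-level intraclass correlation could be computed from a matrix of marks. It assumes a one-way random-effects model (ICC(1,1)) and uses invented example data; the abstract does not say which ICC variant or software the authors actually used.

import numpy as np

def icc_one_way(marks: np.ndarray) -> float:
    """One-way random-effects intraclass correlation, ICC(1,1).

    marks: 2-D array of shape (n_candidates, n_examiners); each row holds
    the marks awarded to one candidate's item response by each examiner.
    (Illustrative only; not necessarily the variant used in the article.)
    """
    n, k = marks.shape
    grand_mean = marks.mean()
    row_means = marks.mean(axis=1)

    # Between-candidate and within-candidate mean squares (one-way ANOVA).
    ms_between = k * np.sum((row_means - grand_mean) ** 2) / (n - 1)
    ms_within = np.sum((marks - row_means[:, None]) ** 2) / (n * (k - 1))

    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical example: five candidate responses to one item, each
# independently marked by three examiners.
marks = np.array([[4, 4, 5],
                  [2, 3, 2],
                  [5, 5, 5],
                  [1, 1, 2],
                  [3, 4, 3]])
print(round(icc_one_way(marks), 3))

Values close to 1 indicate that examiners score the item responses very similarly; lower values indicate weaker agreement.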

Keywords

Marking, Examination statistics

Journal Title

Research Matters

Publisher

Research Division, Cambridge University Press & Assessment
