Research Matters 38
Recent Submissions
Does typing or handwriting exam responses make any difference? Evidence from the literature
Lestari, Santi. Research Division, Cambridge University Press & Assessment, 2024-10-18. Metadata only; Published version; Peer-reviewed.
Despite the increasing ubiquity of computer-based tests, many general qualifications examinations remain paper-based. Insufficient and unequal digital provision across schools is often identified as a major barrier to full adoption of computer-based exams for general qualifications. One way to overcome this barrier is gradual adoption, with paper-based and computer-based exams running in parallel. When an exam is offered in both modes, and results from both are treated as equivalent, the comparability between modes needs to be ascertained. This includes examining whether the mode in which students respond to extended writing questions such as essays, either by handwriting or by typing on a computer, introduces systematic differences. This article presents findings from a review of the existing literature on writing mode effects. Specifically, it discusses findings on four aspects of comparability: scores, marking, text characteristics and composing processes. It also offers recommendations for practice.

Comparing music recordings using Pairwise Comparative Judgement: Exploring the judge experience
Chambers, Lucy; Walland, Emma; Ireland, Jo; Crisp, Victoria. Research Division, Cambridge University Press & Assessment, 2024-10-18. Metadata only; Published version; Peer-reviewed.
Comparative Judgement (CJ) is traditionally and primarily used to compare written texts. In this study we explored whether we could extend its use to comparing audio files. We used GCSE Music portfolios which contained a mix of audio recordings, musical scores and text documents. Fifteen judges completed two exercises: one comparing musical compositions and one comparing musical performances. For each exercise, each judge compared 80 pairs of portfolios. Once judges had finished both exercises, they completed a questionnaire about their views and experiences of the method. Here, we present the judges’ perceptions of using CJ in this context with reference to the ‘Dimensions of judge decision-making’ model (Leech and Chambers, 2022). We also compare the findings to those from text-based CJ studies.
Leech, T., & Chambers, L. (2022). How do judges in Comparative Judgement exercises make their judgements? Research Matters: A Cambridge University Press & Assessment publication, 33, 31–47.
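The abstract above does not say how the pairwise judgements were scaled. CJ data are, however, commonly analysed with a Bradley-Terry model, in which each portfolio receives a latent quality parameter and the probability of winning a comparison depends only on the difference between the two parameters. A minimal sketch, offered as a generic illustration rather than the method used in this particular study:

% Bradley-Terry scaling commonly used for Comparative Judgement outcomes
% (illustrative assumption; the study does not state its scaling model).
% \theta_A, \theta_B are the latent quality parameters of portfolios A and B.
\[
  P(\text{A beats B}) = \frac{e^{\theta_A}}{e^{\theta_A} + e^{\theta_B}}
                      = \frac{1}{1 + e^{-(\theta_A - \theta_B)}}
\]

Fitting such a model to all judges' pairwise outcomes yields a single quality scale on which the portfolios can be ranked.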
How long should a high stakes test be?
Benton, Tom. Research Division, Cambridge University Press & Assessment, 2024-10-18. Metadata only; Published version; Peer-reviewed.
This article discusses one of the most obvious questions in assessment design: if a test has a high stakes purpose, how long should it be? Firstly, we explore this question from a psychometric point of view, starting from the (range of) minimum test reliability levels suggested in the academic literature. Then, by using published data on the typical relationship between the length, duration and reliability of exams, we develop a range of recommendations about the likely required duration of assessment. Secondly, to force deeper reflection on the results from the psychometric approach, we also compare the actual lengths of exams in England to those in other education systems around the world. Such comparisons reveal very wide variations in the amount of time young people are required to spend taking exams in different countries and at various ages. The article concludes with some reflections on how the length of exams relates to the purpose of the assessment or to how its results will be used.
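A standard way to express the relationship between test length and reliability mentioned in the abstract above is the Spearman-Brown prophecy formula; it is included here as a generic illustration, not as Benton's actual calculation:

% Spearman-Brown prophecy formula: predicted reliability \rho_k of a test whose
% length is changed by a factor k, given current reliability \rho.
\[
  \rho_k = \frac{k\,\rho}{1 + (k - 1)\,\rho}
\]

For example, if a one-hour exam has reliability 0.80, reaching 0.90 requires k = (0.90 x 0.20)/(0.80 x 0.10) = 2.25, i.e., roughly 2 hours 15 minutes, assuming duration scales in proportion to the number of items.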
Research Matters 38: Autumn 2024
Crisp, Victoria. Research Division, Cambridge University Press & Assessment, 2024-10-18. Metadata only; Published version; Peer-reviewed.
Research Matters is a free biannual publication which allows Cambridge University Press & Assessment to share its assessment research, in a range of fields, with the wider assessment community.

Troubleshooting in emergency education settings: What types of strategies did schools employ during the COVID-19 pandemic and what can they tell us about schools’ adaptability, values and crisis-readiness?
Constantinou, Filio. Research Division, Cambridge University Press & Assessment, 2024-10-18. Metadata only; Published version; Peer-reviewed.
With crises such as epidemics, wars, wildfires, earthquakes, hurricanes and snow days becoming increasingly common in various parts of the world, it is crucial that schools become crisis-ready. Crisis-readiness lies partly in the ability of schools to deliver “emergency education” (i.e., education in crisis situations) promptly and effectively. To support the delivery of emergency education, this study sought to document and examine the strategies employed by schools during a crisis, specifically the COVID-19 pandemic. Through analysing data collected from interviews with teachers based in different parts of Europe, the study identified a series of micro-level strategies used by schools to address the challenges posed by the pandemic. These micro-level strategies were subsequently analysed to develop a typology of overarching mechanisms, or macro-level strategies. As discussed in the article, apart from providing a useful starting point for any teachers required to deliver emergency education in the future, these emergency strategies also offer valuable insights into schools’ adaptability, values and crisis-readiness. As such, they could prove very informative for both educational policy and practice.

Core Maths: who takes it, what do they take it with, and does it improve performance in other subjects?
Gill, Tim. Research Division, Cambridge University Press & Assessment, 2024-10-18. Metadata only; Published version; Peer-reviewed.
Core Maths qualifications were introduced into the post-16 curriculum in England in 2014 to help students develop their quantitative and problem-solving skills. Taking the qualification should also give students confidence in understanding the mathematical content in other courses taken at the same time. In this article, we explore whether Core Maths is fulfilling its aims. In particular: does Core Maths provide students with a benefit (in terms of attainment) in other, quantitative, Key Stage 5 subjects (e.g., A Level Psychology, BTEC Engineering)? We also investigate some aspects of the uptake of Core Maths: what are the background characteristics of Core Maths students (e.g., gender, prior attainment, ethnicity)? Which other qualifications (e.g., A Levels, BTECs, Cambridge Technicals) and subjects are students most likely to take alongside Core Maths? The main finding was that students taking Core Maths had a slightly higher probability (than those not taking Core Maths) of achieving good grades in some subjects taken concurrently. Uptake of Core Maths remains relatively low, so there is certainly scope for greater numbers of students to take advantage of the potential benefits of studying the qualification.