Research Matters 27


Recent Submissions

  • Item: Published version, Open Access
    Research Matters 27: Spring 2019
    (Research Division, Cambridge University Press & Assessment, 2019-03-01) Bramley, Tom
    Research Matters is a free biannual publication which allows Cambridge University Press & Assessment to share its assessment research, in a range of fields, with the wider assessment community. 
  • Item: Published version, Open Access
    What makes researchers anxious? It’s Time to Talk about talking about research
    (Research Division, Cambridge University Press & Assessment, 2019-03-01) Elliott, Gill; Suto, Irenka; Walland, Emma
    Increasingly, workplaces provide accessible places and events for employees to discuss mental health issues. In the course of discussions in our organisation, researchers raised the issue of workplace activities that cause them anxiety, with giving presentations topping the list. This article describes a workshop held with researchers at Cambridge Assessment to encourage the discussion to develop more widely. We reflect upon themes about workplace anxiety identified by the workshop participants, relate these themes to the literature, and consider practical techniques for reducing the impact of anxiety. Our aim is that this study will help other researchers become more open in discussing the aspects of their role that cause anxiety. In turn, this openness will facilitate finding ameliorative solutions to the issues raised, to the mutual benefit of individuals and organisations.
  • Item: Published version, Open Access
    The art of test construction: Can you make a good Physics exam by selecting questions from a bank?
    (Research Division, Cambridge University Press & Assessment, 2019-03-01) Bramley, Tom; Crisp, Vicki; Shaw, Stuart
    In the traditional approach to constructing a GCSE or A Level examination paper, a single person writes the whole paper. In some other contexts, tests are constructed by selecting questions from a bank of questions. In this research, we asked experts to evaluate the quality of Physics exam papers constructed in the traditional way, constructed by expert selection of items from a bank, and constructed by computer selection of items from a bank. Anecdotal evidence suggested a "compilation" process would be detrimental to the quality of this kind of exam. We wanted to test whether in fact assessment experts could distinguish between tests that had been created in the traditional way, and those that had been compiled by selection from a bank, when they were unaware of the method of construction.
  • Item: Published version, Open Access
    Moderating artwork: Investigating judgements and cognitive processes
    (Research Division, Cambridge University Press & Assessment, 2019-03-01) Chambers, Lucy; Williamson, Joanna; Child, Simon
    In this article, we explore the cognitive processes and resources drawn upon when moderating artwork. The cognitive processes involved in the external moderation of non-exam assessments have received little attention; the few research studies that exist investigated moderation where the candidates' submissions were mostly in written form. No studies were found which explicitly looked at a non-written submission, such as artwork. In this small-scale study, participating moderators were asked to "think aloud" whilst moderating candidates' Art and Design submissions. An analysis of the resulting verbal protocol and observational data enabled timelines of moderator activity to be produced. From these, a process map containing moderation stages, activities, cognitive processes, and resource use was developed.
  • Item: Published version, Open Access
    Indirect assessment of practical science skills: Development and application of a taxonomy of written questions about practical work
    (Research Division, Cambridge University Press & Assessment, 2019-03-01) Wilson, Frances; Shaw, Stuart; Wade, Neil; Hughes, Sarah; Mattey, Sarah
    Practical work is central to science education and is used not only to support the development of conceptual knowledge, but also to enable students to develop a wide range of skills, including data handling, experimental design and equipment manipulation. Practical work may be assessed using many different forms of assessment, including coursework projects, practical exams, and written questions in exams. Different forms of assessment may assess different aspects of this complex domain, so it is important to establish a clear understanding of the skills and knowledge assessed by each form. This article first describes the development of a taxonomy of practical science skills for written questions about practical work, and then explores its application in evaluating current science qualifications. Such a taxonomy has the potential to allow awarding organisations to evaluate and monitor practical science questions, as it allows comparisons of the skills assessed over time, between papers, and between subjects. Cambridge Assessment International Education assessments include written practical science questions (as an alternative to practical exams), so an evaluation of how the skills assessed have changed over time, and how they vary between subjects, would provide additional information on the performance of the assessments. OCR assessments of written practical science were introduced in 2016 as a result of reforms to the qualifications. Evaluating whether these assessments are similar to the Sample Assessment Materials, and how subjects compare to one another in terms of the skills assessed, would therefore help meet the regulatory requirement to evaluate and monitor the new assessments.
  • Item: Published version, Open Access
    Data, data everywhere? Opportunities and challenges in a data-rich world
    (Research Division, Cambridge University Press & Assessment, 2019-03-01) Raikes, Nick
    Data has always been central to large-scale educational assessments, such as those provided by Cambridge Assessment, but digital learning and assessment products vastly increase the amount, types, and immediacy of the data available. Advances in data science and technology have opened up this data for analysis as never before, and "big data" is much hyped by both those who fear it and those who welcome it. But, as yet, there has been no big data revolution in education. In this article, I briefly outline current uses of data analytics at Cambridge Assessment, and how big data extends them. I explore some likely future uses and benefits of big data in relation to formative assessment and individualised recommendations for learners, together with the risks of statistical naivety and data misuse, and the challenge of gaining the trust of students, parents and teachers.