Summaries of the winning projects from the 2020 Awards
Best Research 2020
University of Reading. Professor Emma Mayhew, EMA Programme
Adoption of online submission and feedback is increasing significantly across the higher education sector. The majority of institutions in the UK have now identified themselves as moving away from pockets of disparate use towards embedding institution-wide online assessment practices. Providers are driven by a range of benefits for staff, students and the broader institution. Research has started to explore the impact of change, but there has been very little sector-wide analysis exploring the challenges faced by institutions moving to adopt online submission and feedback.
This research explores a range of barriers identified by providers which have the potential to prevent, delay or reduce the benefits of undertaking change. It outlines the results of an extensive literature review which identifies challenges surrounding change design, stakeholder management, policy, process and technical integration, but also finds that a number of institutions have been able to address some of these concerns, leaving them in sector-leading positions.
This research is designed to make institutions that are planning or currently undertaking institution-wide change aware of these barriers, and of the experience of others, in order to inform their own good practice, policy and pedagogy.
Best Transformational Project 2020
University of Reading. EMA Programme
The EMA Programme at Reading has undertaken an extensive range of transformational work to realise a University ambition to move towards large scale adoption of online assessment.
Since 2016, the Programme team have designed a non-directive, staged transition, sensitive to disciplinary differences, supported by 500+ engagement and communication sessions encouraging meaningful stakeholder involvement. We completed a complicated piece of work to enable tens of thousands of individual marks to be recorded on new marks screens in our student records system, significantly expanding the use of data to support students and plan assessment. Drawing on this, we undertook a complex project to develop and launch a sector-leading Student Progress Dashboard in November 2019. It presents a series of screens displaying core progress data in a range of graphics, to encourage student self-reflection, planning and enhanced staff-student support conversations.
We have undertaken innovative and award-winning work to integrate our Virtual Learning Environment (Blackboard) and student records system (SITS), reducing workload. We have developed a wide range of written and video materials ordered around an online map, advised senior leadership in other institutions, run a national EMA conference and published widely to help support transition across the sector.
Best Use of Formative Assessment 2020
AlphaPlus Consultancy Ltd. National Online Personalised Assessments in reading and numeracy on behalf of the Welsh Government
In Wales, all children aged 7-14 take national assessments in procedural numeracy (number, measuring, data skills), reading and numerical reasoning (solving problems). These assessments are designed to identify areas where children and teachers need to focus their efforts in order to make progress. Previously, these assessments were taken on paper during the summer term. However, the Welsh Government decided to move the assessments on-screen and on-demand from 2018.
AlphaPlus is leading the multi-partner project team, developing assessment infrastructure and content for the new ‘personalised assessments’, while also delivering the legacy on-paper assessments.
To date, over 370,000 on-screen assessments have been delivered in every maintained primary and secondary school in Wales. The assessments are highly innovative:
- adaptive - they respond to the capabilities of learners
- accessible - every child takes the same test, all on-screen, with no paper alternatives
- calibrated - teachers get nationally-comparable, trustworthy results
- easy to administer
Feedback is timely and focused, without any marking or analysis by teachers.
The project represents a very substantial achievement: on-screen rollout of a statutory national assessment programme which meets the needs of teachers and students, and delivers real benefits to teaching and learning.
The judging panel reported:
"From the outset, the way in which the programme of work met the criteria was explicitly stated. The submission and subsequent presentation were well-structured, clear and concise. The evidence provided to support the detail in the submission was relevant and extensive.
The approach to ensuring the reliability and validity of the assessments was extensive. The judges particularly liked the existence of, and reference to, the expert advisory group.
The extensive nature of the programme, its delivery to time and budget and the successful managing of multiple stakeholders was impressive.
As well as providing a national means of providing formative feedback in a way that met schools’ needs, the way in which the programme developed teachers’ assessment literacy was an interesting by-product.
The scale of the achievement impressed the judges particularly in a highly visible and risk-averse context of national policy and its implementation."
Best Use of Summative Assessment 2020
Newcastle University with Newcastle University Digital Exam Service
The Newcastle University Digital Exam Service has grown in scale to be one of the leading providers of online summative assessment in the UK HE sector.
The service is a shining example of successful collaboration between many key stakeholders across the organisation. Aligning closely with institutional strategy, the service provides a secure, efficient and reliable method of delivering high-stakes summative exams to a large number of students every year.
The service helps the University with its commitment to lowering its environmental impact by greatly reducing the amount of paper used during the exam period. Exam administration and moderation processes are streamlined, and automated marking and feedback deliver efficiencies and time savings for academics.
Student feedback regarding their experience of our online exams is extremely positive, and this is being improved further with our project to expand and diversify the online exam provision. The service is regularly evaluated and improved upon. This process has helped to increase its popularity across the University, as staff view the service extremely positively.
The judging panel reported:
"Newcastle University was a very well presented project which clearly met a defined need and worked across different activity streams within the university to deliver summative assessment at scale. The programme is well established and the data and student testimonials indicated its high level of success since its inception. Plans to expand and cater for more exam types and the work done on accessibility were particularly noteworthy."
Excellence in Export 2020
PECS Data Services
Since 2010, the Western Cape Education Department has been the only provincial education department in South Africa to carry out systemic testing for all grade 3, 6 and 9 learners. Throughout that time PECS Data Services, a UK-based BPO company specialising in exam processing, has been the only ever-present delivery partner, providing software to manage the tracking of the tests and the marking, together with the scanning, data capture and processing of all the scripts.
Most recently (2019), PECS again successfully delivered the project with their current partners, DarkData and The University of Cape Town in record time.
Almost 600,000 scripts were scanned, marked and processed in less than 6 weeks, with PECS utilising their UK, Mauritian and South African presence, making the delivery of this high stakes project truly international.
ACTNext by ACT, Inc. with an Innovative Tool for the Measurement of a 21st Century Competence - Collaborative Problem Solving (CPS)
Collaborative problem solving (CPS) is a prominently important 21st century skill, widely considered to be a core competency in the workforce and a key component of several standards for education.
As pointed out by researchers specializing in educational assessment and 21st century skills, the field currently suffers from a scarcity of viable and robust tools for assessing these competencies, especially in K-12 learning contexts. In fact, the challenge of building valid assessments for CPS, cognitive and social-emotional learning (SEL) skills is two-fold. First, there is a lack of consensus on what the CPS competency model looks like and exactly which behaviors constitute evidence for CPS skills. Second, these behaviors, which are the primary evidence of collaboration, can be difficult to identify in any context, much less in ecologically valid collaborative experiences.
To overcome these challenges, we developed an innovative assessment tool, "CPSX - Crisis in Space", for measuring students' collaborative, cognitive, and SEL skills. Drawing on a new area, Computational Psychometrics, we apply Artificial Intelligence and Machine Learning algorithms to data from rich, immersive interactions in a multitude of sensory modalities (multimodal data), providing a new generation of assessments of 21st century competencies such as CPS.
Lifetime Contribution 2020
Professor Diana Laurillard
Professor Diana Laurillard is currently Chair of Learning with Digital Technologies at UCL Knowledge Lab (2005–present). Previous appointments include Head of the e-Learning Strategy Unit in the UK government’s Department for Education and Skills (2002–05), and Pro-Vice-Chancellor (Learning Technologies and Teaching) at the Open University (1998–2002). She is a globally renowned speaker on digital education and digital methods of assessment for learning.
Her most recent book is Teaching as a Design Science (2012). Her previous book, Rethinking University Teaching (2002), is one of the most widely cited in the field. The conversational framework developed in this book has been impactful for evaluating e-assessment and assessment at scale via peer review, and continues to inform learning design in the higher education and educational technology sectors.
She is currently leading research projects on developing the Learning Designer suite of tools and online community for teachers and trainers; adaptive games apps for learners with low numeracy and dyscalculia; and the use of MOOCs for professional development courses, and as a research tool. Her work has been of particular relevance to scholarly communities engaged in designing learning and teaching at scale.
Laurillard’s work on pedagogy, namely the conversational framework, helped inform the design of community-supported learning on FutureLearn, a website launched in 2013 by the Open University as the UK’s first MOOC platform. This model addresses the challenge of delivering learning and assessment at scale by harnessing the power of the community of learners, encouraging conversation throughout each course and allowing learners to share knowledge, experience and skills. Today, FutureLearn hosts nearly 9 million learners and is the leading social learning platform for MOOCs and online degrees.
Laurillard is currently FutureLearn's Academic Advisor, overseeing data-driven research by Ph.D. students and helping to inform new product developments and pedagogical tools. She is also lead educator on the University of Leeds' Blended Learning Essentials suite of courses on FutureLearn, which has attracted more than 60,000 learners to date.
Her career honours include Honorary Life Membership of the Association for Learning Technology, Fellowship of the Royal Society of Arts, and honorary doctorates from Edinburgh Napier University, the University of Brighton, the Open University of the Netherlands, and the University of Abertay. She was also a member of the Governing Board of the UNESCO Institute for IT in Education from 2003 to 2013.