Summaries of the winning projects from the 2019 Awards
Best Research 2019
A comparative study of online remote proctored versus onsite proctored high-stakes exams
Advances in technology have spurred innovations in secure assessment delivery. One such innovation, remote online proctoring, has become increasingly sophisticated and is gaining wider consideration for high-stakes testing, yet many organisations remain unconvinced of its comparability to onsite proctoring. Moreover, there is an absence of published research examining remote online proctoring and its effects on test scores and the examinee experience.
This paper describes a quasi-experimental field study of three professional licensing examinations administered concurrently at different test sites, which offered either onsite proctoring in testing centres or remote online proctoring in computer kiosks, where testing was supervised via Internet-connected video communication and surveillance.
Results using both classical test theory and item response theory methods revealed substantial reliability and a strong degree of measurement equivalence across proctoring conditions. Candidates reported slightly less positive reactions to some of the remote proctored testing conditions, but reactions were positive overall and had virtually no relation to test performance. Overall, the results of this study support the equivalence of kiosk-based remote online proctored exams and exams proctored onsite in test centres.
Best Transformational Project 2019
MyKnowledgeMap and Anglia Ruskin University
Anglia Ruskin University is the largest provider of nursing education in the East of England. With over 2,000 pre-registration student nurses and 5,000 mentors, the university has embarked on a complete transformation journey, innovating its approach to assessing students on placement, moving from paper to mobile, and dramatically enhancing the student and mentor experience.
With student nurses dispersed across 200 placement areas and three campuses, supporting students on placement with a paper-based assessment portfolio posed many challenges for the university. Students lacked the support they needed, and Anglia Ruskin lacked insight into student progress, gaining it only when mentors were able to visit students on placement. After a rigorous tender process, Anglia Ruskin University adopted Myprogress, an offline mobile assessment tool for assessing students on placement, and over a three-year period rolled out the technology across all three campuses.
Since adopting Myprogress, Anglia Ruskin has transformed the student and mentor experience, dramatically improved engagement, and strengthened student feedback and feed-forward. In this time, the university has also reduced student attrition in its nursing programmes delivered in Cambridge from over 20% to 2.5%, a change to which the project has positively contributed.
Best Use of Formative Assessment 2019
Vretta with IntroMath
High first-year student drop-out rates have been an ongoing challenge for colleges across Canada. In 2004, the College Math Project was initiated by all colleges in Ontario to identify ways to improve student retention. After 10 years of research, the results revealed that over one-third of first-year students were at risk of not completing their programme due to a lack of foundational maths skills. This led to a large gap in equity and attainment in mathematics, resulting in high drop-out rates.
Given this evidence, Vretta partnered with colleges to develop a unique assessment-for-learning platform called ‘IntroMath’ to transform the way students were being assessed. IntroMath initiates the student assessment through an upgrading assignment, which ensures that students have the necessary prerequisite skills. This is followed by weekly cycles of interactive lessons and assessments to help them visualize, conceptualize, and engage with mathematics throughout the semester.
IntroMath has been shown to reduce drop-out rates from 30% to below 10%. Numerous case studies have demonstrated that students of all ages and accommodation needs have benefitted immensely from IntroMath. Currently, IntroMath has been integrated into a wide variety of maths courses and is transforming the student e-assessment experience at over 70% of Canadian colleges.
Best Use of Summative Assessment 2019
Qpercom with Qpercom Observe
We would like to nominate our digital clinical assessment solution, Qpercom Observe, for the Summative Assessment award. Qpercom spun out from the College of Medicine, Nursing and Health Sciences at NUIG (National University of Ireland, Galway) in 2008. Qpercom provides advanced assessment solutions to over 25 institutions worldwide, including Dundee University and the National University of Singapore. Our company mission is to advance global standards of assessment. Our solutions are developed from in-house research and development and from client collaborations with higher education institutions.

Our core product, Observe, was developed to automate clinical assessments in medical education, removing paper and raising quality standards. Every year, thousands of students perform OSCEs (Objective Structured Clinical Examinations) as part of their end-of-year exams. In one research study, an administrative error rate of 30% was discovered in exam correction. Automating exam delivery and storing data in the cloud resulted in higher quality standards and zero correction time. More significantly, it facilitated psychometric analysis of exam data, such as results, examiner participation, curriculum material, assessment forms, and overall exam performance. Observe is now used by healthcare education institutions worldwide to deliver high-stakes clinical assessments.
Excellence in Export 2019
e-marking to the Caribbean
The Caribbean Examinations Council (CXC) is a regional examining body that awards final certificates, applying and verifying a common standard across 19 islands in the Caribbean.
Over time, CXC had found that its model of examiners travelling to central marking centres on four islands to mark exam scripts was becoming unsustainable. The ever-increasing costs of transporting and accommodating examiners travelling between islands had introduced a significant risk to CXC’s ability to administer examinations.
RM Results, a UK-based company specialising in providing e-assessment solutions to awarding organisations, met with CXC at the International Association for Educational Assessment (IAEA) conference in the Philippines in 2011.
RM Results introduced CXC to electronic marking (e-marking) software, in which paper-based examination scripts are marked on screen, as an enabler of improved efficiency and accuracy in the marking process.
International Well Control Forum (IWCF)
Easy as Pi, with a FORUM HUB
IWCF launched online assessments in December 2016 to its diverse membership base. With over 260 accredited training centres across the globe, centres in remote locations had difficulty transitioning to online assessments due to connectivity issues, meaning they could not experience the benefits on offer, including instant marking and results, cost reductions, instant resits, and increased flexibility.
To move to a truly paperless environment and improve the quality and delivery of assessments on a global scale, IWCF and eCom Scotland worked together to develop the HUB solution, giving training centres with connectivity issues the ‘online’ experience. The HUB was designed so that centres could be issued with a pre-programmed Raspberry Pi device that can be plugged in to download and upload assessments via a secure online link for use at their centres. Assessments are then automatically marked, with results issued to candidates once the scripts are uploaded back into the centre FORUM database.
Lifetime Contribution 2019
Eric Shepherd has been an effective supporter of assessment community initiatives promoting interoperability and good practice in assessment for 20 years.
Eric was part of the IMS Technical Advisory Board from 1999 to 2004 and, from 1997 to 2007, led the international IMS team in developing and representing the IMS QTI interoperability standard. This was the first-ever standard for exchanging questions between vendors and organizations.
Eric was the first Chairman of the European division of the Association of Test Publishers (ATP), from December 2006 to January 2008, and was on the ATP Board from 2009 to 2012. Eric worked with the ADL initiative to define the Sharable Content Object Reference Model (SCORM) and with the Aviation Industry Computer-Based Training Committee (AICC) to define launch and track standards for Learning Management Systems. Eric was a director of the HR Open Standards Consortium (formerly the HR-XML Consortium) from 2010 to 2018; the consortium provides an open, transparent, trusted approach to HR data standards development.
Eric was born and grew up in the UK but moved to the US in his 20s. He led Questionmark, a UK company, to become a multi-million-dollar international software business with a passion for customer, employee and partner success. Eric recently stepped down as Questionmark’s CEO, after 18 years of service, and during his tenure he worked tirelessly to grow customer satisfaction, candidate satisfaction, partner satisfaction, platform features, revenue streams and an international partner channel. Under Eric’s leadership Questionmark moved from selling installable software to licensing the SaaS known as Questionmark OnDemand to enterprise organizations and governments around the world. Questionmark software is widely used in Europe, the US and internationally for e-Assessment in governments, companies, universities, colleges and awarding bodies.
Eric has also been a visionary and innovator, influential in many of Questionmark’s innovations, including collaborative question authoring on the web, scalable and secure delivery technology, secure browsers, and anti-cheating measures. He is co-inventor on several assessment-related patents and has written or contributed to many white papers, including the very widely read “Assessments through the Learning Process” white paper, which explains different types of assessments and how they contribute to learning.
Anyone who has met or worked with Eric will be aware of his passion for generating success for those around him and for ensuring that assessments are fair, reliable and trustworthy.