FLORA (Feedback, Learning, Online Rubrics and Assessment)
Ensuring staff and students have an appropriate platform for Exams / Digital Exams
Digital tools to support assessment done under exam conditions (with an open question about what ‘exam conditions’ means in digital contexts: an in-person computer lab, online, or take-home, involving different types of restrictions to support academic integrity such as a locked browser, in-person or online invigilation, open book formats, and various levels of time restriction).
The reliability and security of digital exam platforms are essential for delivering high-stakes elements of students’ experience at the University of Edinburgh. The current situation and digital estate add complexity, stress, burden, and confusion to the workload of both staff and students. This is not sustainable and carries several risks to the University’s core business of marking and assessment.
Analysis work carried out in autumn 2023 looked more deeply into exams, taking testimony from Teaching Office staff across all schools to build a clearer understanding of what exam provision looks like across the institution. This included the role technology plays in exams, marking, and exam boards; the changes anticipated in the use of technology to support these activities; and the barriers to wider adoption of online exams.
The analysis has shown that, as an institution, we do not fully understand how many ‘digital exams’ take place, as there is no central collation of this data. We do know, however, that many different types of exams involve a digital element in their workflow, e.g. scanning and marking.
This project, through its various work packages, will look to ensure staff and students have access to appropriate platforms for exams and digital exams. This includes the aim that exams do not take place on the main virtual learning environment (Learn), but on separate, robust platform(s) designed to support assessment under exam conditions. The project will also examine the reasons behind institutional exam data being disjointed and present options for change.
Why now?
- The reliability of assessment platforms is essential for delivering high-stakes elements of students’ experience at the University of Edinburgh. The current situation and digital estate add complexity, stress, burden, and confusion to the workload of both staff and students; this is not sustainable and carries a number of risks to the University’s business of marking and assessment.
- A previous procurement failed, but five years on we must try again, with better knowledge and more support from the digital estate strategy governance processes. The market has changed since Covid, and we think suppliers are more attuned to the needs of UK higher education.
- The ISG teams who will lead this work have successfully delivered the VLE upgrade and are ready to revisit this area now. We want to provide good assessment platforms to the University in line with business needs.
- The project will have three elements: an institutional gap analysis to fully understand the current picture for assessment and exam workflows at the University; once requirements have been established, the procurement of an exam system if necessary; and the examination, and if appropriate procurement, of a tool to support the marking of digital or digitised paper exams.
The impact we expect on people is:
- Improving the staff and student experience: staff will find the new services easier and quicker to use, giving them back time for other work. There should also be opportunities for more innovative assessment types where needed.
- For students, the assessment experience will be better, with more consistency in platform usage giving them the chance to become familiar with the tools. The platforms should be easier to use and more reliable, reducing student stress.
- Closer working relationships between ISG LTW, the Exams Office, and the Timetabling unit, sharing timetabling information about exams and exam types so that support requirements can be pinned down in advance of each exam diet. This is hard to do at the moment.
- Mitigating risks around poor experience, poor support, and high-stakes data held on ad hoc platforms.
- Easing the strain on the availability of physical spaces for exams during exam periods.
The project team wanted the name of the project to reflect the work, be memorable, and be easy to recognise alongside other large initiatives across the institution that may overlap with teams across the campus.
FLORA was suggested in recognition of the pioneering history of Flora Stevenson (Flora Stevenson Wikipedia), and because it fits much of the scope of the work we are looking to take forward.
Thank you to everyone working on starting our sister projects, LOUISA and FLORA, for their focus on how our systems are used to support assessment. If you would like to know more about some inspirational women, see Louisa Stevenson – Wikipedia and Flora Stevenson – Wikipedia.