
Marking and Assessment projects 2024 – a crowded space

There is sudden, urgent interest in improving the systems which support assessment in the University, possibly related to the considerable impact of the Marking and Assessment Boycott (MAB) by the staff union (UCU) last year.

The role of learning technology systems in enabling assessment, and the student experience of assessment (and feedback), is key. Well-designed workflows in systems can relieve pain points and save time, particularly in an institution with many devolved systems and practices. Systems and platforms can also be used to monitor activity and make areas of overload or duplication more visible.

We have a number of projects planned as part of our on-going programmes of Digital Estate Planning (DEP) and VLE Excellence.

Work is progressing at the start of 2024 to map what is within scope, what is outwith scope, and where the technology projects overlap.

Choosing names for these projects is complicated because there are now so many initiatives in the area of feedback, marking and assessment, so I have chosen a set of acronyms which double as a celebration of some of our historic education pioneers.

FLORA (Formal exams, Learning, Online Rubrics and Assessment) for Flora Stevenson, one of the first women in the United Kingdom to be elected to a School Board.

LOUISA (Learn Optimised for In-course Submission and Assessment) for Louisa Stevenson, campaigner for women's university education and co-founder of Edinburgh's Queen Margaret University.

SADIE (Scoping AI Developments in EdTech at Edinburgh) for Sadie L. Adams, influential Black American suffragist.

LAURA (Learning Analytics in ULTRA) for Laura Willson, engineer, builder and working-class hero.

PHOEBE (Portfolios for Online, Experiential, Blogging and Evidence) for Phoebe Blyth, Edinburgh campaigner for women's training and employment.


FLORA Project

Flora Stevenson

FLORA (Formal exams, Learning, Online Rubrics and Assessment)

Ensuring staff and students have an appropriate platform for Exams / Digital Exams

Digital tools to support assessment done under exam conditions (with an open question about what ‘exam conditions’ mean in digital contexts … could be in-person computer lab, online, take-home … and involve different types of restrictions to support academic integrity: locked browser, in-person invigilated, online invigilated, open book, various levels of time restriction).

The reliability and security of digital exam platforms is essential for delivery of high-stakes elements of students' experience at the University of Edinburgh. The current situation and digital estate add complexity, stress, burden and confusion to the workload for both staff and students. It is not sustainable and carries several risks to the University's business of marking and assessment.

Analysis work in the autumn of 2023 looked more deeply into exams, taking testimony from Teaching Office staff across all Schools to build a clearer understanding of what exam provision looks like across the institution. This included the role technology plays in exams, marking and exam boards; the changes anticipated in the use of technology to support them; and the barriers to wider adoption of online exams.

The analysis has shown that as an institution we do not fully understand how many ‘digital exams’ take place as there is no central collation of this data, but we do know that many different types of exams involve a digital element in their workflow e.g. scanning and marking. 

This project through its various work packages will look to better ensure staff and students have access to appropriate platforms for Exams / Digital Exams.  This will include the aim that exams are not taking place on the main virtual learning environment (Learn), but are on separate, robust platform(s) designed to support assessment done under exam conditions.  The project will also examine the reasons behind institutional exam data being disjointed and present options for change.  

Why now?

  • The reliability of assessment platforms is essential for delivery of high-stakes elements of students' experience at the University of Edinburgh. The current situation and digital estate add complexity, stress, burden and confusion to the workload for both staff and students. It is not sustainable and carries a number of risks to the University's business of marking and assessment.
  • A previous procurement failed, but five years on we must try again, with better knowledge and more support from the digital estate strategy governance processes. The market has changed since Covid and we think suppliers are more attuned to UK HE needs.
  • The ISG teams who will lead this work have successfully delivered the VLE upgrade and are ready to revisit this area now. We want to provide good assessment platforms to the University in line with business needs.
  • The project will have three elements:
      • Institutional gap analysis to fully understand the current picture for assessment and exam workflows at the University;
      • Once requirements have been established, the procurement of an exam system can commence if necessary;
      • Additionally, the project will examine, and if appropriate procure, a tool to support the marking process on digital or digitised paper exams.

The impact we expect on people is: 

  • Improving the staff and student experience: staff will find the new services easier and quicker to use, giving them back time for other work, and there should be opportunities for more innovative assessment types where needed.
  • For students, the assessment experience will be better, with more consistency over platform usage giving them the chance to become familiar with the tools; the platforms should be easier to use and more reliable, reducing student stress.
  • Closer working relationships between ISG LTW, the Exams Office and the Timetabling unit, sharing timetabling information about exams and exam types so that support requirements can be pinned down in advance of each diet, which is hard to do at the moment.
  • Mitigating risks around poor experience, poor support, and high-stakes data on ad hoc platforms.
  • Easing the strain on availability of physical spaces for exams/during exam periods.

The project team wanted the name of the project to reflect the work, to be memorable, and to be easy to recognise alongside the other large initiatives across the institution that may overlap with teams across the campus.

FLORA was suggested for the pioneering history of Flora Stevenson (Flora Stevenson – Wikipedia) and because it fits much of the scope of the work we are looking to take forward.

Thank you to everyone working on starting our sister projects, LOUISA and FLORA, for our focus on how our systems are used to support assessment. If you would like to know more about these inspirational women, see Louisa Stevenson – Wikipedia and Flora Stevenson – Wikipedia.

LOUISA Project

Louisa Stevenson

LOUISA (Learn Optimised for In-course Submission and Assessment)

At the November 2023 Learn Ultra project board, we discussed the business analysis that was undertaken via the Learn Ultra project, focusing on assessment and feedback practices across the University, and the workflows which interacted in/with Learn.

The business analysis identified an inconsistent approach to assessment and feedback practices within and across Schools, resulting in unnecessary complexity that impacts both staff and student experiences. Many of these complexities and inconsistencies are of our own making and arise from the culture and distributed nature of the institution. It will be a challenging process of change to tackle these in ways which benefit but do not disrupt the aims of teaching and assessment. I am confident that the engagement we have done and the experience of our previous projects to optimise the learning environment will stand us in good stead.

LOUISA (Learn Optimised for In-course Submission and Assessment) is currently being planned as a 3-year project that will adopt a similar working partnership approach to its predecessors (Learn Ultra and Learn Foundations), working closely with Schools to understand their current workflows and practices and looking at where there are opportunities to improve.

In Scope  

  • All courses on the VLE (on-campus and online along with undergraduate and postgraduate);  
  • All coursework within the VLE;  
  • Creation of consistent approaches to assessment and feedback within the VLE;  
  • Removal of all non-coursework assessments (such as digital exams) from the VLE;  
  • Review, design, and delivery of a suite of training courses to support with assessment and feedback within the VLE;  
  • Review and streamlining of both VLE-native and VLE-integrated assessment and feedback tools to support key assessment practices;  
  • User experience review of current assessment and feedback workflows;  
  • Programme of communications and engagement to gain buy-in; 
  • Learning Analytics  and reporting of assessment and feedback within Learn. 

Out of Scope  

  • Our other learning platforms, such as SCP;
  • Development or procurement of new tools or systems to support with assessment and feedback;  
  • Exams, remote proctoring and invigilation (a separate sister project, FLORA, will look at these);
  • Monitoring of staff performance;  
  • Developments to existing integrations and tools.  

LOUISA will build on the existing knowledge gathered around pain points in relation to assessment and feedback to help understand changes required to enhance and provide a more consistent student and staff experience moving forward.   

Our systems provide a deadline for student submissions and also a feedback return date by which time the staff need to have submitted feedback. All feedback is revealed to students (other than those with extensions) at the same time.  New functionality in Learn Ultra will help to consolidate workflows and reduce our reliance on multiple systems.

  • 3-year project beginning July 2024;
  • Split across three phases:
      • Phase One – July 2024 to July 2025: Business Analysis/User Experience and Early Adopters;
      • Phase Two – July 2025 to July 2026: Delivery of new workflows with at-scale training;
      • Phase Three – July 2026 to July 2027: Embed and Evaluate.
  • Project closure – August 2027.


Watch this space for updates and progress.

on your marks

Oliver Byrne. The Elements of Euclid, 1847 (c) University of Edinburgh http://images.is.ed.ac.uk/luna/servlet/s/0524y8

According to some recent on-the-ground research, this is the list of tools being used by Schools at the University of Edinburgh for summative assessment and marking (below).

Some Schools have tried to standardise their assessment practices as much as possible, making things more consistent for students, markers and teaching office staff. In these Schools, the Learning Technologist, Teaching Office and Course Organisers work together to agree which tools are used, and there is a level of central coordination of this work.

Other Schools have a more devolved way of working and each course may differ in which tools and processes are used. In some cases, the Teaching Office and Learning Technologist have more limited information about course by course assessment practices.

The full cost of running so many different systems will be our next bit of research.


PebblePad ATLAS
MS Forms
Media Hopper Create
MS Teams
Final Year Rotation Feedback Tool
McGraw-Hill MCQs
Unidesk form
Sign-up tool
EMS Placement Feedback