6. Recording grades

Given the complexity of marking processes, it is unsurprising that there is considerable variation in how a definitive grade comes to be stored against a piece of work. Institutional regulations determine who records the grade, how it is verified and in which system it is stored. In many cases, however, the student record system is the definitive source of grading information, and a lack of interoperability between this system and other components of the institutional EMA toolset can be the source of many problems.

Transcription errors are nothing new (7s and 1s are commonly confused, for example), but they are unlikely to be eliminated whilst marking is still frequently done outside the core system (for the reasons discussed above) and/or a lack of system integration requires manual intervention to transfer marks from one system to another.

The problems of manual intervention are often exacerbated by the fact that academics simply do not trust that they will be able to edit central systems as needed, and prefer to keep marks elsewhere ‘under their control’ until all adjustments have been made and the marks have been verified (see the comments below on accidental release of marks for reasons why these concerns are valid). In many cases the moderation process is carried out on shared drives and by exchanging emails back and forth, but we heard of one instance where academic staff had opted (against university policy) to use the student record system for the moderation process, as this was perceived to be the only suitable shared area available to them: changes to make the student record system more open to students for other reasons thus had the unforeseen effect of enabling students to watch the moderation process in real time.

N.B. A product new to the UK market, the Canvas VLE, takes a deliberately open pedagogic approach: student names are visible to markers by default and students can see grades and comments as soon as they are entered. Users often forget to turn these options off in order to comply with local policy.

The ways in which systems record and store marks can also cause issues for institutions whose grading schemes do not match the way the software is configured. The QAA (2012) states: ‘There is a strong body of opinion that the use of numbers to judge the achievement of learning outcomes is inappropriate.’[1] Yet systems are still set up to expect percentage scores.

‘However, as our policy dictates that letter grading should be used instead of percentage marking (e.g. B2 instead of 65%), this causes extra administrative workload as Turnitin currently does not support letter grades which means that grades need to be added on a spreadsheet manually.’

‘One issue was the fact that marks had to be % rather than pass/fail/refer and this has been problematic.’
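Where the local scheme is letter based but the tool only records percentages, the usual workaround is a mapping table maintained outside the tool. The sketch below illustrates the idea in Python; the band boundaries and grade labels are hypothetical, not any institution's actual scheme.

    # Illustrative only: these band boundaries and labels are hypothetical,
    # not any institution's actual grading scheme.
    GRADE_BANDS = [
        (70, "A"),
        (68, "B1"),
        (65, "B2"),  # e.g. 65% maps to B2, as in the quote above
        (62, "B3"),
        (50, "C"),
        (40, "D"),
        (0,  "F"),
    ]

    def letter_grade(percent: float) -> str:
        """Return the letter grade whose band contains the given percentage."""
        for threshold, grade in GRADE_BANDS:
            if percent >= threshold:
                return grade
        return "F"  # anything below every threshold is a fail

    print(letter_grade(65))  # B2

Keeping the mapping in one shared place at least avoids each marker maintaining a private spreadsheet version of the same conversion.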

There are also concerns about the rounding of numeric marks, and the possibility that double rounding in different systems (a mark rounded to one decimal place in one system and then to a whole number in another) can give a different result from rounding once.
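The arithmetic is easy to demonstrate. The sketch below, using Python's decimal module, rounds the illustrative mark 64.45 once and then twice; the two-step route crosses a boundary that the single rounding does not. This shows the arithmetic only, not the behaviour of any particular product.

    from decimal import Decimal, ROUND_HALF_UP

    mark = Decimal("64.45")

    # Rounded once, straight to a whole-number mark:
    once = mark.quantize(Decimal("1"), rounding=ROUND_HALF_UP)      # 64

    # Rounded twice: the first system stores one decimal place, then a
    # second system rounds that stored value to a whole number:
    stored = mark.quantize(Decimal("0.1"), rounding=ROUND_HALF_UP)  # 64.5
    twice = stored.quantize(Decimal("1"), rounding=ROUND_HALF_UP)   # 65

    print(once, twice)  # 64 65: the same mark, one point apart

One point is enough to move a mark across a classification boundary, which is why it matters where in the chain rounding happens.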

There are various information and records management issues to be addressed when implementing EMA. One of these concerns the need for a comprehensive audit trail throughout the marking and moderation process. In most cases it is not enough to know simply that a mark has been adjusted as a result of moderation: there needs to be an audit trail of the actual marks before and after moderation, and this seems to be a weakness in current EMA systems.

‘The platform really needs to be able to show more clearly what has ‘happened’ to a submission during the assessment cycle.’
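One way to meet the requirement is an append-only event log, in which moderation adds a new entry rather than overwriting the original mark. The following is a minimal sketch of such a record; the field names and stage labels are illustrative assumptions, not the schema of any existing EMA product.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass(frozen=True)
    class MarkEvent:
        """One immutable entry in an assessment audit trail (illustrative)."""
        student_id: str
        assignment_id: str
        mark: int
        stage: str   # e.g. 'first-mark', 'second-mark', 'moderated'
        actor: str   # who recorded the mark
        recorded_at: datetime = field(
            default_factory=lambda: datetime.now(timezone.utc)
        )

    # Moderation appends a new event rather than editing the old one, so
    # the marks before and after moderation are both preserved for audit.
    trail = [
        MarkEvent("s123", "essay-1", 62, "first-mark", "marker-A"),
        MarkEvent("s123", "essay-1", 65, "moderated", "moderator-B"),
    ]
    for event in trail:
        print(event.stage, event.mark, event.actor)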

A number of participants in this research also commented that archiving and retention policy is an issue they need to see addressed in relation to EMA. Previous Jisc work on managing student assessment records, carried out by Northumbria University in 2003, largely predates significant EMA developments. Institutions are finding that, as they manage more submissions and feedback electronically, the assignments exist only in digital format, and there is currently no automatic way to archive the material, so it has to be downloaded manually.
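Until systems offer automatic archiving, the manual download step can at least be paired with some basic records management. The sketch below bundles a folder of exported submissions into a zip file alongside a retention manifest; the folder name and the six-year retention period are illustrative assumptions only.

    import json
    import zipfile
    from datetime import date, timedelta
    from pathlib import Path

    SOURCE = Path("downloads")  # hypothetical folder of manually exported work
    RETENTION_YEARS = 6         # hypothetical retention period

    files = sorted(p for p in SOURCE.iterdir() if p.is_file())
    manifest = {
        "archived_on": date.today().isoformat(),
        "review_after": (date.today()
                         + timedelta(days=365 * RETENTION_YEARS)).isoformat(),
        "files": [p.name for p in files],
    }

    # Store the submissions and the manifest together so the retention
    # decision travels with the archived material.
    with zipfile.ZipFile("assessment-archive.zip", "w") as archive:
        for p in files:
            archive.write(p, arcname=p.name)
        archive.writestr("manifest.json", json.dumps(manifest, indent=2))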

Questions about data ownership were also raised at the Think Tank. Many institutions seem to be unclear about the detail of their licence agreement with Turnitin in particular. The prevailing view is that students own the transcripts but Turnitin owns the originality reports, and there was considerable uncertainty about how much archived data could be recovered if, for example, an institution wanted to move to a different originality-checking product.

MMU guidance on Recording grades.


[1] In particular see Rust, C. (2011) and Yorke, M. (2009).
