Mind the Gap

Marking and production of feedback is probably the most problematic component of the life-cycle, as it is the area where institutional processes and the functionality of commercially available IT systems are least well matched. We are hearing very clear messages from the sector that existing systems do not adequately meet institutional requirements in these areas.

The University of Manchester, Faculty of Humanities, has produced a very useful analysis of the gaps between current functionality and what systems need to be able to do to support institutional marking processes (see our earlier blog post about their range of marking models). The analysis is of course based on that particular institution’s systems profile, which is:

– VLE: Blackboard 9 SP14;

– Turnitin/GradeMark Building Block: Basic 2.5.28 & Bb Assignment;

– Student Record System: Campus Solutions/PeopleSoft.

The recommendations for enhancement therefore relate particularly to the Turnitin product. We are grateful to the University (and especially Anna Verges-Bausili) for permission to share this analysis (current as of 29/6/14). We have undertaken limited editing and renumbering to facilitate your commentary and we would like to invite your thoughts on how well this analysis matches your priorities.

1. Security requirements (Must have)

1.1 Lock down of marker comments including at Post Date;

1.2 Identification and tracking of any further comments made (user ID, date) after Post Date.

2. Requirements for administrative roles (Must have)

2.1 Ability to identify submissions by ID number

2.2 Ability to identify non-submissions, or a non-anonymised read-only interface to allow administrative staff to identify non-submitters and late submissions.

2.3 Ability to batch download original submissions before Post Date, and to do so by ID number – not assignment title

3. Requirements for academic roles (changes expected from Tii by end of 2014)

3.1 Sampling ability: New function to select and retain assignments anonymised after post date for subsequent moderation/external examiner access.

3.2 Ability to handle large cohorts with multiple markers: enhance the ‘assignments by groups’ facility so that moderation in large cohorts, whether between multiple markers or between GTAs, is no longer a manual and time-consuming process.

3.3 Second marking:

3.3.1  Distinct ‘voice’, i.e. facility to distinguish second marker comments where appropriate, e.g. Teaching Assistant as first marker, with lead academic adding additional comments before Post Date

3.3.2  Ability to record the second marker’s mark

3.3.3  Additional ability to control visibility of 2nd marker comments

3.3.4  Ability to download the second marker’s comments

3.4 Moderation:

3.4.1 Ability to provide online read-only access to all papers or to a selected sample

3.4.2 Ability to add comments addressed to the Instructor/academic team only, i.e. not visible to the student

3.4.3 Version control on grades (add a ‘moderated’ grade that does not override the student’s original indicative grade)

3.4.4 Ability to download moderators’ comments

3.5 Blind second marking features:

3.5.1 Ability to mark second copy of assignment (with no access to first copy).

3.5.2 Ability to download second marker comments

3.5.3 Assignments by Groups view that allows the identification of moderation samples (by grade) and moderation between markers

3.6 Document viewer enhancements for marking:

3.6.1 Ability to lift/cut text from document viewer

3.6.2 Ability to resize document to individually-defined optimal size and to resize comments side bar

3.6.3 Ability to scroll one page at a time, and to use the mouse wheel to move more quickly up/down the document

3.6.4 Search facility within the Document viewer

3.6.5 Ability to save and re-use general documents as well as the ability to use formatting in comments e.g. bold, underline, auto-numbering

3.7 PC offline marking

3.8 Email notification of submission in some cases, e.g. administrators to receive notification of submissions by students with mitigating circumstances.

3.9 Greater downloading flexibility:

3.9.1 to bulk download original student submissions before Post Date, with the downloaded files retaining the individual ID number in the file name

3.9.2 to download spreadsheet of marks before post-date

3.9.3 to download audio feedback files for both student and Institution retention.

3.10 Group assignment capability, i.e. whereby one student in a group can submit to Turnitin and academic feedback is returned to all students in the group.

4. Enhancements (Should have/could have)

Should Have

4.1 Selective word count i.e. word count that excludes references and footnotes.

4.2 Greater file upload limit

4.3 Ability to mark with native applications

4.4 Ability to provide feedback with mathematical characters/formulae

4.5 Compatibility with speech recognition software.

4.6 Ability to upload a file with comments

4.7 Process overview interface: ability to identify the stage in the assessment process that student submissions have reached

4.8 External examiner specific requirements: ability to grant external examiners view-only access, ability to identify a moderation sample and/or restrict access to only some submissions in the Assignment inbox.

4.9 Marking with a wider range of tablets (other than iPad) e.g. Android

5. eLearning Reporting Requirements

To give an overview of the uptake of online submission and online marking, monitoring/reporting functionality is needed to identify courses across Schools/Faculties that are using Turnitin, and those courses which are using GradeMark.

4 thoughts on “Mind the Gap”

  1. Gill Ferrell Post author

    Here’s another list of what’s missing from existing assessment management systems (sent to us by Rachel Forsyth, MMU):

    • Pulling in data from existing systems which specify assessment arrangements (e.g. our unit outline database or our coursework receipting database) – not making people do things twice
    • Managing ephemeral and examination assignment types
    • Managing objects which aren’t paper
    • Facilities to distribute assignments for moderation in a user-defined way
    • Logging moderation activity
    • Facility to allow anonymous marking.
    • Facility to collect feedback for re-use e.g. sending to personal tutors, collating for unit leader, searching for keywords
    It would also be nice if the student record system could cope with doing statistical analysis on marks – though maybe some do already; I’m only familiar with ours – and that should be in the SRS, not in an assessment management system.
    A good system would allow for the institution to determine its own moderation practices (which might vary at the level of a department, as the QAA requirement is only to have a policy in place, not that it has to be institution-wide) and of course would allow for an interface to different VLEs and Student Record Systems.

  2. James Trueman


    I will forward a separate list for inclusion – although am making two comments here.

    1) ‘Marking and production of feedback’ has been highlighted as probably the most problematic component of the life-cycle. I am not sure that is the case for us. In our experience, whilst it is the dominant academic process – and rife with challenges which must be addressed (e.g. double marking, multiple academic access for moderation etc.) – the most problematic areas centre on the administration of the assessment through the life-cycle.

    Our LMS has poor integration with our assessment system (Turnitin), which accounts for some of our issues. As I understand it, even those systems which do have integration suffer in some areas (see point 2). We want to be able to create an assessment once in the system – and associate students to that – and from then on, the system should be able to set up the appropriate submission, extension/late submission and resubmission points automatically. Students with learning needs should automatically be treated as necessary, changes to deadlines should automatically filter through the EMA system, and students who have had mitigation claims upheld should automatically be flagged (at the minimum) or given access to the correct new submission point (as a target behaviour). Grades and feedback need to be automatically extracted from the ‘marking’ platform and transferred to the SRS. The manual involvement in managing these processes (even in integrated systems) is very costly, and open to human error.

    2) I note that ‘extensions’ do not feature anywhere here. Whilst I am aware that institutional policies vary, I am not aware of any institution that truly does not offer a ‘late submission’ process – and therefore the system needs to be able to deal with this in a more sophisticated way. A dominant player (Turnitin) does not (I do not know about Blackboard’s and Moodle’s own marking systems).

  3. Gill Ferrell Post author

    James (and others), to what extent do you think the ability to deal with extensions is a technical issue? The message I have been picking up from a lot of comments in our online questionnaire is that lack of clarity about institutional processes is the main cause. This is some draft text on the subject and I welcome your thoughts.

    As regards institutional processes, there are many issues around managing extensions and extenuating circumstances. Some organisations believe they have clear institutional policies but find that interpretation of those policies varies widely between departments. The variability in how this is approached within and between institutions makes it difficult for system suppliers to build in functionality to apply coding for managing extensions and extenuating circumstances and/or penalties for late submission.

  4. James Trueman

    There are two issues here, 1) varying policies / institutional process, and 2) technological challenge.

    1) I understand that there appear to be variations of practice (and/or interpretations) or lack of clarity – and institutions may feel that this is a difficulty in their experience. That is broadly not the case at my HEI, as we are challenged more by the basic ‘technical’ ability of the EMA we are using (Turnitin).

    We have a simple submission / extensions policy – submit by the deadline or request an extension to your deadline (before it passes). A central team of Student Advisors manage that extension request process, which limits variation (accepting we are human not robots), and the regulations are clear that anything submitted late, without an extension, fails (the student does also have access to mitigation after submission). This is a very firm stance, but it is what our regulations express.

    About 8 years ago we did have a graduated penalty system, and I believe because of perceived inconsistencies of application, this was replaced by the current system. However, we changed the system in a purely ‘manual era’. Within the context of an EMA, there could be / should be less variation. Student submits – academic grades (or not at all if auto fail is applied) – EMA manages any penalty based on regulations and conditions set (or claim is heard by designated panel – see below). Variation should drop through the floor?

    2) With that in mind, I appreciate that system suppliers may feel challenged to offer a technical solution that is flexible to cater for all HEIs, but at the same time, I disagree with this.

    Many HEIs use Turnitin – and the GradeMark feature. Currently, Turnitin contains students in groups – and applies a single deadline to all students – i.e. individual deadlines (e.g. extensions) are simply not possible in this system – the same deadline applies to students in the group they are enrolled to. That is the single most frustrating issue we face – and the reason why this is a technical issue.

    As many (most?) institutions integrate with Turnitin, their technical challenge in this area is the same. The ‘assessment’ platform they are interacting with does not recognise that individual students may need to have different submission deadlines.

    Most HEIs offer one form or another of late submission system – as I understand, most likely summarised as A) Extensions, B) Penalty, and C) Late submission with claim.

    A) Therefore, in the first instance, the EMA system needs to be able to easily and responsively manage students with individual deadlines. As indicated, currently GradeMark cannot do this at all. From a quick check, it appears that from Moodle 2.3 individual extensions could be offered, and I think adaptive release in Blackboard offers a mechanism (not sure how easy that is?). Ultimately, however, managing submission deadlines is bread and butter practice – and should be so easy (technically) that we do not have to think about it (or really do anything about it – i.e. extension awarded on SRS, and the EMA system adjusts automatically). From what I am told, this is not the case.
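    The target behaviour described above – an extension recorded once, with the EMA adjusting automatically – amounts to a simple lookup. A minimal sketch in Python; the data structures and names are hypothetical (in practice the records would come across from the SRS):

```python
from datetime import datetime

# Hypothetical records: a single group deadline, plus any approved
# per-student extensions pushed across from the SRS.
ASSIGNMENT_DEADLINE = datetime(2014, 5, 1, 16, 0)
APPROVED_EXTENSIONS = {
    "student_042": datetime(2014, 5, 8, 16, 0),  # one-week extension
}

def effective_deadline(student_id: str) -> datetime:
    """The deadline that applies to this student: their approved
    extension if one exists, else the group deadline."""
    return APPROVED_EXTENSIONS.get(student_id, ASSIGNMENT_DEADLINE)

def is_late(student_id: str, submitted_at: datetime) -> bool:
    """A submission is late only relative to the student's own deadline."""
    return submitted_at > effective_deadline(student_id)
```

    The point of the sketch is that per-student deadlines are a trivial override of the group deadline – the difficulty James describes is that the platform only models the group-level value.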

    B) In the second case, the EMA should be able to apply a penalty on a late submission. It seems two main versions of this exist, 1) Grade reduction and 2) Capping. Grade reduction slowly drops a percentage off of the overall grade either on a daily or weekly basis – usually up to a second deadline. Capping practice seems similar – and can be summarised as an automatic reduction to a set grade, if submitted within a specified time. I understand capping is also operated with a claim system (before or at submission).
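    The two penalty models described here – grade reduction and capping – are straightforward to express in code. A minimal Python sketch; the per-day deduction, cap value and capping window are illustrative assumptions, not taken from any particular institution’s regulations:

```python
import math
from datetime import datetime, timedelta

def reduced_grade(grade, deadline, submitted_at, per_day=5, floor=0):
    """Grade reduction: deduct `per_day` marks for each day (or part-day)
    the work is late, down to a floor."""
    if submitted_at <= deadline:
        return grade
    days_late = math.ceil((submitted_at - deadline) / timedelta(days=1))
    return max(grade - per_day * days_late, floor)

def capped_grade(grade, deadline, submitted_at, cap=40, window=timedelta(days=7)):
    """Capping: work submitted within `window` of the deadline is capped
    at `cap`; anything later fails (returns 0 here)."""
    if submitted_at <= deadline:
        return grade
    if submitted_at <= deadline + window:
        return min(grade, cap)
    return 0
```

    Either rule is a pure function of the grade, the (per-student) deadline and the submission time, which is why it seems reasonable to expect an EMA to apply it automatically once the institution has chosen its parameters.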

    I’m not sure how the main platforms deal with this – but it is something an EMA system should have as its core functionality.

    As far as this being too variable for system providers to offer, I have seen a Jisc funded project that developed through two institutions. Within that was the ability to allow individual late submissions, and set penalties and the conditions for these. A relevant ‘marking penalty’ example includes – ‘handed in late without approved extension’ (this of course could be worded more broadly) – a penalty could be auto applied (or manual) and conditional options included ‘one application’, ‘once per day until student submission’, ‘once per week until student submission’, ‘applied once, and the student’s work capped at the set value’. An easy addition could be ‘once per day/week until new deadline’. These options would accommodate many systems I have seen.

    If a relatively small scale project can produce this (in comparison to the significant investment major system suppliers can and do undertake into their platforms), those supplying EMA to HEIs should have this feature set already in the bag, or at least be offering much more than they are now. Whilst I am sure there are much more complex penalty systems in use, I am sure a reasonably flexible set of parameters could be provided to meet most user types. (Do you have data on all of the ‘late submission’ management systems in use?)

    C) The third approach is late submission with claim – commonly contained within ‘extenuating circumstances’ policies. These seem to operate on the basis that the claim is heard by a designated officer / panel after submission (and commonly, it seems, after marking), and usually the awarded grade goes forward if the claim is upheld. Given that this is a human process, the main requirement of the EMA is to allow late submissions. In cases where a claim has to be submitted with the assessment, some HEIs require that the student upload it with the assignment. As long as the EMA offers more than one file per ‘assignment’, the HEI can set up the system at their end to accommodate this (this can definitely be done in Moodle, but is not currently available in Turnitin via the web – although it is a new feature they are going to release).

    To bring these two points back together: it seems that any lack of clarity or variations in internal ‘process’ issues can be addressed by the HEIs – but the HEIs cannot make the required ‘technical’ changes to a proprietary EMA platform which they are trying to use. Indeed, whatever ‘process’ issues challenge institutions with respect to late submissions are redundant if the EMA will not ‘technically’ allow the HEI to easily and flexibly manage individual student submissions.

    In short, our practice is in our control, the supplied EMA platforms are not.

