EMA in HE: processes draft for comment

This is a draft version of content for a new Jisc guide on EMA processes and systems that will complement the existing guide. Here on the blog we have split the two to help readability. The drafts will be open for feedback to help us improve the guide until 5th Jan 2016, so please do share your thoughts using the ‘comment’ function below.

EMA in higher education: processes and systems 

This guide forms part of our suite of Assessment resources. It is intended to help those of you in HE review and improve the business processes that support your assessment and feedback practice and make the right choices about supporting information systems. We have created this guide for two main reasons:

  • Our research has shown that many institutions struggle to get the most out of their information systems due to the variability and complexity of their business processes
  • Both suppliers and institutions tell us there is a need to see core UK-wide system requirements clearly expressed

What’s in this guide?

This guide contains resources to help people who manage assessment processes from both the academic and the administrative point of view, people who manage the supporting EMA systems, and suppliers of EMA systems (ie systems supporting online submission, marking and feedback of assignments). It includes:

  • A simple overview of the submission, feedback and marking process concentrating on the core tasks that all institutions need to carry out
  • A set of requirements that specify what EMA systems need to be able to do to meet these core UK requirements
  • A set of prompts to help you compare your own processes to our model and analyse where improvements could be made

The guide should be used in conjunction with our description of the assessment and feedback lifecycle in the guide to Transforming assessment and feedback with technology.

What are we trying to achieve?

The purpose of this guide is to help you implement processes that meet all necessary quality standards but which are no more complex than they need to be. Principles underlying the approach are:

  • Ensuring tasks are carried out by the right people. We have identified each task as either a learner responsibility, an administrative matter or a task requiring academic expertise
  • Automating any routine administrative tasks that can be automated. Digital information immediately opens up possibilities for streamlining the manual workload associated with dealing with vast quantities of paper
  • Ensuring processes are undertaken for sound academic reasons. We hope you will use these resources to challenge complexity that exists for no better reason than “We’ve always done it that way”.

We can help you identify what a process needs to achieve, what skill set is needed and where information systems can help. What we can’t do is design standard workflows that will suit your institution. Individual workflows will depend on how your organisation is structured and what information systems it uses. You may well have different workflows for different assessment situations and this is fine so long as you are sure there is a valid reason why each variant exists.


What are the common problems?

Variation in approach across the institution

Research for our 2014 EMA landscape review[1] showed that responsibility for assessment and feedback policy and procedure is often devolved to local level within institutions. What this means in practice is that large institutions rarely carry out a particular function by means of a single, institution-wide business process. Different faculties, schools, departments or even programmes each have their own ways of doing things. This level of process variation inhibits the efficiencies and benefits possible through the application of EMA technology, because a series of time-consuming and cumbersome workarounds is likely to be needed to adapt the system to many different ways of carrying out the same activity.

Technology ‘bolted on’

Participants in our research frequently commented on the extent to which new technologies are ‘bolted on’ to old processes without people really taking the time to stand back and consider what the process is really intended to achieve.

In some cases poor process design is due to lack of time and appropriate skills. During the Jisc assessment and feedback programme, a concern was voiced that academic staff often find themselves on a ‘treadmill’ due to poorly designed processes. Their workload is such that they cannot pause to think about doing things differently. They also recognise that they do not have the skills to undertake process review and redesign without some more specialist support yet they know that they cannot improve their pedagogy without better designed processes.

Organisational myths

In other cases a significant part of the problem is the persistence of ‘organisational myths’ surrounding policy and process. The tendency to do things the way they have always been done is perpetuated by a belief that this is somehow enshrined in local or institutional policy. When challenged on existing approaches, academics are often surprised to find that many characteristics of the process are matters of historic choice rather than regulatory issues and, indeed, often surprised at how few regulations there actually are or how easy it is to make changes to perceived blocks and barriers in the regulatory frameworks. Variation in the application of assessment policy across an institution is often down to such myths about what actually constitutes the policy in the first place.

Example: Impact on student experience

Different approaches to carrying out the same task affect not only staff workload but also the student experience. As an example, one institution that had the capacity to accept e-submission of all assignments based on the written word noted the following variations:

  • One faculty accepted e-submission for postgraduates only but then printed out the assignments for marking
  • Some course teams were happy to accept and mark submissions electronically but students were still required to submit a paper copy to meet the requirements of the coursework receipting system
  • One department required students to submit a hard copy for marking and also an electronic copy to be submitted through the plagiarism detection system

Simple EMA process overview: submission, feedback and marking in 10 steps

Do we really believe you can ever reduce this complexity down to 10 steps? That’s the challenge for you. We aim to show that there are 10 core tasks in the process and that by making good use of EMA systems you can cut out much of the manual intervention that consumes resource, introduces error and adds little value to the learning experience.

Shown below is our overview of the submission, marking and feedback process. Click on the image for an enlarged view.

Submission, marking and feedback process

This model covers all types of summative assessment where a mark is given as well as feedback. It also covers iterative processes where students might undertake formative checking of their own work using text matching tools to review academic integrity, might undertake self or peer review and might be required to show evidence that they have engaged with their feedback before their mark is released.

Why are there 11 task boxes rather than 10? Call it cheating, but we decided that ‘Apply penalty or mitigation’ is a single task that might occur at different times depending, for example, on whether the penalty was for late submission or academic misconduct.

This is a high level overview. There is no right or wrong way to draw a process map – you need to choose the level that is right for you. For example, you might choose to break some of the sub-processes down further. Find out more in our guide on process mapping.

N.B. The guide should ultimately offer options to download as an interactive PDF and as a Visio file to edit

Submission process detailed view

Below is an example of the submission phase broken down into more detail. This time we have identified the role of the EMA system to show which tasks can be fully automated. If you follow the link to the interactive version and hover over the question marks you will see hints and tips about managing the task.

http://ovod.net/wilbert/sundry/ema/submissionSwimLaneModded.html#

N.B. The guide should ultimately offer options to download as an interactive PDF and as a Visio file to edit

Marking and feedback process detailed view

N.B. Interactive view similar to the above to be added

N.B. The guide should ultimately offer options to download as an interactive PDF and as a Visio file to edit

Marking and feedback process i2

Improving EMA processes

We suggest you use our 10 step model (and the detailed breakdowns) to compare against your own practice and ask yourself the following questions:

  • Are you doing additional tasks – if so, why?
  • Are the tasks being done by the right people eg do you have academic staff undertaking administrative duties that do not require academic judgement?
  • Do you have systems that could carry out some of the tasks you are doing manually?
  • Do you have multiple ways of performing the same task – if so, why?

You can find out more about ways to help improve your processes in our guide to process improvement.

Our guidance on process mapping includes some simple and user friendly techniques that can be used with all types of stakeholders to analyse issues with current approaches and to suggest and evaluate ideas for change. In particular, some of the techniques you might consider are flow charts with swim lanes, rich pictures and RAEW (Responsibility, Authority, Expertise and Work) analysis; details on all of these can be found at the end of the process mapping guide.

You can also find out how some universities have benefited from so-called Lean approaches at the end of our section ‘What is a process?’

Examples from across the sector

Here are some further examples of how different universities have approached the mapping of assessment and feedback processes to help them review practice:

N.B. Examples will be linked from the guide.

  • The University of Oxford’s department of continuing education used a swim lane approach, developed in Excel, with all actors identified. CASCADE workflows
  • The University of Hull has created a more detailed version of our overview with its individual systems identified. Hull Ideal process
  • The University of Sheffield’s department of town and regional planning used before and after process maps to review assignment submission. This is an example where the department identified a continued need to support marking on paper. Sheffield TRP Assignment Submission Process – Sheffield TRP Assignment Submission Process – To Be
  • University X has used our lifecycle to produce a roadmap for an ambitious development programme over a three year period.

 

[1] Ferrell, G. (2014) Electronic Management of Assessment: a landscape review
