EMA supporting assessment for learning at the University of Hertfordshire

Background and context

The University of Hertfordshire is a large HEI with a student community of over 27,000, including more than 2,800 international students from 85 different countries. The University prides itself on being student-focused, with an established reputation for using technology to support excellence in learning, teaching and assessment. Participation in the Jisc assessment and feedback programme (2011-2014) provided a timely opportunity to build on the institution’s existing expertise and to take forward various pilot activities in the use of technology enhanced assessment. This case study takes a broad look at the University’s EMA activities in recent years, drawing heavily on the outcomes of Jisc-supported work, particularly the Effecting Sustainable Change in Assessment Practice & Experience (ESCAPE) and Integrating Technology Enhanced Assessment Methods (ITEAM) projects.

EMA implementation

The University supports the creation of a culture where technology enhanced assessment underpinned by good pedagogic practice is the norm. Four drivers in particular have defined the need for projects to address various aspects of EMA:

  • The critical role assessment and feedback plays in supporting learning, developing students’ self-regulation and ultimately enhancing student progression and success.
  • Students, both nationally and locally, identifying assessment and feedback as the least satisfactory aspect of their university experience.
  • The likelihood of increased student expectations related to their education and academic support following the introduction of higher fees.
  • The increased focus on resource efficiency and a need to understand how technology enhanced solutions can be both educationally effective and resource efficient.

The overall approach to EMA related projects is guided by two main imperatives:

  • institutional alignment and
  • disciplinary and context relevance.

The University has undertaken interventions in many parts of the assessment and feedback lifecycle.

  1. Specifying

The University has been a leader in the field of principle-led change and in the move from assessment of learning to assessment for learning. The University has an agreed set of 6 Assessment-for-Learning (AfL) principles which provide the pedagogic framework for assessment design. They are:

  • Engages students with assessment criteria
  • Supports personalised learning
  • Focuses on student development
  • Ensures feedback leads to improvement
  • Stimulates dialogue
  • Considers staff and student effort

In June 2012, the AfL principles were incorporated into programme validation documentation to form an important cornerstone of the validation process. Embedding the principles in this way ensures that discussions about assessment happen early on in the curriculum design process and are scrutinised by the validation panel before the programme is signed off. Members of the Learning and Teaching Institute (LTI) are involved in the validation process to give support to programme development teams on all learning and teaching matters including assessment and feedback. Further support is available via centrally run workshops on assessment which are available at programme and module level.

Information about the AfL principles and how they were developed can be found in resources produced by the ESCAPE project (Effecting Sustainable Change in Assessment Practice & Experience). The University has also produced some excellent guidance materials on implementing the AfL principles.

  2. Setting

The principles of focusing on student development and ensuring that feedback leads to improvement can only be implemented when the overall assessment timetable permits this type of longitudinal development. The University has tackled issues such as a lack of formative opportunities, and feedback occurring too late to inform future assignments, by producing a ‘modelling tool’ that has proved useful in reviewing assessment practice and particularly in identifying issues with the overall assessment timetable. The assessment timelines tool is used to model patterns of high, medium and low stakes assessment across a 12-week semester: a collection of different assessment patterns is available to download and use, and the consequences of each are explored in guidance materials.

Link to download the assessment timelines tool.
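The kind of modelling the timelines tool supports can be sketched in a few lines of Python. The week numbers, stakes labels and ‘bunching’ check below are invented assumptions for illustration, not the tool’s actual data or logic:

```python
from collections import Counter

# One hypothetical assessment pattern: (week, stakes) pairs across
# a 12-week semester.
pattern = [(3, "low"), (5, "medium"), (7, "low"), (9, "medium"), (12, "high")]

def summarise(pattern):
    """Count assessments per stakes level and flag 'bunching', i.e.
    weeks in which two or more assessments fall due."""
    by_stakes = Counter(stakes for _, stakes in pattern)
    week_load = Counter(week for week, _ in pattern)
    bunched = sorted(week for week, n in week_load.items() if n > 1)
    return by_stakes, bunched

by_stakes, bunched = summarise(pattern)
print(dict(by_stakes))  # how the stakes are distributed
print(bunched)          # weeks where assessments collide (none here)
```

A tutor reviewing a module could compare several such patterns side by side, for instance to check that low-stakes formative tasks precede the high-stakes end-of-semester assessment.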

One of the AfL principles that applies particularly to the setting of assignments for a particular instance of delivery of the curriculum is that assessment considers staff and student effort. The University has produced a tool known as the Assessment Resource Calculator, which illustrates the time impact of different assessment strategies in relation to student numbers. It might help a tutor to decide, for example, whether a group presentation with peer and tutor evaluation would be more or less time consuming than an essay for a particular group size. The calculator can only inform the process of making academic judgements: the tutor would still need to decide whether one 4,000-word essay might be more or less pedagogically effective than two 2,000-word essays.

Link to download the assessment resource calculator.
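A back-of-envelope version of the kind of comparison the calculator supports might look as follows. All of the timings (minutes per script, minutes per presentation, group size) are invented assumptions for the sketch, not figures from the University’s tool:

```python
def essay_marking_time(students, mins_per_script=30):
    """Total tutor marking time in hours for one essay per student
    (assumed 30 minutes per script)."""
    return students * mins_per_script / 60

def group_presentation_time(students, group_size=5, mins_per_group=20):
    """Total tutor time in hours to watch and assess group presentations
    (assumed 20 minutes per group of 5)."""
    groups = -(-students // group_size)  # ceiling division
    return groups * mins_per_group / 60

# Compare the two strategies at different cohort sizes.
for n in (30, 120):
    print(n, essay_marking_time(n), group_presentation_time(n))
```

Even this toy version shows why the time impact depends on student numbers: essay marking scales linearly per student, while presentation time scales per group.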

The University also encourages clarity about assessment and grading criteria as part of the setting process (also of relevance to the following stage of the life-cycle ‘supporting’). One of the ITEAM project objectives was to embed the use of grading criteria in the online submission system and promote their use more widely across the institution. The project produced an online interface that allows academics to select the appropriate grading criteria for each assessment (a significant change to the previous interface) and supported the development of School level grading criteria ready for upload onto the revised system. The use of grading criteria sets at School and/or programme level ensures alignment between feedback statements and numerical grades and therefore improves the quality and consistency of assessment feedback.

Link to an exemplar set of grading criteria.
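The alignment between feedback statements and numerical grades that a criteria set provides can be illustrated with a small sketch. The band boundaries and statements below are invented for illustration and are not the University’s actual grading criteria:

```python
# Hypothetical grading-criteria set: (lowest mark in band, statement),
# ordered from highest band to lowest.
GRADE_BANDS = [
    (70, "Excellent: critical, well-structured argument, fully referenced."),
    (60, "Good: sound argument with minor gaps in analysis or referencing."),
    (50, "Satisfactory: descriptive in places; analysis needs development."),
    (40, "Pass: addresses the task but lacks depth and accurate referencing."),
    (0,  "Fail: does not meet the learning outcomes for this assessment."),
]

def feedback_for(mark):
    """Return the criterion statement aligned with a numerical mark,
    so that feedback wording and grade cannot drift apart."""
    for threshold, statement in GRADE_BANDS:
        if mark >= threshold:
            return statement
    raise ValueError("mark must be non-negative")

print(feedback_for(65))  # statement for the 60-69 band
```

Storing the set once at School or programme level, as the ITEAM project did in the online submission system, is what guarantees that every marker draws on the same statements.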

  3. Supporting

An important part of support for students is increasing the number of formative assessment opportunities, especially those where feedback is sufficiently timely to inform work for future assignments. The University of Hertfordshire is using technology to support formative learner development in a variety of ways, particularly through the use of electronic voting systems (EVS).

EVS (also known as a response system or clickers) is a classroom-based technology which can be used to support learning, teaching and assessment in a variety of ways. The technology comprises handsets, a receiver and a software interface which uses a PowerPoint™ add-in to enable the creation of question slides. Questions are written in the format of choice, e.g. multiple choice, Likert scale or True/False statements, and delivered as part of a classroom-based session with as many or as few questions as desired. Students issued with handsets can submit their responses when the polling option is ‘open’. The teacher controls the pace of the session and the display of results.

EVS has been used at the University of Hertfordshire for around eight years, with 120 EVS-ready classrooms. EVS is used to support a multitude of teaching strategies, including the assessment of knowledge and understanding, exploring values and beliefs, seeking consensus, mediating debates and facilitating peer assessment.

Advantages for students include the anonymity that EVS brings; this is particularly useful for those students who are less confident, articulate or language-proficient than their peers. It ensures the whole class has an opportunity to engage in learning activities as well as promoting two-way interaction between teacher and student. The other important advantage for students is the speed at which feedback can be delivered for questions with right and wrong answers. This tells students exactly what they are doing well and where they need to revise. The immediacy of the feedback also gives teachers valuable information about class performance enabling them to adjust the session content according to the responses given.

Resources produced by the University include:

  • An EVS lifecycle which provides a reference tool for introducing similar technologies. It considers resource efficiency through centralisation of processes e.g. handset registration and centralised procurement. Flow charts are used to illustrate specific parts of the life cycle e.g. managing loss and return of handsets.
  • A set of EVS Top Tips cards for staff.
  • A set of case studies showing how the use of EVS supports particular pedagogic principles and assessment activities in different disciplines.
  • Guidance on EVS use in relation to inclusivity and disability.

Editor’s note: This is an area where technology is advancing rapidly, and the use of smartphone technology may eliminate the need for dedicated EVS handsets in many institutions in the near future, but the points about supporting the implementation of specific assessment for learning principles, and the benefits for students, remain the same.

  4. Submitting
  5. Marking and production of feedback
  6. Recording grades
  7. Returning marks and feedback

Online submission, feedback and marking are mandatory in some Schools, although there is no University-wide policy on this. The University is relatively unusual in having an in-house managed learning environment (StudyNet) as well as its own in-house systems for online submission, marking and feedback, hence we have covered its experiences of these elements of the life-cycle in only very general terms. The inclusion of grading criteria in the online submission system was one of the main developments of the ITEAM project. In addition, students can now see detailed feedback as a result of a new feedback interface implemented within the University’s online submission system.

  8. Reflecting

The University aims to support both staff and student reflection by integrating student engagement and performance data into a Student Progress Dashboard (SPD), providing timely, holistic reports for each student and their personal tutor about the student’s engagement and performance on all the modules they are studying. The SPD uses grades from assessments submitted online and the number of hits within a module site to give an indication of the student’s overall performance.

There are separate staff and student views. The staff view enables tutors to look at assessment and engagement indicators for whole cohorts or individual students. A traffic light system gives an ‘at a glance’ view of which students might need additional support as well as those who are doing well. The student view shows students the same indicators in relation to the student’s own performance as well as their performance in relation to their peers. This enables students to see where they may need to seek extra support or guidance.
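The traffic-light logic can be sketched as a simple rule combining the two indicators. The thresholds and the equal weighting of grades and module-site engagement below are assumptions made for illustration, not the SPD’s actual algorithm:

```python
def traffic_light(avg_grade, engagement_pct):
    """Combine a student's average grade from online-submitted assessments
    (0-100) and their module-site engagement relative to the cohort (0-100)
    into a red/amber/green status. Weighting and cut-offs are illustrative."""
    score = 0.5 * avg_grade + 0.5 * engagement_pct
    if score >= 60:
        return "green"
    if score >= 40:
        return "amber"
    return "red"

print(traffic_light(72, 80))  # a student doing well
print(traffic_light(45, 30))  # a student who may need additional support
```

In the staff view such a status would be computed per student across a whole cohort; in the student view the same indicators would be shown against the cohort average, which is why careful signposting to support is needed before full rollout.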

The SPD was developed and piloted with staff and students in the academic years 2012 and 2013, and the dashboard is now available to all programme teams on request. Full rollout to students has been delayed by the need to ensure that staff are fully trained to support students using the dashboard, and to allow further development work to ensure that the student-facing elements are not open to misinterpretation and that signposting to support and resources is readily available.

Most importantly these developments have taken place at a time when the potential for learning and assessment analytics is beginning to be explored by the sector and the SPD will pave the way for further work in this area.

Benefits

The University has seen many benefits from the use of EMA across the assessment and feedback lifecycle.

  • Learning and teaching practice has been enhanced as a result of the awareness raising and dialogue that has taken place around the assessment for learning principles.
  • Good practice has been sustained and embedded by means of the formal adoption of the principles by the University and their integration into academic quality processes.
  • Staff confidence in the use of technology to support assessment and feedback has increased due to engagement in the projects and staff development activities.
  • Consistency and quality of feedback to students has improved through the integration of grading criteria into the submission, marking and feedback system.
  • Greater formative support and opportunities for students have been provided through the use of technologies such as EVS and QMP.
  • Assessment processes are becoming more efficient through the judicious use of technology and these efficiencies can be measured through use of the assessment resource calculator.
  • Assessment processes are becoming more efficient through development of institutional processes to manage procurement, licensing and day-to-day management of the technologies (e.g. lost and replacement EVS handsets).
  • Communication between different service providers and their users has improved through the involvement of a wide range of stakeholders in the projects and this is leading to improvements in overall support for students.
  • The student progress dashboard supports better decision-making in relation to assessment practice as well as targeted support for individual learners.

Find out more:

  • Resources from the ESCAPE (Effecting Sustainable Change in Assessment Practice & Experience) project.
  • Resources from the ITEAM (Integrating Technology Enhanced Assessment Methods) project.
  • Link to a presentation on the efficiency of assessment and a recording of the webinar at which this topic was discussed. N.B. the webinar link requires you to download the Blackboard Collaborate launcher before the recording will play.

Download this case study as a PDF: Case Study Herts i1
