Have your say!

At our workshop in December some of you worked with us to develop a range of solution ideas to tackle the prioritised challenges, which were refined down into five concept areas. At our second workshop in January more of you collaborated with suppliers to work up each of these further. We would now like to share these ideas with you here so that you can vote and comment on which you feel would have most impact for you, and the sector, if delivered. Your comments will feed into the process of deciding which ideas will be taken forward.

Please add your comments to this post, along with the number of the idea that you feel would have most impact if taken forward!

1. EMA Requirements Map
To address the challenge that systems don’t always fully support the variety of marking and moderation workflows in place: this is a project to identify, validate and specify the sector’s key EMA requirements and workflows. The aim here is to provide clarity and transparency around assessment and feedback workflows (looking at the whole assessment and feedback lifecycle, but particularly around the period from submission to return of grades). This will help assessment systems suppliers better design systems that support good pedagogic practice as well as helping institutions review their own practice. The project would seek to identify common workflows and significant variables in collaboration with universities; consolidate and further analyse the workflows; develop a visual way of presenting workflows; map current systems to these workflows; and engage with suppliers and developers to fill the gaps.

2. Feedback hub
The development of a system independent, virtual tool/plug in for aggregating, organising and presenting feedback (and marks) at a programme level, wherever they may sit, for both staff and students. The tool should also enable interaction around the feedback between staff and students.

Students would benefit from an aggregated view of their feedback to support self-reflection on progress; lecturers would see a more holistic view of students’ progress and be able to better understand an individual’s progression and better identify where intervention or support was needed. This holistic view could also enable more effective and efficient tutorial and supervisory processes.
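To make the idea concrete, here is a minimal sketch (in Python, purely illustrative — the field names and structure are assumptions, not a proposed standard) of what the hub's core aggregation step might look like: feedback records pulled from several source systems are merged into one chronological, per-student view.

```python
from collections import defaultdict

def aggregate_feedback(sources):
    """Merge feedback records from several systems into one per-student view.

    `sources` is a list of iterables of dicts with keys:
    student, module, date, feedback, mark (mark may be None).
    All field names here are illustrative only.
    """
    hub = defaultdict(list)
    for source in sources:
        for record in source:
            hub[record["student"]].append(record)
    # order each student's feedback chronologically to support reflection on progress
    for student in hub:
        hub[student].sort(key=lambda r: r["date"])
    return dict(hub)
```

In practice the harder problem is the "wherever they may sit" part — extracting records from each VLE, e-portfolio or student records system in the first place — but the programme-level view itself is a straightforward merge-and-sort once the records are in a common shape.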

3. Reliable submissions
To tackle the documented problems associated with system failures at critical submission points it was suggested there is a need to decouple the physical act of submission from the workflows within other EMA systems. This solution idea proposes the development of a submission tool (customisable by institutions) which includes a front-end asynchronous submission and receipting service, with back-end post-submission processing, so that submissions can be acknowledged and held until other functions are in a position to proceed. Policies, procedures, guidance and examples need to encompass the workarounds to deal with points of failure.
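As a sketch of the decoupling being proposed (illustrative only — class and method names are assumptions, and a real service would persist submissions durably rather than hold them in memory): the front end issues a receipt immediately, and held submissions are forwarded to downstream EMA workflows only when those systems are ready.

```python
import queue
import uuid
from datetime import datetime, timezone

class SubmissionService:
    """Front-end receipting decoupled from back-end processing (illustrative sketch)."""

    def __init__(self):
        self._pending = queue.Queue()  # holds submissions until downstream systems are ready
        self.receipts = {}

    def submit(self, student_id, filename, payload):
        """Accept and acknowledge a submission immediately, regardless of back-end state."""
        receipt_id = str(uuid.uuid4())
        received_at = datetime.now(timezone.utc).isoformat()
        self._pending.put((receipt_id, student_id, filename, payload))
        self.receipts[receipt_id] = {"student": student_id, "received_at": received_at}
        return receipt_id  # the student's proof of on-time submission

    def process_pending(self, forward):
        """Drain held submissions into downstream workflows once they are available."""
        processed = []
        while not self._pending.empty():
            receipt_id, student_id, filename, payload = self._pending.get()
            forward(student_id, filename, payload)  # e.g. push into the marking system
            processed.append(receipt_id)
        return processed
```

The point of the pattern is that a failure in the marking or records system never blocks the act of submission itself: the receipt is issued synchronously, and everything after it is asynchronous.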

4. EMA systems integration web resource
To address the problems of the lack of interoperability between marking systems and student records systems and subsequent need for ‘workarounds’ by staff, a resource was proposed which would help institutions find solutions to EMA systems integrations issues, which could relate to both workflows (see above ‘EMA requirements map’) as well as actual existing integrations (use cases including advantages, limitations etc. and code). Where there are gaps identified through exploring existing integrations, this resource would enable them to be surfaced and prioritised for potential development. A ‘community of practice’ would support the resource.

5. Assessment and feedback toolkit
A web-based ‘toolkit’ of searchable resources (case studies, stories, staff development resources, tools etc.) based around the assessment and feedback lifecycle in an interactive form. The resource would aim to provide examples of solutions to assessment and feedback problems, enabled by technology, based on pedagogy and underpinned by research, and be able to be re-purposed for local contexts. It would aim to address the question of ‘what does good assessment design look like?’ and to enhance the assessment literacies of staff and students.

16 thoughts on “Have your say!”

  1. Phil Vincent

    5. Assessment and feedback toolkit

I think a toolkit like this would certainly be the most beneficial to us, particularly something that can be taken, adapted/modified etc. under a CC licence, so that institutions can add their own contextual info about policies & procedures etc.

I think it would be relatively quick & easy to crowd-source examples of good practice and bring this all together.

    Phil

  2. Ian Barrett

    Thanks for organising this. There are several valuable potential projects here, and I hope more than one can be taken forward. My vote is definitely for (1) though – I think this is essential for reframing the relationship between the sector and its suppliers in this important area.

  3. Phil Scown

    If we take a business analysis view of “assessment” we can see it’s both strategically important and complex. This makes it something that shouldn’t be outsourced, and requires well-designed processes to support the central human activity. Relying on a complex central EMA system is akin to outsourcing by proxy. Such outsourcing makes us vulnerable to the weaknesses of the supplier, and introduces an additional communication layer, further reducing both the flexibility and reliability of the system as a whole. Higher Education assessment needs to be very flexible to adapt to new knowledge, and it requires creativity. Over-reliance on an external supplier, or a “big system”, will make the work of assessing academics more difficult, as additional time delays and processes will be introduced. These are known to dampen responsiveness and creativity.

    There is an additional problem where external suppliers are involved. Their job is not to create assessments, or to come to decisions about assessment, but to make money from assessment. Their perspective is different to ours. It is in their interests to state that there is a technological solution to our problems, and that they can supply it, for a fee. Once an institution is committed to a particular system it has little room for manoeuvre, as the cost of change is too great – and any alternative may be no better.

    As an individual academic I see the best contribution from technology as simply the administration of marks. Complex spreadsheets if you like. Anything more makes the assessment process worse, not better. Making an EMA system seamless, highly reliable and usable would result in uneconomically expensive systems. Since providing this is beyond the technical capability of suppliers, and affording it is (rightly) beyond the will and resources of HE institutions, we shouldn’t introduce them. The suppliers might argue that institutions could afford this if they collaborated. This would result in an expensive system on a one-size-fits-all basis; not flexible enough. Profitable for the suppliers though.

    Assessment is central to what we do. It’s a creative process requiring flexibility. Let’s not deskill it by forcing academics to wear the straitjacket of expensive, under-performing technology.

    1. Terry McAndrew

      I have to disagree. Surely there are many more opportunities than just the administration of marks – just because it’s a technology does not mean we cannot be creative. There are opportunities for rapid formative testing, confidence-based testing, adaptive testing, accessible testing, collaborative testing, re-usable testing etc., which can give students a more complete and tailored experience. What is lacking is academic investment in getting more of these opportunities to students, probably because staff are busy with day-to-day teaching.
      It is vital for staff to engage with students’ answers directly to get feedback on how their teaching is being received – machine marking is not a final solution, but assessment at scale through technology can liberate more time to spend with students, who are then more aware of the quality of their knowledge before high-stakes exams.

  4. Shane Sutherland

    My vote is for 2 – the Feedback Hub…

    However, the development of a front-end tool is problematic because of all of the issues around contrasting workflows (with the system it is integrated into), the very broad range of use-cases and contexts, and the (always underestimated) issues of UI and UX design. I would (from a partly partisan perspective) urge you to use Jisc’s technical expertise and influence to support tool development and integration by concentrating on standards, services and sharing – rather than become tool developers yourselves.

    The development of Open APIs, using Jisc’s influence to negotiate/persuade vendors to actively participate/contribute – and all of the associated project management, evaluation, dissemination etc – would make this an interesting project to want to be involved in.

  5. mark wetton

    My vote would be for option 2. Presenting feedback to students in a holistic way, and in novel/visual ways (we have a digital postcards pilot in our medical school), would be fantastic.
    The workflow modelling required for option 1, beyond the existing high-level model that MMU and JISC have developed, can be quite course/institution specific, so I’m not sure how useful it would ultimately be beyond sharing/reflection.
    Great to see the initiative, thanks Lisa and colleagues!

  6. Sheila MacNeill (@sheilmcn)

    oh difficult choice. I think immediately 3 would be useful as would 4 and 5 – and indeed all of them.

  7. Rachel Forsyth

    Much as I would like to see more systems development of the type presented by options 1 and 4, I think the biggest impact on student and staff experience would come from Option 2, so that’s my vote. I also think this is implementable in various ways – the work done at UCL IoE on the Assessment Careers project is a great starting point.

  8. Andrew Dalby

    I think that most of the first four tasks can be achieved with existing tools if they are used imaginatively. For example, I use a spreadsheet and mail-merge along with a standardised feedback form to manage feedback to students for electronically submitted work.

    Where we need to make steps forward is making the assessment relevant. I am moving to much more practice-based assessment, which means many more case studies and simulations, and these need completely different solutions. For maths I do online tests, but I need tools that allow these to become multi-part to improve the pedagogy, so that I can see which steps students are struggling with, rather than breaking problems into sub-parts and asking about them individually.

    So for me 5 is the big issue as the rest we can work around – although a way of doing all the tasks easily, without having to build workaround solutions, would be even better.

  9. Peter Ayre

    Hi,
    I did try to attend your meeting but it was too late when we found out about it.
    We at City College Norwich have developed our own system for doing this with our HE courses managing assessment setup, work submission, marking, verification and all the way through to calculating degree results. And just like any successful standard we have ended up with more than one so we manage most of our FE in a different way. This was a huge learning curve which was expanded again when we provided our system to Colchester Institute.
    In the end there really are two different areas of trouble, internal and external.
    Externally the marking and feedback requirements from our Degrees, HNDs and Access courses are all very different. For example Access courses require that we give the dates of verification to students which others did not. And many of the things that we considered normal in Norwich turned out to be quite different in Colchester. We do not usually have control over these differences so the question is about how we manage them.
    Internally there was a serious problem with different cultures that has been helped a lot by unifying them under a single system. Too many different ways to get the same result led to a varied experience for students, not all of it good. Indeed a promise to the students was what kick-started our changes.

    All of the options presented would be useful and from my experience read like a map of some of the problems we had to overcome. However I don’t see how you can produce item 5 without first doing the work for item 1 and I feel that collecting requirements alongside a list of resources that assist would be a good start.

    We would be more than happy to share our experiences or demonstrate our approach if that is of interest.

  10. Stuart Hepplestone

    On behalf of the Assessment Journey Programme at Sheffield Hallam, we have an interest in all of these. Number 5 – the toolkit – in particular would be beneficial from a practice perspective. Technically the requirements map, system integration and feedback hub are also important pieces of the puzzle.

  11. Simon Davis

    I think they all sound useful to various degrees, although as Terry says there are already several places to go for good quality resources and advice. Having done quite a lot of thinking around the processes, I think what we’re most in need of is practical technical solutions that meet our requirements and can be deployed off the shelf or integrated into our existing platforms (standard market-leading VLEs, SRS etc). With that in mind we’re definitely interested in the development of something that looks like a feedback hub (#2).

  12. Chris Turnock

    Whilst I agree with many of the comments stating all are useful, and I did get to post my vote at the January workshop, my votes would be for 1 & 2: although they may entail considerable development work, the benefits to the sector would be massive.

  13. Joe Berry

    I would vote for 1 (process mapping) and 2 (feedback hub) as priorities. Number 1 would be great in terms of smoothing the introduction of EMA, and 2 is where we start to leverage some of the deeper benefits of online marking and feedback. I see these as two rungs on the same ladder. Online marking as it stands has many benefits for students and administrators, but not too many for an average academic if we are honest. Taking the next step to a ‘student journey’ view adds a tangible pedagogic benefit that would be hard to achieve without the technology.

    Options 3 and 4 are more IT systems issues which, whilst vital, should really be business as usual for institutional IT and suppliers’ IT respectively. I appreciate they may need some guidance/driving from an external organisation. Number 5, the toolkit, would be nice to have but may also overlap with the other areas (especially 1), so may be fed into naturally by them.

  14. Mira Vogel

    My top vote is for 1 (process mapping) – for the simple reason that the current mismatch between the processes we have and the technologies in place is both extremely costly and a source of workplace stress for all involved. I think that making the diversity of workflows manifest is a prerequisite for designing technologies that support them, and doing it as a sector rather than unilaterally is the surest route to getting them reflected in the technologies. It may also have the side effect of providing reasons for departments to change practices.

    My second vote is for number 2 (feedback hub) – but thanks to my colleagues including Tim Neumann and Jess Gramp, there’s a fair bit of work done already.

