UK HE system requirements: invitation to suppliers to respond

Jisc and UCISA have been working with universities to develop a statement of requirements to help suppliers understand the needs of UK higher education in relation to the electronic management of assessment (EMA).

We have created a template for suppliers to respond to those requirements to help universities better understand the ways in which particular systems can support their needs and to see how particular combinations of systems might work together. The template is based on our assessment and feedback lifecycle.

We are now issuing an open invitation to suppliers who support the electronic management of assessment to complete this template by Friday 5th February. The downloadable template is attached to this blog post.

The template is being shared with all of the major suppliers of student record systems and learning platforms in the UK as well as suppliers of the most commonly used assessment-related products. We welcome participation by any other interested suppliers. Please email your responses to gill@aspire-edu.org. We hope the template is self-explanatory but feel free to contact us with any queries.

Responses will be published on this blog to help universities that are making system decisions. The full responses will be available to download and we will also compile an overview table that draws together all supplier responses in the ‘included’ column.

Suppliers will be encouraged to update their initial responses as new product versions are released.

Download the template here:

EMA System Requirements Template published 08 Jan 2016

EMA in HE: system specification for comment

This is a draft version of content for a new Jisc guide on EMA processes and systems that will complement the existing guide. Here on the blog we have split the two to help readability. This post should be viewed in conjunction with the post on processes. The drafts will be open for feedback until 5th Jan 2016 to help us improve the guide, so please do share your thoughts using the ‘comment’ function below.

Selecting EMA systems

Because the assessment and feedback lifecycle covers so many different functions, most institutions need a range of systems to support all of their activities. The key areas covered by information systems are generally:

  • course and module information including assessment details
  • student records including marks, feedback and final grades
  • submission of assignments
  • marking and feedback
  • academic integrity checking
  • online testing and examinations

Integrating systems

Ideally these systems should be able to exchange data readily so that institutions can mix and match technologies based on needs, preferences and making best use of the systems they already have. Currently, however, interoperability between systems remains a key problem area. The expectation is that modern IT systems should have good APIs (application programming interfaces) ie sets of routines, protocols and tools that describe each component of the system (data or function) and serve as building blocks for a plug-and-play architecture. In practice, though, the emphasis is still very much on moving data between systems through point-to-point interfaces. This is complex to achieve and brings a maintenance overhead: whenever a particular system changes, a series of interfaces must be rewritten to update the links to all of the other systems.
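The maintenance overhead of point-to-point integration can be illustrated with a quick back-of-the-envelope calculation: with a common API each system needs only one adapter, whereas linking every pair of systems directly requires an interface per pair. The sketch below (our own illustration, not part of the specification) makes the contrast concrete.

```python
# Illustrative only: why point-to-point integration scales badly compared
# with a shared API. With n systems, every pair needs its own interface;
# with a common API, each system needs just one adapter.

def point_to_point_interfaces(n: int) -> int:
    """Number of pairwise interfaces needed to link n systems directly."""
    return n * (n - 1) // 2

def shared_api_adapters(n: int) -> int:
    """Number of adapters needed if every system talks to one common API."""
    return n

for n in (3, 5, 8):
    print(f"{n} systems: {point_to_point_interfaces(n)} point-to-point "
          f"interfaces vs {shared_api_adapters(n)} adapters")
```

With eight systems, that is 28 interfaces to maintain against 8 adapters, and changing any one system means revisiting every interface it participates in.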

The systems are not the only problem. System integration often throws up a host of issues around institutional business processes, workflows, data definitions and data quality. This is why we have tackled the two topics in tandem. You need to ensure your data and processes are not an obstacle to making best use of your existing systems or to effective implementation of new and better systems.

System requirements

Through collaborating with a working group of c.30 universities and the membership of UCISA we have identified the core requirements that UK higher education institutions have for information systems to support assessment and feedback practice.

The requirements are presented in a downloadable format that maps to the assessment and feedback lifecycle and which has supporting user stories to illustrate why the functionality is necessary. They are also viewable as embedded pop-ups as part of our EMA process maps.

Because we have concentrated on what is fundamentally important to all HEIs, all of the requirements should be considered as being ‘Must have’ priority.

Download requirements list as an Excel template

EMA System Requirements Template for supplier responses i3

See the requirements embedded in our process maps

Guidance for suppliers on using the requirements specification

The specification has been publicised via Jisc and UCISA channels and suppliers of products of relevance to the EMA lifecycle are invited to use our template to highlight which of the requirements are supported by their product. Supplier responses are published on our EMA blog and customers of those suppliers are invited to use the blog for comment and discussion. The idea is that by sharing knowledge about effective use of a particular product, or about integration between a particular set of products, we can help institutions to get the most out of their existing investments.

If you are a supplier, we suggest you continue to:

  • consider the specification when preparing your product roadmaps
  • update your response as new versions of your product are launched
  • respond to customer discussion on the blog so that the wider community can develop a better understanding of your product.

Guidance for universities on using the requirements specification

The requirements specification can be used as a basis for developing an ITT (invitation to tender) to select a new system for your institution. This will not only save you work; you can also have confidence that the major system suppliers will be familiar with the requirements expressed in this way, so you have a better chance of getting accurate and meaningful responses.

Using this list as a starting point you can select the parts that are relevant to your particular procurement exercise and add features that are desirable for your institution as well as further detail about your existing product set that will need to interoperate with the new system.

For more guidance on how to go about choosing new technologies to meet your needs see our guide to selecting technologies. This will take you through managing a selection project, defining your requirements and conducting supplier evaluation.

EMA in HE: processes draft for comment

This is a draft version of content for a new Jisc guide on EMA processes and systems that will complement the existing guide. Here on the blog we have split the two to help readability. The drafts will be open for feedback to help us improve the guide until 5th Jan 2016, so please do share your thoughts using the ‘comment’ function below.

EMA in higher education: processes and systems 

This guide forms part of our suite of Assessment resources. It is intended to help those of you in HE review and improve the business processes that support your assessment and feedback practice and make the right choices as regards supporting information systems. We have created this guide for two main reasons:

  • Our research has shown that many institutions struggle to get the most out of their information systems due to the variability and complexity of their business processes
  • Both suppliers and institutions tell us there is a need to see core UK-wide system requirements clearly expressed

What’s in this guide?

This guide contains resources to help people who manage assessment processes from both the academic and the administrative point of view, people who manage the supporting EMA systems and suppliers of EMA systems (ie systems supporting online submission, marking and feedback of assignments). It includes:

  • A simple overview of the submission, feedback and marking process concentrating on the core tasks that all institutions need to carry out
  • A set of requirements that specify what EMA systems need to be able to do to meet these core UK requirements
  • A set of prompts to help you compare your own processes to our model and analyse where improvements could be made

The guide should be used in conjunction with our description of the assessment and feedback lifecycle in the guide to Transforming assessment and feedback with technology.

What are we trying to achieve?

The purpose of this guide is to help you implement processes that meet all necessary quality standards but which are no more complex than they need to be. Principles underlying the approach are:

  • Ensuring tasks are carried out by the right people. We have identified tasks as being: a learner responsibility, an administrative matter or a task requiring academic expertise
  • Automating any routine administrative tasks that can be automated. Digital information immediately opens up possibilities for streamlining the manual workload associated with dealing with vast quantities of paper
  • Ensuring processes are undertaken for sound academic reasons. We hope you will use these resources to challenge complexity that exists for no better reason than “We’ve always done it that way”.

We can help you identify what a process needs to achieve, what skill set is needed and where information systems can help. What we can’t do is design standard workflows that will suit your institution. Individual workflows will depend on how your organisation is structured and what information systems it uses. You may well have different workflows for different assessment situations and this is fine so long as you are sure there is a valid reason why each variant exists.


What are the common problems?

Variation in approach across the institution

Research for our 2014 EMA landscape review[1] showed that responsibility for assessment and feedback policy and procedure is often devolved to local level within institutions. What this means in practice is that large institutions rarely carry out a particular function by means of a single, institution-wide business process. Different faculties, schools, departments or even programmes each have their own ways of doing things. This level of process variation inhibits the efficiencies and benefits possible through the application of EMA technology, because a series of time-consuming and cumbersome workarounds is likely to be needed to adapt the system to many different ways of carrying out the same activity.

Technology ‘bolted on’

Participants in our research frequently commented on the extent to which new technologies are ‘bolted on’ to old processes without people really taking the time to stand back and consider what the process is really intended to achieve.

In some cases poor process design is due to lack of time and appropriate skills. During the Jisc assessment and feedback programme, a concern was voiced that academic staff often find themselves on a ‘treadmill’ due to poorly designed processes. Their workload is such that they cannot pause to think about doing things differently. They also recognise that they do not have the skills to undertake process review and redesign without more specialist support, yet they know that they cannot improve their pedagogy without better designed processes.

Organisational myths

In other cases a significant part of the problem is the persistence of ‘organisational myths’ surrounding policy and process. The tendency to do things the way they have always been done is perpetuated by a belief that this is somehow enshrined in local or institutional policy. When challenged on existing approaches, academics are often surprised to find that many characteristics of the process are matters of historic choice rather than regulatory issues and, indeed, often surprised at how few regulations there actually are or how easy it is to make changes to perceived blocks and barriers in the regulatory frameworks. Variation in the application of assessment policy across an institution is often down to such myths about what actually constitutes the policy in the first place.

Example: Impact on student experience

Different approaches to carrying out the same task impact not only staff workload but also the student experience. As an example, one institution that had the capacity to accept e-submission of all assignments based on the written word noted the following variations:

  • One faculty accepted e-submission for postgraduates only but then printed out the assignments for marking
  • Some course teams were happy to accept and mark submissions electronically but students were still required to submit a paper copy to meet the requirements of the coursework receipting system
  • One department required students to submit a hard copy for marking and also an electronic copy to be submitted through the plagiarism detection system

Simple EMA process overview: submission, feedback and marking in 10 steps

Do we really believe you can ever reduce this complexity down to 10 steps? That’s the challenge for you. We aim to show that there are 10 core tasks in the process and that by making good use of EMA systems you can cut out much of the manual intervention that consumes resource, introduces error and adds little value to the learning experience.

Shown below is our overview of the submission, marking and feedback process. Click on the image for an enlarged view.

Submission, marking and feedback process i2 - New Page

This model covers all types of summative assessment where there is a mark given as well as feedback. It also covers iterative processes where students might undertake formative checking of their own work using text matching tools to review academic integrity, might undertake self or peer review and might be required to show evidence that they have engaged with their feedback before their mark is released.

Why are there 11 task boxes rather than 10? Call it cheating, but we decided that ‘Apply penalty or mitigation’ is a single task that might occur at different times depending, for example, on whether the penalty was for late submission or academic misconduct.

This is a high level overview. There is no right or wrong way to draw a process map – you need to choose the level that is right for you. For example you might choose to break down some of the sub processes further. Find out more in our guide on process mapping.

N.B. The guide should ultimately offer options to download as an interactive PDF and as a Visio file to edit

Submission process detailed view

Below is an example of the submission phase broken down into more detail. This time we have identified the role of the EMA system to show which tasks can be fully automated. If you follow the link to the interactive version and hover over the question marks you will see hints and tips about managing the task.

http://ovod.net/wilbert/sundry/ema/submissionSwimLaneModded.html#

N.B. The guide should ultimately offer options to download as an interactive PDF and as a Visio file to edit

Marking and feedback process detailed view

N.B. Interactive view similar to the above to be added

N.B. The guide should ultimately offer options to download as an interactive PDF and as a Visio file to edit

Marking and feedback process i2

Improving EMA processes

We suggest you use our 10 step model (and the detailed breakdowns) to compare against your own practice and ask yourself the following questions:

  • Are you doing additional tasks – if so, why?
  • Are the tasks being done by the right people eg do you have academic staff undertaking administrative duties that do not require academic judgement?
  • Do you have systems that could carry out some of the tasks you are doing manually?
  • Do you have multiple ways of performing the same task – if so, why?

You can find out more about ways to help improve your processes in our guide to process improvement.

Our guidance on process mapping includes some simple and user friendly techniques that can be used with all types of stakeholders to analyse issues with current approaches and to suggest and evaluate ideas for change. In particular, some of the techniques you might consider are flow charts with swim lanes, rich pictures and RAEW (Responsibility, Authority, Expertise and Work) analysis; details on all of these can be found at the end of the process mapping guide.

You can also find out how some universities have benefited from so-called Lean approaches at the end of our section looking at ‘what is a process?’

Examples from across the sector

Here are some further examples of how different universities have approached the mapping of assessment and feedback processes to help them review practice:

N.B. Examples will be linked from the guide.

  • The University of Oxford, department of continuing education, used a swim lane approach, developed in Excel, with all actors identified. CASCADE workflows
  • The University of Hull has created a more detailed version of our overview with its individual systems identified. Hull Ideal process
  • The University of Sheffield, department of town and regional planning, used before and after process maps to review assignment submission. This is an example where the department identified a continued need to support marking on paper. Sheffield TRP Assignment Submission Process    Sheffield TRP Assignment Submission Process – To Be
  • University X has used our lifecycle to produce a roadmap for an ambitious development programme over a three year period.

 

[1] Ferrell, G. (2014) Electronic Management of Assessment: a landscape review

EMA Self-assessment tool update

Work is progressing well on a diagnostic tool that will help you assess the state of progress with electronic management of assessment in your area of work. At the end of the assessment you will be able to estimate your current state of maturity and develop an action plan.

The tool uses five stages of maturity:

Researching: You are at an early stage of EMA. You do not seem to have a comprehensive view of organisational activity overall; policy, process and systems seem fragmented. Ensure you have senior management support to undertake further investigation. Start by defining the principles that underpin assessment and feedback in your organisation and find the areas of good practice you can build on.
Exploring: You are probably aware of pockets of good practice but have not really begun to try to scale this up. You will need to be clear about expected benefits in order to effect the cultural change needed.
Embedding: You are at a tipping point where fairly widespread experimentation is close to becoming mainstream practice. A key issue will be ensuring that business processes are sufficiently consistent to support a more holistic approach.
Enhancing: You are probably already supporting the core of the assessment and feedback life cycle with technology and looking to fill gaps and find more elegant solutions to existing workarounds.
Pioneering: You are looking to go beyond automation, standardisation and efficiency gains to ensuring that EMA has a truly transformative impact on learning and teaching in your organisation. Your organisation is probably a provider of many of the resources in our toolkit but we hope we can still provide some inspiration and support.

The tool has been tested with a pilot group of institutions. As a result of their feedback we are making improvements so that the tool better meets the needs of those working at a more localised level than whole institutional change.

We hope to be launching the final version early in the New Year so keep an eye out for more updates.

The self-assessment tool is being developed using the Jisc co-design approach and we are particularly grateful for participation from the following institutions:

Anglia Ruskin University
Aston University
Birmingham City University
Manchester Metropolitan University
Plymouth University
University of Bradford
University of Edinburgh
University of Hull
University of Nottingham
University of Sheffield
University of Southampton
University of York

Outcomes of the feedback hub solution development

Moving the management of assessment online has opened up new possibilities and expectations around the role of assessment in learning. One of these is the use of holistic feedback: feedback that comes from across the years of a degree programme, and not just the modules a learner is enrolled in at the moment.

History: the solution workshops

You could call a system feature that offers such functionality a ‘feedback hub’, and when Jisc worked with the sector to identify unmet needs in the EMA area, it was clear that this was one of them. With a feedback hub, it becomes possible for students and their tutors and other educators to identify strengths and weaknesses in their learning and focus feedback on them over more than one assessment cycle.

The features of feedback hubs

In order to provide holistic feedback, feedback hubs ought to have a particular set of features. Over the course of two workshops, colleagues from a range of providers defined a set of these. Following a further round of whittling and prioritising, the following ten features emerged, ranging from crucial to nice-to-have:

  • aggregate feedback from all modules/years and systems
  • aggregate grades as well as feedback
  • provide different views for staff and students
  • facilitate dialogue about the feedback
  • facilitate dialogue about the feedback in stages (after the student has responded to questions, for example)
  • provide different views structured by different programme wide learning outcomes
  • add feedback manually at any time
  • provide access to comprehensive feedback offline
  • allow students to rate feedback
  • integrate with learning analytics solutions

The first defining feature, the need to aggregate feedback from across different systems, is at least as important as the ability to aggregate over time, and probably more difficult. Some institutions manage to do all assessment in one system, but the majority use at least a VLE and an assessment service such as Turnitin, and usually a student record system as well. Some also have an assessment management system such as Co-Tutor in the mix.
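To make the aggregation idea concrete, the sketch below shows one hypothetical way a hub might merge feedback records from several source systems into a single chronological view per student. The record fields and source names are illustrative assumptions on our part, not a published data model.

```python
# Hypothetical sketch: aggregating feedback from several source systems
# (e.g. a VLE, Turnitin, a student record system) into one chronological
# view. Field names and sample data are illustrative only.

from datetime import date

vle_feedback = [
    {"student": "s123", "module": "HIS101", "date": date(2015, 10, 2),
     "comment": "Strong argument; referencing needs work."},
]
turnitin_feedback = [
    {"student": "s123", "module": "HIS102", "date": date(2015, 11, 20),
     "comment": "Good paraphrasing; similarity score low."},
]

def aggregate_feedback(*sources):
    """Merge feedback records from all sources into one list, newest first."""
    merged = [record for source in sources for record in source]
    return sorted(merged, key=lambda r: r["date"], reverse=True)

for record in aggregate_feedback(vle_feedback, turnitin_feedback):
    print(record["module"], record["date"], record["comment"])
```

The merging itself is trivial once the records share a common shape; the hard part, as noted above, is getting each source system to expose its feedback in that shape in the first place.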

Current state of development

Feedback hubs are a relatively new phenomenon and, as explored in an earlier post, there is consequently quite a variety of systems that provide the functionality. Progress has been remarkable over the course of a year, and users of systems such as the MyFeedback Moodle plugin, PebblePad’s Flourish or the Brightspace VLE can start developing holistic feedback practice this academic year.

Others are not far behind, though much still depends on system integration. Fortunately, the IMS Assignment work that was also reported on earlier should help make it much easier to exchange assignments and information about assignments.

The potentially trickier question is how to develop practices in using feedback hubs as part of business-as-usual assessment processes. The issue is a classic Catch-22: vendors would like to know what users really use, but users don’t know what they need until they’ve tried it. Nonetheless, with the first institutions going live this year and the next, we should be able to learn from practical experience soon.

Possible interventions

We explored in detail the potential options for Jisc to add value in this space. These can be summarised as:

  • Addressing the ease of access to feedback. The main technical barrier identified by vendors and institutions alike for the provision of holistic feedback was the difficulty of gathering assignments and feedback from the range of systems where feedback is typically held. To this end, one option would be to work with the recently set up IMS LTI Assignment Taskforce, which is working to develop a common specification that should ease the movement of assignments or information about assignments from one system to another.
  • Partnering with existing vendors in the development of software, or exploring the development of new tools
  • Supporting a forum where specialists from learning providers and vendors could come together to share early priorities and findings to guide software development at first, and good practice later. Such a forum could facilitate the development of software and practice in tandem, and could not only break through the requirements Catch-22 by speeding up the development of common holistic feedback practice, but also provide a means of testing software and software integrations in a variety of contexts
  • Finally, ways of bootstrapping feedback hub functionality by organising and co-sponsoring crowd funded features were also explored.

Outcome

After a full review of the options, it was agreed to focus on option 1 – to address the key issue of providing easier access to feedback from across systems. We will look at the models of marking that have already been developed, and will specify them at a detailed data model and protocol level. These will be available to anyone to take up and use and shared with the IMS LTI Assignment task force. That means that assessment workflows that are common in the UK will be used to inform the design of the protocol that systems such as VLEs and assessment services use to integrate with each other. At the same time, the vendors in the IMS group will get a detailed and relevant view of real-life use cases to guide the design of the new specification.

What this also means is that assessment processes can be examined from the high level of the assessment life cycle, to the 10 step EMA process, to the particular workflows of the 5 models of marking, down to data model and protocol level choreographies of which bits of data get exchanged between systems and in what order. We’ll have to see how many of the marking models can be covered in this way, because each of them sums up quite a lot of variability. The aim of the detailed models will be to inform not just the IMS work but also system developers, as well as those who need to configure, integrate and maintain these systems in institutions.
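As a rough indication of what "data model level" means here, the sketch below shows one hypothetical shape a shared feedback record might take. The field names and structure are our own illustrative assumptions; the actual data models and protocols will emerge from the specification work described above.

```python
# Hypothetical sketch of a minimal, exchangeable feedback record, expressed
# as a Python dataclass. Field names are illustrative assumptions only and
# do not represent the eventual IMS specification.

from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class FeedbackRecord:
    student_id: str        # institutional student identifier
    assignment_id: str     # identifier of the assessed piece of work
    source_system: str     # e.g. "vle" or "assessment-service"
    comment: str           # the feedback text itself
    mark: Optional[float] = None  # a mark may be withheld until released

record = FeedbackRecord(
    student_id="s123",
    assignment_id="HIS101-essay-1",
    source_system="vle",
    comment="Clear structure; develop the conclusion further.",
)
print(asdict(record))  # serialisable form, ready to exchange between systems
```

Even a minimal shared shape like this would let a hub treat feedback from a VLE and an assessment service uniformly, which is the point of specifying the models at data model and protocol level.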

In addition, we will take the feedback hub features identified above and bundle them into a general EMA system requirements template that vendors can use to describe the capabilities of their products (which is already in train as part of the wider EMA project). That way, we can surface how systems are currently meeting these demands for a holistic view of feedback in the context of wider EMA capabilities.

Due to the rapid development of feedback hub software, the development of new tools, or working in partnership with existing vendors on the development of tools weren’t seen as viable options. It was also felt that the development of a community of practice, although potentially useful, wasn’t something that Jisc is currently best placed to take forward.

We are very much looking forward to seeing how the development and use of these tools in universities and colleges emerge and grow over the coming months and years, so please get in touch to share your stories.

List of hubs

As part of the feedback hub research, we’ve had a look at the following systems:

D2L Brightspace
Instructure Canvas
MyFeedback Moodle plugin
University of York BB plugin
The Bedford College Moodle Grade Tracker
TurnItIn
PebblePad Flourish
Co-Tutor
Edinburgh College of Art assessment and feedback system
University of Portsmouth Moodle – TII integration
Anglia Ruskin University Sharepoint – TII integration

We’re particularly grateful to these system providers and the wider community for their contributions to this research.

Online resource now available ‘Transforming assessment and feedback with technology’

As you all know we’ve been working on the development of an online toolkit to bring together examples, guidance and resources to support universities and colleges with their use of technology for assessment and feedback. We’re pleased to say that the first element of that suite of resources is now available: the ‘Transforming assessment and feedback with technology’ online guide, with other elements due for launch in March 2016.

We’re keen to gain your thoughts on this resource, and have created a short Google form to capture these. So please do have a browse, and tell us what you think! This feedback will help to inform the final shape of the resource (the form will be open until the end of December).

The full suite of resources when launched fully in March will also include:

  • an EMA self-assessment tool (a development requested by the working group) leading to resources within the guide (currently being tested);
  • and also the EMA workflows / systems specification resources embedded within the guide (see previous blog post on where we are up to with those).

 

Our 10 step EMA process

This post is intended to support an activity during the Jisc Learning and Teaching Experts Practice Group meeting on 14th October.

One of the aims of our EMA work is to help you implement processes that meet all necessary quality standards but which are no more complex than they need to be. Principles underlying the approach are:

  • Ensuring tasks are carried out by the right people. We have identified tasks as being: a learner responsibility, an administrative matter or a task requiring academic expertise
  • Automating any routine administrative tasks that can be automated. Digital information immediately opens up possibilities for streamlining the manual workload associated with dealing with vast quantities of paper
  • Ensuring processes are undertaken for sound academic reasons. We hope you will use the resources we produce to challenge complexity that exists for no better reason than “We’ve always done it that way”

Our resources are intended to help you identify what a process needs to achieve, what skill set is needed and where information systems can help. What we can’t do is design standard workflows that will suit your institution. Individual workflows will depend on how your organisation is structured and what information systems it uses. You may well have different workflows for different assessment situations and this is fine so long as you are sure there is a valid reason why each variant exists.

Shown below is our 10 step overview of the submission, marking and feedback process. Click on the image to enlarge it.

Submission, marking and feedback process i2 - New Page

This model covers all types of summative assessment where there is a mark given as well as feedback. It also covers iterative processes where students might undertake formative checking of their own work using text matching tools to review academic integrity, might undertake self or peer review and might be required to show evidence that they have engaged with their feedback before their mark is released.

Why are there 11 task boxes rather than 10? Call it cheating, but we decided that ‘Apply penalty or mitigation’ is a single task that might occur at different times depending, for example, on whether the penalty was for late submission or academic misconduct.

This is a high level overview. There is no right or wrong way to draw a process map – you need to choose the level that is right for you. For example you might choose to break down some of the sub processes further.

At the moment we are working on how to present process maps and associated system requirements in a way that helps both institutions and suppliers.

Below is an example of the submission phase broken down into more detail. This time we have identified the role of the EMA system to show which tasks can be fully automated. Hovering over a box will show the system requirements associated with the task and clicking a question mark will highlight common issues for institutions.

We suggest you use this model to compare against your own practice and ask yourself the following questions:

  • Are you doing additional tasks – if so, why?
  • Are the tasks being done by the right people, e.g. do you have academic staff undertaking administrative duties that do not require academic judgement?
  • Do you have systems that could carry out some of the tasks you are doing manually?
  • Do you have multiple ways of performing the same task – if so, why?

We invite the Jisc Learning and Teaching Experts Practice Group to provide feedback on how the models are presented. There are currently 3 options:

1. By clicking on the image to go to an interactive diagram in a separate window.

[Image: assignment submission swim lanes]

2. A link to the image in an interactive diagramming tool.

https://www.lucidchart.com/documents/view/20f69d27-9b02-42ad-8bf7-e49b7bc083f4

3. A link to an interactive PDF. We think this works very well in Adobe but may render less well if you are using a different PDF reader.

[Image: submission swim lanes]

Naming the ‘EMA toolkit’

We’re starting to think about a name for our EMA toolkit, which will be the home for many of the outputs of this project (see previous blog post ‘Solution development is underway‘). So, as a bit of August Bank Holiday fun, if you have any suggestions please do add them into the mix – we’ve created a Google Doc to collate any thoughts at: http://bit.ly/1NDZt12. By Tuesday 1st Sept noon please!

Vendors start work on assignment interoperability in IMS

For the past couple of months, vendors such as Turnitin, Blackboard and Ellucian, as well as a number of UK universities, have been working on a solution for assignment interoperability in the IMS Assignment Task Force.
This is highly relevant for Jisc’s Electronic Management of Assessment (EMA) project because it holds the promise of easing the movement of assignments, or information about assignments, from one system to another. Making assignments available for marking in a VLE when they have been submitted to a plagiarism service, or showing a student feedback and marks from across a degree programme and a variety of systems, have both been flagged by project participants as things that should be easier.
Because the IMS group is in the very early stages of drafting an interoperability specification that could enable such data exchange, the precise scope of the work has not yet been set. However, because many of the building blocks for a solution already exist in IMS, development could happen fairly quickly once the scope has been agreed. Relevant parts include the widely implemented IMS Learning Tools Interoperability (LTI) specification and the closely related, brand-new Content Item Message specification.
Because of the relevance of the IMS work to the EMA project, many of the UK Further and Higher Education sector’s requirements are being fed straight into the work. This is helped considerably by the fact that developers from Exeter, Anglia Ruskin, Edge Hill and Portsmouth universities are directly involved in the IMS work.
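To give a flavour of what LTI-style integration involves under the hood: an LTI 1.x launch is an HTTP POST whose parameters are signed with OAuth 1.0a HMAC-SHA1, so the receiving tool can verify the request came from a trusted platform. The sketch below is illustrative only – the URL, consumer key and secret are invented, and a real integration would follow the IMS specification and use an existing LTI library rather than hand-rolled signing.

```python
import base64
import hashlib
import hmac
import time
import uuid
from urllib.parse import quote, urlsplit

def _enc(s):
    # RFC 3986 percent-encoding, as OAuth 1.0a requires
    return quote(str(s), safe="~")

def sign_lti_launch(url, params, consumer_key, consumer_secret,
                    nonce=None, timestamp=None):
    """Return the launch params with OAuth 1.0a fields and an
    HMAC-SHA1 signature added (a sketch, not production code)."""
    oauth = {
        "oauth_consumer_key": consumer_key,
        "oauth_signature_method": "HMAC-SHA1",
        "oauth_timestamp": str(timestamp or int(time.time())),
        "oauth_nonce": nonce or uuid.uuid4().hex,
        "oauth_version": "1.0",
    }
    all_params = {**params, **oauth}
    # Normalise: percent-encode each pair, sort, join with '&'
    pairs = sorted((_enc(k), _enc(v)) for k, v in all_params.items())
    param_str = "&".join(f"{k}={v}" for k, v in pairs)
    # The base URL excludes any query string and fragment
    parts = urlsplit(url)
    base_url = f"{parts.scheme}://{parts.netloc}{parts.path}"
    base_string = "&".join(["POST", _enc(base_url), _enc(param_str)])
    key = _enc(consumer_secret) + "&"  # no token secret in LTI launches
    digest = hmac.new(key.encode(), base_string.encode(), hashlib.sha1).digest()
    all_params["oauth_signature"] = base64.b64encode(digest).decode()
    return all_params

# Hypothetical launch from a VLE to an assessment tool
launch = sign_lti_launch(
    "https://tool.example.edu/launch",
    {"lti_message_type": "basic-lti-launch-request",
     "lti_version": "LTI-1p0",
     "resource_link_id": "assignment-42"},
    consumer_key="demo-key", consumer_secret="demo-secret")
```

The tool at the other end recomputes the same signature from the shared secret and rejects the launch if it does not match.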

Developing assessment practice with holistic feedback

One of the outcomes of the EMA co-design project is an interest in holistic feedback on assessment. It’s clear to a lot of people in the sector that learners, markers, tutors and others would benefit if assessment took in a complete view of a learner’s journey, and if feedback engaged the learner better by looking back across previous work as well as forward to future work. We’re now exploring how we can bring that ideal a little closer for most in the sector.

Having examined various types of feedback hub, it’s clear that they are a promising but relatively new technology category. For that reason, Jisc are exploring how we can help support the take-up and use of feedback hub technologies, and we have decided to focus on two broad types in the first instance.

One type is the VLE plugin feedback hub; extensions to common VLEs such as Moodle and Blackboard that show feedback from across different modules and years. These have the advantage that they integrate where much of the feedback data often already is, and it’s in a place where a lot of people look first. Some plugin development is already under way, but the challenge is to make them work with a variety of VLE, student record system and assessment service combinations.
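To make the idea concrete, here is a deliberately toy sketch of the kind of aggregation a feedback hub performs: gathering feedback records from across modules and years into a single learner-centred view. The record fields (module, year, score, comment) are our own assumptions for illustration, not any plugin’s actual data model.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class FeedbackItem:
    module: str    # module code, e.g. "HIST101"
    year: int      # year of study
    score: float   # percentage mark
    comment: str   # marker's feedback

def feedback_by_year(items):
    """Group feedback items by year of study, most recent year first,
    so a learner can see their whole journey in one view."""
    grouped = defaultdict(list)
    for item in items:
        grouped[item.year].append(item)
    return {year: sorted(grouped[year], key=lambda i: i.module)
            for year in sorted(grouped, reverse=True)}
```

A real plugin would, of course, pull these records from the VLE gradebook and any linked assessment services rather than from an in-memory list.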

The other type is the assessment service, some of which have holistic feedback features. These services are web based systems that offer a variety of marking and plagiarism checking features. These assessment services have the advantage that they don’t require local installation, and integrate with a variety of web and mobile platforms, not just a VLE. Here too, though, the challenge is to make integration with other assessment data sources easy.

Fortunately, for the long term, work has started within IMS to address the need to move assignments between various systems, and we’re participating in it. In the shorter term, custom integrations could serve that purpose, and standard solutions already exist for moving marks around.

Over the next few weeks, we’ll explore how we can help take the development of these feedback hub systems forward, and how we could start a development phase in which more FE and HE providers can try out the systems and develop practice. We’ll post an update once that exploration is complete.

Stay tuned!