The Electronic Management of Assessment (EMA) project is a Jisc project working with partners (the Heads of e-Learning Forum (HeLF) and the Universities and Colleges Information Systems Association (UCISA)) to support institutions with the electronic management of assessment. The term EMA is increasingly being used to describe the way that technology can be used to support the management of the assessment and feedback lifecycle, including the electronic submission of assignments, e-marking and e-feedback. This project seeks to maximise the benefits technology can offer.
Digital Assessment: current and future developments across Europe
The webinar took place on 25th April 2017. It was organised by EUNIS in collaboration with Jisc.
Topics covered included national initiatives in Norway and the Netherlands and the experiences of the University of Bergen (using Canvas for formative assessment and Inspera for digital exams).
The webinar recording can be found here:
https://ca-sas.bbcollab.com/mr.jnlp?suid=M.5F189A13EE7B62C8F18F972018C873&sid=2009077
Follow the link to download the recording. Once it is downloaded you may need to select Playback > Player > Play from the menu bar.
Use the slider underneath the speaker’s image to adjust the volume.
The webinar was recorded using Blackboard Collaborate, which requires Java to run, so it cannot be played on a Chromebook. A PDF copy of the slides is attached for those who cannot access the recording: Dig Ass Webinar April 2017 All V1
Below you can find links to further information provided by the speakers.
Freddy (Norway national initiatives)
Links to Digital Assessment documents:
These documents are all the result of collaboration within the Norwegian higher education sector during the national project.
They have been translated into English with the support of GEANT https://services.geant.net/sites/cbp/Pages/Home.aspx
CBP 44: Digital assessment process & IT architecture: https://services.geant.net/sites/cbp/Knowledge_Base/Campus_Networking/Documents/CBP-44_ICT-architecture-for-digital-assessment_published.pdf
CBP 42: Physical infrastructure for digital assessment: https://services.geant.net/sites/cbp/Knowledge_Base/Campus_Networking/Documents/cbp-42_physical-infrastructure-for-digital-assessment.pdf
CBP 43: Clients for digital assessment: https://services.geant.net/sites/cbp/Knowledge_Base/Campus_Networking/Documents/cbp-44_clients-for-digital-assessment.pdf
CBP 45: Logging and monitoring of digital assessment https://services.geant.net/sites/cbp/Knowledge_Base/Campus_Networking/Documents/CBP-45_Logging-and-monitoring-for-digital-assessment_published.pdf
Forthcoming:
CBP xx: Integration for digital assessment
CBP xx: Legal issues regarding use of cloud services
Here's some information provided by Ingrid Melje at UNINETT about the selection of vendors:
Newsletter – Digital assessment leap: Norwegian research and higher education sector concludes on three ICT systems https://www.uninett.no/en/digtal-assessment-three-vendors-chosen
Requirements specification: https://www.uninett.no/sites/default/files/portal_docs/requirement_specification_digital_assessement.pdf
Vendor questions:
Robert (University of Bergen)
A working group (consisting of faculty, administrative staff and students) at the University of Bergen (UiB) Faculty of Science and Mathematics has written a report titled “Digital education and assessment in Mathematics and Natural Science at UiB” based on their experiences using Canvas (Mitt UiB), Inspera Assessment and more. Available here: https://mitt.uib.no/courses/2816/files/347048?module_item_id=27900
(The report was originally written in Norwegian and outsourced for translation, so please excuse any awkward translations.)
Systems that integrate with Canvas https://community.canvaslms.com/community/answers/partnerships#Alliance
Annette (Netherlands national initiatives)
Thematic Issue Innovations in digital assessment https://www.surf.nl/en/knowledge-base/2016/thematic-issue-innovations-in-digital-assessment.html
Secure assessment workbook: www.surf.nl/secure-assessment-workbook
Whitepaper Online proctoring: https://www.surf.nl/en/knowledge-base/2016/white-paper-online-proctoring.html
Guidelines for digital assessment policy – https://www.surf.nl/en/knowledge-base/2016/guidelines-for-digital-assessment-policy.html
Further information:
Andrew Fluck: Participants may be interested in the eExam symposium which will be held as part of WCCE-2017 in Dublin, 3-6 July 2017. More info at WCCE2017.com
Lisa Gray, Jisc, UK: If anyone is interested in our previous work exploring the functionality of a variety of assessment systems (including VLEs/LMSs), we have a summary of supplier responses against key requirements (updated August 2016), which can be found on the EMA blog at: https://ema.jiscinvolve.org/wp/2016/02/24/supplier-responses-to-uk-he-ema-system-requirements/
The journey of an assessment
An assessment passes through many hands and even more systems on its way from setting to recording the final mark. Partly because it is so central to teaching and learning, it’s a process that seems unusually complex and involves many touch points in its electronic form. We’ve sequenced the whole trajectory of a typical assessment to see what system integration lessons can be learned.
A well-known rule of thumb in process or architecture modelling is to limit diagrams to seven blobs, plus or minus three. The Jisc Electronic Management of Assessment (EMA) ten step model sticks to that adage to great effect. The UML sequence of an assessment’s journey discussed here does not – not even when chopped into four phases.
The sequence diagrams are, therefore, meant more as a means of surfacing common bottlenecks and other data flow issues than as a means of elucidating the EMA landscape. For that purpose they work well, but some limitations need to be borne in mind.
One limitation is that the sequence modelled is fairly typical, but cannot be representative of all EMA processes, as there is too much variation in practice. For example, the modelled sequence does not include peer review, which would make the journey more complex still. It also doesn’t include anonymous marking, though that wouldn’t necessarily affect the flow much. The assessment is single part, and the assumption has been that people mark online. The sequence focusses on system interactions, so actions like logging in have not been included.
Crucially, the sequence assumes a fairly typical use of a combination of a VLE such as Moodle and an assessment service such as Turnitin, rather than the use of either one in isolation.
The resulting sequence has been broken into four stages:
- setting and editing the assessment
- doing the assignment iteratively, with feedback
- double marking
- marks and feedback release, exam board moderation
Looking across them all, a couple of issues become clear. One is to do with the synchronisation of information across systems; the other is rooted in the still imperfect combination of desktop and online systems.
The VLE as the centre of the teaching and learning universe
The assessment journey outlined here is based on the current practice of VLE and assessment service integration via VLE plugins. What’s noticeable about that practice is that all interactions start at the VLE, even if the main business is at the assessment service. This is evident in many little two-step hops:
In order to select an assignment, the user first needs to call up a page in the VLE about the assessment ‘y’, the VLE then grabs the relevant list ‘y’ from the assessment service, and that gets shown in the user’s browser. The reason why this hop is needed becomes clear at release points:
An edit to the assessment configuration such as changing the release date needs to be synchronised between the VLE and the assessment service. One way of doing that is to make sure any interaction is initiated from the VLE, which the assessment service always follows.
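As a rough illustration of this master-slave pattern (not any particular VLE or assessment service API; all names and endpoints here are hypothetical), the sketch below shows the VLE pushing a deadline change to the assessment service and only committing it locally once the follower has acknowledged it.

```python
import requests  # assuming a standard HTTP client is available

ASSESSMENT_SERVICE = "https://assessment.example.ac.uk/api"  # hypothetical endpoint

def change_deadline(vle_store, assignment_id, new_deadline):
    """VLE-initiated update: the VLE acts as master and the assessment service follows.

    The change is pushed to the assessment service first; the VLE only records it
    in its own store once the follower acknowledges, so the two copies of the
    assessment configuration cannot silently diverge."""
    response = requests.put(
        f"{ASSESSMENT_SERVICE}/assignments/{assignment_id}",
        json={"deadline": new_deadline.isoformat()},
        timeout=10,
    )
    response.raise_for_status()  # if the follower rejects the change, keep the old deadline
    vle_store.update_assignment(assignment_id, deadline=new_deadline)  # hypothetical VLE persistence call
```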
The consequence of such a master-slave relation is that it entrenches the status of the VLE as the centre of the teaching and learning universe, which may or may not be the aim of a particular organisation.
It is possible to change all that with a different protocol where changes can be initiated from either end, like so:
The disadvantage of that solution is that it makes considerably bigger demands on the integration part of both systems. Each has to be ready to receive a change from the other at all times, and able to act upon it. There can’t be dropped messages, and duplicated messages can be a problem too. In case of doubt, there has to be a way of determining which system is right. If they can’t do this, there is a risk of assessment details such as deadlines getting out of sync.
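To give a feel for the extra machinery a two-way protocol implies, here is a minimal sketch (hypothetical names throughout, not drawn from any real product) of a receiving end that de-duplicates messages by ID and resolves clashing updates with a simple last-writer-wins rule plus an agreed tie-breaker, so both systems can decide which one is "right" when changes cross in flight.

```python
from datetime import datetime

class AssignmentSyncEndpoint:
    """Receives change messages from the other system (VLE or assessment service).

    Messages carry a unique id, a timestamp and the originating system, so the
    receiver can ignore duplicates and resolve conflicts deterministically."""

    # agreed tie-breaker when two changes carry exactly the same timestamp
    PRECEDENCE = {"vle": 1, "assessment_service": 2}

    def __init__(self, store):
        self.store = store             # hypothetical local persistence layer
        self.seen_message_ids = set()  # would be persisted in a real deployment

    def receive(self, msg):
        if msg["id"] in self.seen_message_ids:
            return "duplicate-ignored"  # idempotency: a replayed message is harmless
        self.seen_message_ids.add(msg["id"])

        current = self.store.get(msg["assignment_id"])  # {"deadline": ..., "updated_at": ..., "source": ...} or None
        incoming_ts = datetime.fromisoformat(msg["updated_at"])

        newer = current is None or incoming_ts > current["updated_at"] or (
            incoming_ts == current["updated_at"]
            and self.PRECEDENCE[msg["source"]] < self.PRECEDENCE[current["source"]]
        )
        if newer:
            self.store.put(msg["assignment_id"], {
                "deadline": msg["deadline"],
                "updated_at": incoming_ts,
                "source": msg["source"],
            })
            return "applied"
        return "stale-ignored"  # the local copy already reflects a newer change
```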
Still, a desire to re-align wider institutional learning environments may push us in the direction of system integration protocols that are more two-way.
Desktop and online integration and the submission bottleneck
Another aspect that the sequence diagrams make clear is the complexity introduced by moving data from the internet to the desktop and back again. Some of that is clear in the final ‘release feedback’ view, particularly where marks and assignments need to be shuffled to and from external examiners via email.
But the desktop obstacle is particularly noticeable in the ‘doing assessment’ view where a good deal of the interactions are all about navigating from the VLE to the assessment service interface, then picking the assignment, and uploading it, before getting a receipt. And that’s before considering the fact that assignment(x) relates to multiple separate files.
What’s worse, the assignment submission dance is very time critical because of assessment deadlines, which, worse still, all tend to fall at around the same time across the whole country. The resulting load on the assessment service can easily lead to significant performance issues at the worst possible time.
This suggests that eliminating the desktop part of the sequence, and keeping the whole operation online, could streamline the process. Current VLE and assessment service integrations with online authoring environments such as MS Office 365 or Google Docs do not really achieve that, however. The reason is that they still treat integrations as a series of user-driven events in which the desktop file system is simply substituted with an online one. No real use is made of the fact that online authoring environments are persistently available to the VLE and the assessment service.
If they did make use of that persistence, the VLE and the assessment service could access the assignment from initial conception to exam board moderation, and maybe even beyond, for any purpose at any time without relying on a user. Formative feedback could be given at any stage, and originality checked whenever that function is ready, thus avoiding the submission bottleneck. The persistence of online document storage could also mean that accessing feedback on assignments from different modules and different years (i.e. holistic feedback) becomes much easier. Because neither students nor teachers need to drive the assessment flow by stepping through the submission process repeatedly, the load on them is lightened, and the scope for error reduced.
Systems aren’t quite ready for such an architecture, but it’s easy enough to set up such a process in your favourite online authoring environment providing you’re willing to forego the assessment service functionality.
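For those who want to experiment, here is a minimal sketch of what such a persistence-based flow might look like: a background job pulls the current draft of each assignment straight from the online authoring environment and pushes it to the assessment service for formative checks, with no student-driven submission step. Every client object and method name here is hypothetical; this is not a real VLE, Turnitin, Office or Google Docs API.

```python
import time

def run_continuous_checks(authoring_env, assessment_service, assignment_ids, interval_hours=24):
    """Poll the online authoring environment for the latest version of each draft
    and pass it to the assessment service for formative feedback and originality
    checking, independently of any submission deadline.

    `authoring_env` and `assessment_service` are hypothetical client objects; the
    point is simply that documents are persistently reachable by both systems."""
    while True:
        for assignment_id in assignment_ids:
            draft = authoring_env.fetch_latest(assignment_id)  # current state of the student's document
            if draft is None:
                continue  # nothing written yet
            report = assessment_service.check_originality(draft.text)
            assessment_service.attach_formative_feedback(assignment_id, report)
        time.sleep(interval_hours * 3600)  # spread the load rather than creating a deadline-day spike
```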
Next steps
Because the assessment process modelled here is generic, particular institutions might want to adapt the general flow to match their own, in order to spot bottlenecks and unneeded complexity that are particular to their processes. For that reason, the UML sequence diagrams are made available in editable form. Fixes and refinements would be very welcome too!
From Jisc’s perspective, one interesting possibility is to look at the steps in the sequence to identify those interactions that we might want to capture in learning analytics data streams. Given how critical assessment is to student success, some of the EMA process steps ought to be valuable raw data for predicting the future performance of a student. An obvious example is how close a student submits an assignment to a deadline, but we can also look at evidence of interaction of a student with feedback on assignments.
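As a minimal sketch of the kind of signal that could be derived from such a data stream, assuming submission events and deadlines are already captured with timestamps (the field names below are hypothetical), one could compute how far ahead of the deadline each submission arrived:

```python
from datetime import datetime

def submission_lead_times(events, deadlines):
    """Compute, per student and assignment, how far ahead of the deadline each
    submission arrived, as a candidate learning-analytics feature.

    `events` is an iterable of dicts like
        {"student": "s123", "assignment": "a1", "submitted_at": "2017-04-20T09:30:00"}
    and `deadlines` maps assignment id to an ISO 8601 deadline string.
    Negative values indicate late submissions."""
    lead_times = {}
    for event in events:
        deadline = datetime.fromisoformat(deadlines[event["assignment"]])
        submitted = datetime.fromisoformat(event["submitted_at"])
        hours_early = (deadline - submitted).total_seconds() / 3600
        lead_times[(event["student"], event["assignment"])] = hours_early
    return lead_times
```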
Finally, it is striking how an examination of relatively low level system integration protocols can point to much larger questions that could do with further exploration: whether the VLE really should continue to be cemented as the central coordination point in teaching and learning, and whether it is possible and desirable to set up an online assessment process that is centred on incremental authoring rather than file submissions and deadlines.
Downloadable files
The whole process in four stages:
As a PDF
An OmniGraffle document
A Visio document
Online exams webinar recording now available
The recording of our webinar ‘Online exams: migration or transformation’ run jointly between Jisc and EUNIS http://www.eunis.org/ is now available.
Listen to the recording on our YouTube site https://www.youtube.com/watch?v=MjMNanbMpMQ to hear the views of:
- Stuart Allan, who recently completed an MSc on the subject at the University of Edinburgh
- David Parmentier, Sogn og Fjordane University College, Norway
- Annette Peet, SURF, Netherlands
- Martyn Roads, consultant specialising in assessment in the UK FE and skills sector
You can find out more about the background to the webinar in this blog post https://ema.jiscinvolve.org/wp/2016/08/01/online-exams-migration-or-transformation/
The session was held using Blackboard Collaborate and the recording is made available on YouTube to maximise its accessibility. This means you don’t have access to the chat that was taking place, so here I have jotted down a few comments and observations from the chat, as well as links to some of the key resources suggested by the presenters.
If you would like to access the full version recorded in Blackboard Collaborate you can find it here: https://ca-sas.bbcollab.com/site/external/recording/playback/link/meeting.jnlp?suid=M.12D0DC0F57D3390C63CFE5388C03F8&sid=2009077
We were using the term ‘online’ in a broad sense: really this was all about summative testing in digital format, not necessarily implying that there needs to be full Internet access at all times.
‘Pedagogic’ reasons for moving to digital exams include enhancing the curriculum and better preparing students for employment.
People are often worried about the cost of migrating to digital without having any real idea of costs/benchmarks for existing paper processes.
Interesting point about how students think when writing vs typing. Mogey and Fluck (2015) found that in computer-based exams students are sometimes more concerned with maximising the number of words they can include in their response than with the construction of academic arguments. Mogey N. and Fluck A. (2015) ‘Factors influencing student preference when comparing handwriting and typing for essay style examinations’, British Journal of Educational Technology, 46 (4), pp. 793–802
In relation to collaboration on item banks – technical incompatibility between systems seems to be an issue. Manchester University’s UKCDR study looked at this – medical education boards share successfully across institutions. If the on-screen exam/ item building system is QTI compliant, that helps with migration of items and exam templates. https://www.imsglobal.org/question/index.html There was also a comment that the education sector needs to look beyond QTI and more at API based content integration.
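As a small illustration of why QTI compliance helps with item portability, the sketch below parses a simplified, made-up fragment in the style of QTI 2.1 using Python's standard library; because QTI is a published XML specification, any compliant system can recover the same structure. The fragment is illustrative only, not a verbatim export from any real system.

```python
import xml.etree.ElementTree as ET

# Simplified fragment loosely following QTI 2.1 conventions; a real exported item
# would carry more metadata, but the structural idea is the same.
QTI_ITEM = """<assessmentItem xmlns="http://www.imsglobal.org/xsd/imsqti_v2p1"
    identifier="q1" title="Capital city">
  <responseDeclaration identifier="RESPONSE" cardinality="single" baseType="identifier">
    <correctResponse><value>B</value></correctResponse>
  </responseDeclaration>
  <itemBody>
    <choiceInteraction responseIdentifier="RESPONSE" maxChoices="1">
      <prompt>What is the capital of Norway?</prompt>
      <simpleChoice identifier="A">Bergen</simpleChoice>
      <simpleChoice identifier="B">Oslo</simpleChoice>
    </choiceInteraction>
  </itemBody>
</assessmentItem>"""

NS = {"qti": "http://www.imsglobal.org/xsd/imsqti_v2p1"}

root = ET.fromstring(QTI_ITEM)
prompt = root.find(".//qti:prompt", NS).text
choices = {c.get("identifier"): c.text for c in root.findall(".//qti:simpleChoice", NS)}
correct = root.find(".//qti:correctResponse/qti:value", NS).text
print(prompt, choices, "correct:", correct)
```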
UK-based awarding bodies and commercial education companies in professional education outside of Ofqual’s regulation have deployed onboard remote invigilation. Service support and cultural acceptability (especially if the remote invigilation provider is non-UK based) are proving the main barriers to adoption.
‘Surely if students are taught in the same method of delivery as online assessment then they would be more in tune.’
‘The digital literacy of staff is something we are also experiencing as a challenge. The students are way ahead of anyone, so not a problem. Assessors seem to be the least prepared for an entirely digital exam.’
English government policy has retrenched to terminal summative assessment. Scottish government keen to use digital evidence in qualifications. http://www.gov.scot/Resource/0050/00505855.pdf
Resources from the presenters
You can find Stuart’s blog at: https://stuartallanblog.wordpress.com/dissertation-abstract/
Here’s a link to Myyry and Joutsenvirta’s publication, as recommended by Stuart: http://alh.sagepub.com/content/early/2015/03/24/1469787415574053.full.pdf
White Paper on online proctoring (in English)
Innovation in digital assessment: assessment themed issue (April 2016 in English)
Assessment Security Selection Model (in English)
https://www.surf.nl/en/knowledge-base/2016/assessment-security-selection-model.html
Transforming assessment and feedback with technology guide: http://ji.sc/transforming-assessment-feedback-guide
EMA processes and system guide: http://ji.sc/ema-processes-systems-guide
Supplier responses to system requirements: http://ji.sc/supplier-responses-ema
EMA readiness tool: http://ji.sc/emaready
2012 UK Landscape report: bit.ly/jisc-ema
Resources suggested by participants
Surpass Paper+ assessment system as a means of migrating from paper to digital exams https://www.youtube.com/watch?v=ENAmkPWVN6k&feature=em-subs_digest
Case study on Saxion University, Netherlands (which used Surpass) and its implementation of e-assessment to deliver over 50k exams last year http://www.btl.com/case-studies/
GeoGebra for digitising mathematical equations, drawings etc https://www.geogebra.org/
Unidoodle for allowing students to submit sketch style answers http://www.unidoodle.com/
Jisc guide to digitising learning and teaching materials sustainably https://www.jisc.ac.uk/guides/digitising-your-collections-sustainably
Resources from the e-assessment in mathematical sciences conference at Newcastle University, September 2016 http://eams.ncl.ac.uk/
KioWare used in the financial services education field for browser lockdown http://www.kioware.com/kwb.aspx
Systems produced by Inspera and Uniwise were mentioned and further information about these can be found in the Jisc overview of systems supporting electronic management of assessment https://ema.jiscinvolve.org/wp/2016/02/24/supplier-responses-to-uk-he-ema-system-requirements/
Case studies on the e-assessment association website http://www.e-assessment.com/
Updated product comparisons
It’s a couple of years now since we had a flurry of activity around people sharing and discussing product comparisons (search on product features/comparisons if you want to look at the archives).
Since then we have published a set of supplier responses to the sector-wide specification (spring 2016) and this was updated in September 2016. As many Jisc and other resources link to the original blog post on this topic we have decided it is logical to keep adding updates at the bottom of that page. To see the original and updated supplier responses go to https://ema.jiscinvolve.org/wp/2016/02/24/supplier-responses-to-uk-he-ema-system-requirements/
As the products are continuing to evolve, your views on how well various combinations meet your needs are still of interest to others so please keep sharing. For those using Moodle and Turnitin the comparison on the UCL wiki is regularly updated https://wiki.ucl.ac.uk/display/MoodleResourceCentre/Comparing+Moodle+Assignment+and+Turnitin
Online exams: sign up for our free webinar
Following our guest blog post where Stuart Allan talked about his research into online exams we are running a webinar on the topic in collaboration with EUNIS (European University Information Systems organisation).
The webinar will take place on Wednesday 21st September at 12.00 UK time. Book your place here: https://www.jisc.ac.uk/events/online-exams-migration-or-transformation-21-sep-2016
Evaluating your readiness for electronic management of assessment: using our self assessment tool
Following pilot testing by 12 institutions we are offering a wider range of people the opportunity to try out our self-assessment tool. The tool is still in beta version so we know there will be functionality you would like to see added. The tool has so far been tested with universities so we are particularly interested in feedback from the FE and Skills sector as to whether adaptations are necessary to meet your needs.
The tool can be found at: http://ji.sc/emaready. Please read these guidance notes before using it.
You can give feedback by using the comment functionality on this blog or by contacting lisa.gray@jisc.ac.uk
About the tool
The self-assessment is designed to support institution-wide development of electronic management of assessment (EMA). It assumes that any college or university will benefit from a strategic approach to managing assessment and feedback (and hence EMA). Such an approach requires a good overview of practice and a clear set of policies, regardless of the extent to which institutional policy leans towards promoting conformity or supporting diversity of practice.
At the end of the assessment you will be able to estimate your current maturity and develop an action plan. Our accompanying guide, Transforming assessment and feedback with technology, provides ideas and resources to help you enhance the entire assessment and feedback lifecycle.
There are five stages:
- Researching: You are at an early stage of EMA. You do not seem to have a comprehensive view of organisational activity overall; policy, process and systems seem fragmented. Ensure you have senior management support to undertake further investigation. Start by defining the principles that underpin assessment and feedback in your organisation and find the areas of good practice you can build on.
- Exploring: You are probably aware of pockets of good practice but have not really begun to try to scale this up. You will need to be clear about expected benefits in order to effect the cultural change needed.
- Embedding: You are at a tipping point where fairly widespread experimentation is close to becoming mainstream practice. A key issue will be ensuring that business processes are sufficiently consistent to support a more holistic approach.
- Enhancing: You are probably already supporting the core of the assessment and feedback life cycle with technology. You are looking to fill gaps and find more elegant solutions to existing workarounds.
- Pioneering: You are looking to go beyond automation, standardisation and efficiency gains to ensuring that EMA has a truly transformative impact on learning and teaching in your organisation. Your organisation is probably a provider of many of the resources in our accompanying guide but we can still provide some inspiration and support.
Why do the self-assessment?
For those in the early stages of EMA adoption the self-assessment will serve as a development tool and point you to resources that may be useful. For those further down the line you may find that your experience to date lies in particular aspects of EMA and the self-assessment gives a more rounded view of where you might benefit.
For those at a more advanced stage it may serve as reassurance that what you are doing is regarded as good practice but there may still be some areas where there are opportunities to do things differently.
Who should do the self-assessment?
The self-assessment aims to support whole institution development while recognising that different parts of the institution may be at different starting points and working towards different goals. It can therefore be completed from different perspectives:
- A central support team may complete the evaluation from a whole institution perspective
- Individual departments, schools or faculties may respond in the way that reflects their own practice
- Course or programme teams may use the tool to give an even more localised view of practice
Completing the questions
Because the self-assessment is designed to work at different levels we ask you to define what happens in your ‘organisation’ in many of the questions. Organisation in this case refers to the coherent entity on behalf of which you are responding so it may be your programme team, your department or your whole institution. We also ask about consistency across your ‘component areas’ and practices at ‘local level’. At institutional level component areas/ local level will generally be schools or faculties, at department level they may be programmes of study and at programme/ course level they may be individual modules.
In many cases it may not be possible for an individual to answer all the questions. Indeed, we suggest that the self-assessment should be done as a group exercise because the dialogue that ensues is the first stage in the change process.
What will the self-assessment tell you?
The self-assessment tool will give you a report rating you at one of five levels against a range of headings:
- Strategy/policy and quality assurance
- Curriculum data
- Processes and working practices
- Technology
- Culture
- Student experience
A diagrammatic representation will serve as a quick overview of strengths and weaknesses.
An onscreen report will give you a set of suggested actions intended to help you build on your strengths and address your limitations and it will also provide links to resources that might help you carry out the suggested actions. The resources may be in the form of practical guidance, checklists and templates from other institutions or case studies and examples of practice.
To obtain an email copy of the report scroll to the bottom of the screen and enter your email address.
How should you use the self-assessment outcomes?
The tool has been tested with a range of experts from different institutions. They reported that the outcomes matched their own understanding of their strengths and weaknesses and that the suggested actions fitted with their experience of how to make progress in each of these areas.
Every institution is of course unique and your own local knowledge will be needed to put our generic guidance into context.
Suggested actions
Usually you will be offered a range of suggested actions and resources for each development area. You can have confidence that all of these approaches have worked for others but it will be up to you to decide which are best suited to your particular circumstances. As well as using the specific resources suggested you will find additional support in our guide to Transforming assessment and feedback with technology.
Depending on where you sit within the institution not all of the suggested actions will be within your remit to implement. Some actions such as those requiring clearer definition of policy or improvements to business processes may need support from a higher level of authority. Nevertheless the self-assessment outcomes will provide useful evidence of need for you to begin the necessary conversations.
Data validity
There will always be some concerns about the validity of data from a self-reporting exercise such as this. Different people may have a different interpretation of some of the questions and responses and it is only through dialogue that such differences can be explored and enhance the institutional knowledge base. Most of those involved in the collaborative development of this tool found the dialogue instigated by the self-assessment process to be the most valuable aspect of the activity.
We suggest the tool is best suited to supporting continuous improvement rather than any kind of benchmarking.
Whether your approach to developing EMA capability is top-down or bottom-up and whether you are a policy maker or a practitioner, you will probably find that you want to compare results from different parts of your institution. This will help you target staff development, select areas to pilot new approaches and identify good practice that can be built upon.
What should you be aiming for?
Our five level scale reflects the increasing use of EMA bringing further benefits for students and institutions. It should however be viewed with a number of caveats.
Non-linear scale
The scale is not a simple linear one. The first two levels are quite similar in terms of the user experience. You may correspond to the researcher level because your institutional practice is disjointed and people do not have a clear idea what others are doing. However, the overall user experience may not be significantly different to that of institutions at the explorer level.
Progress through levels
Institutions have also reported that the amount of effort needed to move between levels is not equally distributed. The most significant amount of effort is needed to get from the early stages to the embedding and enhancing levels. Once there, further progress is proportionately easier to achieve.
Progress through the levels is associated with evidence of greater benefits but that is not to say that every institution will necessarily be aiming to reach the highest level. In some cases institutions may provide an excellent student assessment experience in spite of constraints on how much they can invest in supporting information systems.
A co-design initiative
The self-assessment tool was developed using our co-design approach and we are particularly grateful for participation from the following institutions:
- Anglia Ruskin University
- Aston University
- Birmingham City University
- Manchester Metropolitan University
- Plymouth University
- University of Bradford
- University of Edinburgh
- University of Hull
- University of Nottingham
- University of Sheffield
- University of Southampton
- University of York
Online exams: migration or transformation?
Our latest blog post is a guest contribution by Stuart Allan who has just completed an MSc in Digital Education at the University of Edinburgh. Stuart has been undertaking research into online exams. This is a short reflection on his research and Stuart will be joining us for a webinar on the topic in September so watch this space for further details.
I’ve been interested in exams and how they might relate to learning ever since my undergraduate days, when my degree was decided by nine finals in the space of ten days. (I still have nightmares about them…)
So when I was thinking of a topic for my (MSc in digital education) dissertation research, I wondered how useful digital technologies might be in final, high-stakes exams. As I read more, I discovered that the published literature on the specifically educational (as opposed to administrative or technical) implications of online exams was actually very small. (Myyry and Joutsenvirta (2015) is an interesting place to start.)
I found that the use of online exams often follows one of two main approaches:
- migration (transposing traditional exams to digital environments in order to achieve organisational gains, e.g. improved efficiency), and
- transformation (using digital technologies as a catalyst to redefine summative assessment and align it with the perceived needs of contemporary students).
My main focus was on how the migration and transformation approaches translated into educational practice in particular contexts. I interviewed eight higher-education staff involved in designing, developing and delivering online exams across four countries. They talked at length about their experiences, beliefs, aspirations and frustrations.
Instead of finding one approach to be better than the other, I concluded that both the migration and transformation approaches had significant shortcomings. The migration view seems to assume that online exam environments are instruments that we can use to achieve pre-ordained aims (such as improved efficiency); however, in my interviews I found examples of technologies interacting with, and having significant implications for, educational practice. The sociomaterial perspective was very useful here (see Bayne 2015 and Hannon 2013).
I also found the transformation view to be problematic in its own ways. For instance I began to question the validity of claims that online exams are a logical response to society’s changing needs, and to suggest that a more detailed understanding of the ways in which online exams might be qualitatively different to traditional exams is required.
Moreover, I discovered a potentially hazardous assumption that traditional exams could be migrated online (or be ‘a little bit digitalised’, to borrow one interviewee’s expression) as a prelude to more ambitious and educationally motivated changes further down the line. This transition appears not to be as straightforward as some might believe, and the migration stage often requires practitioners to overcome challenges that are unexpectedly time-consuming and financially draining.
One of the things I found most interesting was the apparent strength of some university professionals’ conviction that online exams must comply with exactly the same conditions – in terms of invigilation, the types of questions asked and candidates’ access to course materials, notes etc – as traditional pen-and-paper tests. To a large extent these assumptions set the tone for how the participants in my research used online exams.
With this in mind, I produced a number of questions that practitioners working with online exams might wish to consider:
- In your institution, what motivations exist for pursuing online exams, understood particularly in terms of how educational goals are defined at institutional and programme-specific levels?
- What assumptions are being made about what is meant by an ‘online exam’ within your context, and what can be done to support a constructive dialogue around these?
- To what extent does the dialogue between educational practice and the material contexts of particular digital environments result in online exams that are qualitatively different from traditional tests? For example, do online exams actively support, alter or proscribe particular types of student responses?
- In what ways might online exams be used to support increased assessment authenticity, in terms of both the context and content of examination tasks?
Lastly, I’d argue that the term ‘online exam’ itself – and all the assumptions about technology, education and assessment that seem to underpin it – might constrain the potential for developing practice to an unacceptable degree (see Gillespie 2010). Do we need to invent a new term to describe the summative assessment activities of the future? If so, what might that term be?
Recommended reading
Bayne S. (2015) ‘What’s the matter with “technology-enhanced learning”?’, Learning, Media and Technology, 40 (1), pp. 5–20.
Gillespie T. (2010) ‘The politics of “platforms”’, New Media and Society, 12 (3), pp. 347–364.
Hannon J. (2013) ‘Incommensurate practices: sociomaterial entanglements of learning technology implementation’, Journal of Computer Assisted Learning, 29, pp. 168–178.
Myyry L. and Joutsenvirta T. (2015) ‘Open-book, open-web online examinations: developing examination practices to support university students’ learning and self-efficacy’, Active Learning in Higher Education, 16 (2), pp. 119–132.
Link to dissertation abstract: https://stuartallanblog.wordpress.com/dissertation-abstract/
Find me on Twitter: https://twitter.com/GeeWhizzTime
New resources for FE and skills
We’re pleased to announce the launch of a new guide on technology-enhanced assessment and feedback which has a focus on FE and skills. This new guide complements our existing suite of resources on technology in assessment and feedback, providing an up-to-date snapshot of practice across this diverse sector. The guide contains a rich variety of case studies and includes a podcast by Jayne Holt, assistant principal of Walsall College, on integrating college systems to track learners’ progress.
The message from the guide is clear: organisations, teachers, trainers and learners can all benefit from a technology enhanced approach to assessment and feedback. However, our e-assessment survey report (May 2016) reveals that some are yet to appreciate the full benefits. The overall picture, particularly for tracking systems and e-portfolios, is mixed, as is the ability of organisations to integrate their various technologies to maximise potential efficiency gains. For this reason, our latest guide to assessment and feedback focuses specifically on the needs of the FE and skills sector, with our assessment and feedback lifecycle model as its starting point.
You may also like to know that our report on the evolution of FELTAG (June 2016) is now available. This offers further examples of effective technology-enhanced practice from colleagues across the FE and skills sector.
Supplier responses to UK HE EMA system requirements
We are delighted that 16 suppliers have so far responded to the Jisc and UCISA invitation to complete our EMA system requirements template.
The template was designed to clarify the core requirements of UK HE providers. The responses will help universities and colleges better understand the ways in which particular systems can support their needs and see how particular combinations of systems might work together. The template is based on our assessment and feedback lifecycle and is intended to be used in conjunction with our guide to EMA processes and systems.
Download the attached Excel spreadsheet (see below) to view a summary table and all of the individual responses.
Please note the following caveats:
- The data is based on self-reporting by the suppliers. Neither Jisc nor UCISA has tested the products and inclusion in this listing should not be taken as an endorsement of particular products
- The individual responses are recorded under supplier name not product name. If you are having trouble finding a product check you have the correct supplier. For example: Banner = Ellucian; Canvas = Instructure; Moodle Coursework = University of London; URKUND = Prioinfocenter
- The list of requirements relates solely to EMA and may not represent the full functionality of the systems included here eg student record systems cover many functions other than assessment
- This listing includes products intended to cover most of the EMA lifecycle as well as some more niche products. It is intended as a means of identifying which combination of products could meet your needs. It is not a like-for-like comparison of similar systems.
We will continue to update this information from time to time to cover additional products or new releases of the featured products, and will list any updates along with the original. Suppliers wishing to submit new or updated information should contact Lisa Gray lisa.gray@jisc.ac.uk
You are welcome to use the comment facility on this blog to discuss and share information about the effective use of all of these products and particularly their interoperability to help others.
Original responses
February 2016: EMA System Requirements Supplier Responses 2016 02 V1
Updates:
August 2016: EMA-System-Requirements-Supplier-Responses-2016-09-V2
Getting started with small scale EMA
Not all universities or colleges are currently implementing EMA organisation-wide. Some of you have told us you are looking for guidance on how to get started with EMA on a smaller scale.
In this podcast Bryony Olney from the University of Sheffield talks about how she went about organising an EMA pilot in her department.
You can also download a transcript of the case study.