1 Introduction

An eExam (e-exam) is a “timed, supervised, summative assessment conducted using each candidate’s own computer running a standardised operating system” [1]. This definition distinguishes eExams from online assessments, learning platform-based assessment environments and web-based tests.

Several authors have used the ‘eExam’ terminology. Held referred to examinations conducted through a learning management system as eExaminations or eExams in 2011 [2]. In 2012, Brekke studied eExams in calculus at university level in Norway, but focused on the MyMathLab application rather than a dedicated and common operating system [3]. More recently, the Rector’s decision at Jyväskylä University referred to electronic exams as eExams [4]. Perhaps the earliest mention in the sense of the Wikipedia definition was by Hesketh in 2010 [5], when lodging code in Launchpad: “eExam aims to create a restricted Ubuntu environment in which students may perform exams on their own laptops”.

These examples begin to show the global proliferation of eExams and the similar ways in which computers are being used by candidates in high-stakes assessments. This paper reports eExam developments in a number of countries. A desk audit of the literature identified a range of national contexts for study; this was followed by field observations and interviews with eExam management teams to gather more specific data. A thematic analysis of this material is then presented, with the findings synthesised into a table of key features. A discussion of the findings draws out important implications for the future.

2 Literature

eExams have been reported in several countries. Nigerian universities are collectively using various eExam approaches for selecting and assessing undergraduates [6]. John Dermo carried out a broad survey of student eAssessment perceptions at the University of Bradford, England [7]. eExams in Turku, Finland were described by Kuikka et al. [8] in the context of ‘aquariums’ – rooms covered by security video cameras in which students take assessments on computers without in-person supervision. Bussières, Métras and Leclerc reported use of the commercial software ‘ExamSoft’ in pharmacy courses in Canada [9]. Forty-two American states require the use of ExamSoft by those who wish to take the bar (law) exam on a computer [10]. ExamSoft supports choice-based answers and a simplified word processor for longer (non-automatically marked) responses, and is suitable for paper-replacement assessments where candidates can choose between keyboard and pen. A similar, but open source, product is TCExam (tcexam.org), as used at the University of Innsbruck [11].

In Australia, the authors are involved in a nation-wide project called Transforming exams: a scalable examination platform for BYOD invigilated assessment [12]. Increasingly large enrolments in tertiary classes and reduced public funding in most ‘western’ nations mean that educational institutions are no longer able to keep up with the demand for computer provision. This is true for campus-based computer labs, where the ratio of university-supplied lab computers per student is dropping while demand for ICT continues to grow. This has engendered a strategic shift towards bring-your-own-device (BYOD) equipment, in this case laptops, in many higher education institutions in more developed countries. Institutional policies and IT services are increasingly supporting BYOD for students within the pre-tertiary education sector as well. Consequently, it seems most likely that BYO-laptop approaches will be the only viable way forward for large-scale examinations. We therefore focus the majority of our analysis on solutions that adopt a BYOD approach to equipment provision.

eExams are considered to have several important advantages. On a cognitive level, eExams promote effective learning by facilitating the testing of a range of skills, knowledge and understanding [13]. These pedagogical advantages stand in contrast to the administrative advantages reported for eExams, such as providing instant feedback to students and reducing load on staff [ibid.]. Other advantages of eExam systems include ease of use, low cost to operate, and the ability to improve the quality of student feedback [14]. eExams also offer several benefits over paper-based examinations, as these systems allow multimedia elements including video, virtual views, scenarios, software tools and simulations [15]. Supporters see them enabling a broad pedagogical landscape for the assessment of 21st-century capabilities. In this regard, post-paper assessments become possible – assessments that cannot be delivered in the conventional paper-based context because they incorporate multimedia or require creative use of computer software applications. Such post-paper assessments may also influence curriculum, moving teaching towards the ‘redefinition’ end of the SAMR framework [16]. This will provide an impetus for educators to incorporate creative computer use into instruction, increasing the level of student cognition in Bloom’s taxonomy [17].

Dawson provided an interesting alternative view to these reasons for adopting eExams [18]. He considered five threats to exam security, including injecting prepared text into the system and copying the question paper and software; a ‘cold boot’ attack could achieve the latter outcome. For institutions where exam questions are kept confidential for reuse, the latter raised some concerns (although the cold boot attack did require cooling computer memory to temperatures below zero Celsius). Other institutions publish exam papers through their libraries, so for them the attack was not significant. In addition, copies of the software or electronic exam files could be instrumental in attempts to hack the security of the system. Security reliant on obfuscation was rejected as far back as 1851 [19], a criticism that is no less valid today in the world of computers. Sindre and Vegendla [20] took a more holistic approach to eExam security, using attack-defence trees to argue that eExams are no less secure than paper-based exams. Further, they argued that to be acceptable, eExams need only be ‘not worse’ than paper-based exams. Indeed, eExams do offer additional affordances, as outlined earlier with respect to pedagogical flexibility when compared to paper-based exams. A computer-based exam is also more reflective of knowledge production, use and problem solving in contemporary work and society.

3 Method and Approach

This study examined the design approaches taken by a variety of publicly funded eExam projects and commercial competitors. The procedure comprised a desk audit, followed by observational visits and interviews with staff actively implementing eExams in schools and universities. The objective was to ascertain design characteristics that might foster success in the academic ecology of these educational environments, across the broadest possible range of settings. Success for an eExam system is defined very broadly within this study: the system conveys sufficient desirable characteristics to be chosen alongside, or in preference to, pen-on-paper examinations. These characteristics often include resilience, reliability and the capability to handle many question styles. The functionality of an eExam system needs to be considered in a holistic fashion, looking beyond the context of the candidate providing responses to questions. Thus, the way assessors compose questions is an important consideration, as is how answer scripts are reticulated to markers and how marks are consolidated with other assessment components into the institutional repository of student achievements.

The desk audit considered a range of eExam system reports from around the world, many of which have been cited in the previous section. From this audit, the following characteristics and concerns were distilled (see Table 1) and listed as issues for further investigation. This issues list acted as an up-front frame for the observational visits and as a set of prompts when conducting field interviews.

Table 1. Focus areas for investigation in field work

Following the desk audit, visits to a number of universities and school examination boards were arranged. Where possible, an observation of an eExam was conducted, and interviews were held with responsible staff in settings they chose, to gather more information about design considerations using the questions from Table 1. Where necessary, funnelling was used to probe specific issues and identify areas considered important by respondents. These interviews were transcribed into a standard template, and respondents were given an opportunity to correct the text. Key themes and elements were extracted from the observation and interview records using the three-step Interpretative Phenomenological Analysis process [21]. First, the observations and interview statements were grouped into clusters. Second, these clusters were condensed into themes, and finally the themes were tabulated as key features for the different systems.

4 Findings

Data were collected from both school and university contexts. This paper presents a sub-set of the data, chosen to represent a range of different national contexts and to provide a comparison of school and university sectors within a single country. The four systems illustrate the tension between BYOD and institutional hardware provision.

Table 2 illustrates the similarities and differences for just four of the eExam systems investigated. The table provides some basic technical details about each system, alongside more overarching detail on the extent of use, place of origin and support for post-paper assessments.

Table 2. Key features of some eExam systems

This table can be read in conjunction with a table of Digital Exams in Scandinavian countries [22], which lists many internal and commercial eExam systems. Educational organisations are increasingly encouraging the use of BYOD eExam systems because this is the only financially viable way of providing every candidate with a reasonably modern computer. Although computer laboratories are used by some eExam systems, these cannot provide sufficient candidate seats when exams are scaled to full deployment within relatively short examination periods. In addition, computer laboratories may have been designed to facilitate collaboration, whereas examinations generally require candidates to be isolated from one another.

A clear security difference emerges between systems that boot from USB or Ethernet, and those that boot from the internal hard drive of the client computer. The former are more prevalent with scalable BYOD platforms, while the latter are restricted to institutional equipment. All approaches attempt to provide institutional control over the assessment context for the duration of the examination, to ensure integrity.

The affordances of the four systems illustrate another difference. Most of them facilitate selected-response questions (multiple-choice, fill in the blank, True/False, matching). Others are browser-based, so can only offer a simplified word processor without the rich toolset candidates are accustomed to using. Finally, three of the four systems allow candidates to use sophisticated software tools beyond these two affordances, which makes possible the posing of creative questions requiring higher-order thinking and complex constructed responses. However, the Abitti system only allows a screenshot from the software tool to be submitted, whereas the eExam system and SEE permit candidates to return digital artefacts and data files. Examples could include a formatted report containing charts and tables, an engineering schematic within a computer-aided design (CAD) file, a spreadsheet file containing formulae, or a working computer program written using Python.
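
As a purely hypothetical illustration of the kind of data-file artefact such systems could accept (it is not drawn from any of the surveyed platforms), a candidate might submit a short, runnable Python script such as the following, which computes summary statistics for a small set of measurements:

    # Hypothetical candidate submission: summary statistics for a small data set.
    # Assumes only the Python standard library is available in the exam environment.
    import statistics

    def summarise(values):
        """Return the mean, median and sample standard deviation of the values."""
        return {
            "mean": statistics.mean(values),
            "median": statistics.median(values),
            "stdev": statistics.stdev(values),
        }

    if __name__ == "__main__":
        sample = [52, 67, 71, 48, 90, 85, 63]  # illustrative data only
        print(summarise(sample))

Returning the script itself, rather than a screenshot of its output, would allow markers to execute and inspect the candidate’s working.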

These advanced affordances can lead to more authentic assessment that mirrors real-world creative problem solving, within the constraints of a fair, time-bound examination. This rich pedagogical landscape offers a mechanism for eExams to influence curriculum transformation, but is in sharp contrast to the administrative convenience of other systems supporting restricted question types. Most of the interviewees saw little impact on curriculum, indicative of the long road ahead before this tension is resolved.

Finally, the penetration levels of the systems vary from less than 1% to 100% (the latter by 2020). The explanation appears to lie in the strategic thinking of institutions and leadership within each national context. Where a strong direction to proceed with eExams has been set at the national level, a high level of penetration can be achieved over a small number of years. Otherwise, external threats from the environment (such as ‘contract cheating’) may be the only alternative impetus that can achieve transformational change in a similar timescale.

5 Discussion

Within this diversity there was a lack of consistency in the relationship between schools and universities. Finland, for example, had strong but separate eExam systems in schools and universities. From a student perspective, a consistent approach to high-stakes assessment might be considered less stressful.

The findings show a movement to BYOD solutions, probably because these are economical for the institution and scalable to the large number of students sitting cohort-wide examinations. Similarly, the eExam implementations with higher penetration can be used at any location. All three of the externally booted solutions used a version of the Linux operating system, although there were diverse ‘flavours’. These all supported the use of software tools, whereas the browser-based solution did not. The trend towards open-source software may be associated with greater confidence in security, or with a local appetite for fostering homegrown innovation rather than limiting adoption to ‘off-the-shelf’ solutions. The autosave period was a useful way of assessing the reliability of the systems, but in many cases this could be configured to taste and may also be linked to the risk appetite of system owners.
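
As a minimal sketch of how a configurable autosave period might operate (an assumption for illustration, not the design of any surveyed system), a client could periodically snapshot the candidate’s answers to local storage:

    # Hypothetical autosave loop; a shorter period limits the work lost after a
    # crash, at the cost of more frequent writes.
    import json
    import time

    AUTOSAVE_SECONDS = 30  # configurable to suit the system owner's risk appetite

    def autosave_loop(get_answers, path="answers_autosave.json"):
        """Periodically write the candidate's current answers to disk."""
        while True:
            with open(path, "w") as f:
                json.dump(get_answers(), f)
            time.sleep(AUTOSAVE_SECONDS)

In such a design, the autosave period directly bounds the amount of candidate work at risk, which is why it serves as a rough proxy for reliability.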

One of the most striking findings of this study was the diversity of approaches to eExams. We understand that such diversity can be expected at the outset of an innovation, but, as with telephones and computers, a convergence will emerge in the future. Beneficial characteristics will be adopted more widely, and designers will integrate these into their products.

Table 2 provides some inspiration for successor systems, which may be expected to prioritise the more favoured affordances discovered in this data. Within-exam software tool access may become more widely available, and autosave periods are likely to come down. International offerings of open-source material will need to be multilingual. A wider menu of question types can be expected, embracing data-file submissions from the creative use of in-assessment software tools.

6 Conclusion

Change can often be stressful. The initial investigation showed many concerns about the introduction of eExams (Table 1), and a diversity of technological approaches under development (Table 2). With so many diverse approaches to eExam system development, there appears to be a need to facilitate greater collaboration amongst eExam system developers and users. This would foster the sharing of productive features and strategies to enhance security, reliability and assessment capabilities.

Missing from the data gathered were the views of laptop computer makers and assessment policy officials. Computer makers appear to be crucial to eExam system developers because their future roadmaps can permit or hinder particular technical approaches. For instance, the secure exam environment from Austria requires an Ethernet port on each candidate computer, yet recent equipment put on sale by Apple has only a single USB-C port. Windows 10 incorporates a secure boot feature, which makes it difficult for general users to follow a standard procedure to boot into an alternative operating system. Manufacturers are tending to ring-fence their software ecosystems, partly to protect financial interests, but also to improve equipment reliability for customers. Similarly, further work is needed to assess the strategies used when implementing eExams. Providing contextual support and training for teachers and administrators may enable them to move beyond replication and augmentation towards better utilising the power of technology in educationally transformative ways.

Assessment policy officials are sponsoring trials of eExam systems in many institutions. The driving forces of eExam adoption are currently unclear or are masked by competing priorities. Many interviewees considered the administrative convenience of their eExam systems, as well as the technological affordances. The administrative benefits were digital reticulation of questions and answer scripts, and in many cases, the marker time saved by automatic assessment. Technological affordances were seen as potentially transformative of curricula, but there was scant evidence of this happening in practice. Further study of the impact of eExam adoption on assessment and curriculum design is urgently required.