Abstract
In order to verify common findings in the literature regarding students’ conceptions of e-assessment, we carried out a survey across several higher education institutes. Our survey enhances the existing findings by adding new facets. The results are promising in that students seem to be open-minded regarding e-assessment, which is in line with the existing literature. However, some open points have to be resolved reliably in order to completely convince students of the opportunities offered by e-assessment.
1 Introduction
If e-assessment is to be introduced into the examination system of an institute of higher education (IHE), not only the staff but also the students have to accept this type of assessment [1], especially when it comes to e-assessment on students’ own devices (bring your own device, BYOD) [2]. Therefore, IHEs that are willing to introduce e-assessment should be aware of the possible limiting factors from the students’ point of view, in order to tailor the e-assessment system and the integration process to the students’ needs.
To verify that the findings regarding the students’ points of view in the literature are valid for our institution, we carried out our own survey about e-assessment, BYOD scenarios and cheating in examinations.
The paper is structured as follows: in the second section, we give a brief overview of the findings already presented in the literature. In the third section, we discuss the setup of our survey, followed by a discussion of the achieved results in the fourth section. The paper closes with a summary and an outlook.
2 Related Research
A considerable body of literature on students’ perceptions of e-assessment has been published over recent years. Most of these papers focus on a particular IHE, e.g. Saudi Electronic University, Saudi Arabia [3] and Dow University of Health Sciences, Pakistan [4]. Some papers focus even on a single study course, e.g. Polytechnic Institute of Porto, Portugal, Marketing Degree [5], University College London, UK, Chemical Engineering [6], Hong Kong Polytechnic University, Hong Kong, Rehabilitation Sciences [7] and Kocaeli University, Turkey, Desktop Publishing [8]. The findings reported in these papers generally testify to positive student attitudes regarding e-assessment.
For the purposes of this paper, the most important publication is “e-Exams with student owned devices: Student voices” by Hillier [2], since it focuses on a BYOD scenario. It contains many interesting findings about students’ perceptions, not only regarding e-assessment in general, but especially regarding e-assessment on their own devices. However, even Hillier’s research was conducted at only one IHE.
3 Design of the Survey
We constructed our survey based on the findings in a previous paper [2], to answer our research question: Which factors influence students’ perceptions of e-assessment?
We anticipated that the perception of e-assessment is influenced by:
- gender
- age
- the study programme (science, technology, engineering and mathematics (STEM) versus humanities, for example)
- technology affinity
- the stage of study (Bachelor versus Master level)
Since we expected the results to be additionally influenced by the students’ general affinity for technology, we incorporated another questionnaire into our survey to be able to distinguish between technology-accepting and technology-reluctant students: the TA-EG questionnaire by Karrer et al. [9], which is designed to measure technology affinity. The items of the TA-EG questionnaire were reordered to eliminate effects that could originate from the clustered answers of the original questionnaire. Additionally, unlike existing surveys, we wanted to carry out the survey at multiple IHEs and for different study courses.
Altogether, this resulted in the survey (originally in German) shown in Table 1. The survey was carried out mainly with students of RWTH Aachen University and FH Aachen University of Applied Sciences, but students at Maastricht University, Alpen-Adria-University Klagenfurt, TU Berlin, FOM Hochschule für Oekonomie und Management (Study Centre Aachen) and Albstadt-Sigmaringen University were also invited to participate. The study programmes mentioned explicitly in the survey are the main study programmes at these universities, which are related to computer science.
4 Analysis of the Results
In total, 408 students responded to the survey with demographics as shown in Table 2.
About three quarters of the participating students were male and one quarter were female. A similar distribution can be seen for the age, where about three quarters were aged between 18 and 25 years and nearly a fifth of the students were aged above 25 years.
The students came from a variety of study programmes, as can be seen from Fig. 1. Other programmes of study included artificial intelligence, engineering and physics. Although individual students were enrolled in programmes such as economics and literature, the vast majority of the study programmes were related to a STEM topic. Therefore, it is not surprising that the results of the TA-EG questionnaire did not allow for identifying subgroups with different affinities regarding technology.
The plots in Fig. 2 follow the original grouping of the TA-EG questionnaire, which has four groups: Enthusiasm, Competency, Positive Attitude, and Negative Attitude. The five subplots within each plot correspond to the five questions of the respective group. Please note that every TA-EG item in our survey used a five-level Likert scale ranging from 1 (“Strongly agree”) to 5 (“Strongly disagree”). The overall variance of these items was 0.76, which accounts for the indistinguishability of the subgroups.
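The variance figure reported above can be reproduced in principle with a few lines of Python. The response vector below is purely hypothetical and merely illustrates the computation; it is not the actual survey data:

```python
import numpy as np

# Hypothetical five-level Likert responses (1 = "Strongly agree",
# 5 = "Strongly disagree") for one TA-EG item; NOT the survey data.
responses = np.array([2, 2, 3, 1, 2, 4, 2, 3])

# Population variance as a simple measure of spread; a low value
# indicates a homogeneous group of respondents.
variance = np.var(responses)
print(round(variance, 2))  # → 0.73
```

A variance of this magnitude across all items would indicate that most respondents cluster around similar answers, which is consistent with the absence of distinguishable subgroups.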
Since too few of the participating students were enrolled in study courses outside the STEM field, the collected data were not suitable for answering whether the study course influences students’ perceptions of e-assessment. The absence of those students may be caused by the decision to carry out the survey via an online portal, which may have biased the results so that only students with an affinity for technology participated. However, that cannot be concluded from the data.
4.1 Influences of Gender and Age
To examine the influence of gender, age, and study level (bachelor or master), the data set was split into corresponding subsets. These subsets were then tested for significant differences with Fisher’s exact test [11]. The results for the Likert-scaled questions can be found in Table 3.
Given these p-values, conclusions about the influence of gender, age and study level are possible to a certain extent. Regarding question E1, women seem to be more hesitant to accept e-assessment as part of the examination system. In addition, students between 18 and 25 years seem to be more positive about e-assessment than older students. For question E2, age again makes a difference: students older than 25 years seem to be less convinced than younger students that e-assessment is a good complement to paper-based examinations. The same tendency is revealed for the study level: students enrolled in a master’s programme seem to be more reluctant regarding e-assessment than students in a bachelor’s programme. Whether this tendency is caused by progress in the studies or by age cannot be concluded from the data, which is shown in Fig. 3.
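As an illustration of the test used above, the two-sided Fisher exact test for a 2×2 table can be implemented directly from the hypergeometric distribution. The sketch below collapses the five Likert levels into two categories (agree/disagree); the table counts are hypothetical illustrations, not the survey data:

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher exact test for the 2x2 table [[a, b], [c, d]],
    computed directly from the hypergeometric distribution."""
    n = a + b + c + d
    row1, col1 = a + b, a + c

    def prob(x):
        # Probability of a table with x in the top-left cell,
        # given the fixed row and column margins.
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = prob(a)
    lo, hi = max(0, row1 + col1 - n), min(row1, col1)
    # Two-sided p-value: sum over all tables that are at least as
    # extreme (i.e. at most as probable) as the observed one.
    return sum(prob(x) for x in range(lo, hi + 1) if prob(x) <= p_obs + 1e-12)

# Hypothetical counts: bachelor students (40 agree, 10 disagree)
# versus master students (15 agree, 25 disagree) on one question.
p = fisher_exact_2x2(40, 10, 15, 25)
print(p < 0.05)  # a small p indicates a significant difference
```

Unlike the chi-squared test, the exact test remains valid for the small cell counts that arise when a data set is split into demographic subsets, which is presumably why it was chosen here.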
5 Discussion
The results of the survey show a rather clear picture. The students would like to have electronic examinations in their studies, not necessarily as a replacement for paper-based examinations but rather as a complementary approach, as the answers to questions E2 and E3 (see Fig. 4) suggest. This perception of e-assessment is driven by perceived advantages, which cover topics like faster correction (75.98%, E4), more realistic assignments (62.74%, E4), more diverse examination tasks (45.34%, E4), and readability (which was stated in free-text comments). However, students are also concerned about disadvantages, like security (41.67%, E5), usability (42.64%, E5), and fairness (34.56%, E5). Additionally, technical difficulties and the subsequent loss of already solved assignments are mentioned very often in the comments.

Overall, less than half of the students see disadvantages in e-assessment. However, especially when it comes to a BYOD approach, students are afraid that technical difficulties may put them at a disadvantage, or that they must own a sufficiently capable device. Still, the tendency seems to be positive regarding a BYOD approach (see B1 in Fig. 4), as students see the advantage of a familiar device (89.7%, B2). Due to the reported concerns, however, it is very important to have a reasonable backup strategy for these situations. As we have discussed elsewhere [12], backups should be taken regularly during an e-assessment, so that a student can simply switch to an emergency device provided by the IHE in case their own device breaks down. These emergency devices could also be used by students who cannot afford a device of their own, enabling them to participate in electronic examinations. Additionally, the topic of fairness is important to the students, as they state differences between students’ devices as the main concern when utilising BYOD (82.84%, B3).
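A minimal sketch of such a backup strategy is given below, assuming a simple timestamped file-copy mechanism. The cited work [12] uses GIT for this purpose; the functions and file layout here are hypothetical illustrations, not the actual implementation:

```python
import shutil
import time
from pathlib import Path

def snapshot(answer_file: Path, backup_dir: Path) -> Path:
    """Copy the current state of a student's answer file into the
    backup location, tagged with a nanosecond timestamp."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    target = backup_dir / f"{time.time_ns()}-{answer_file.name}"
    shutil.copy(answer_file, target)
    return target

def restore_latest(backup_dir: Path, destination: Path) -> Path:
    """Restore the most recent snapshot, e.g. onto an emergency
    device handed out after a hardware failure."""
    latest = max(backup_dir.iterdir(), key=lambda p: p.name)
    shutil.copy(latest, destination)
    return destination
```

In a real system the backup location would be a server reachable from both the student’s own device and the emergency device, so that at most the work done since the last snapshot is lost.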
Furthermore, topics like security and cheating are of importance to the students. The students are rather split about the risk of cheating in paper-based examinations; however, there is a tendency for students to think that it is easier to cheat in electronic examinations (see C1 and C2 in Fig. 4). Therefore, new ways of reducing the risk of cheating in electronic examinations have to be found; we have discussed this topic elsewhere [13] and presented an approach to security [14].
Age seems to be a factor that influences the perception of e-assessment, in line with the concept of Digital Natives introduced by Prensky [15]. He claims that “[t]oday’s students have not just changed incrementally from those of the past”, but underwent a drastic change of attitude, because “the arrival and rapid dissemination of digital technology in the last decades of the 20th century [was] an event which changes things so fundamentally that there is absolutely no going back”. The evidence gained from the survey suggests a similar conclusion, because there is a statistically significant difference between students over the age of 25 years and younger students. Shelley White states in her article “The Generation Z effect” [16] that “Gen Z is loosely accepted as people born in the mid - to late-1990s and later. (According to the Pew Research Center in the United States, the last Gen Y was born in 1997, while Statistics Canada says Gen Z starts with people born in 1993)”. The timespan mentioned in her article matches exactly the age boundary that, according to our findings, influences the perception of e-assessment.
Gender having an influence on the perception of e-assessment is actually not surprising, as many studies show that women seem to have a lower confidence in using technology in general than men, for example Kadijevich [17], Kahveci [18], and Yau and Cheng [19], whether this is justified or not. Therefore, it is reasonable to assume that the same tendency can be observed when examining the perception of e-assessment.
6 Summary and Outlook
In order to identify factors that influence students’ perceptions of e-assessment, we carried out our own survey based on the findings of a previous paper [2]. However, we extended our survey over multiple IHEs to gain a broader view. The results are promising in that students seem to be open-minded regarding e-assessment, which is in line with the existing literature. However, there are open points that have to be reliably resolved in order to convince the students completely of e-assessment. Therefore, more research is needed to uncover the open questions that exist among the students as well as to find solutions to them. Further research could also tackle the question of whether affinity to technology and the field of study have a direct influence on the perception of e-assessment. In addition, it could be investigated whether the influence of the study level is indeed significant, and whether it stems from progress in the studies or from a hidden correlation between age and level of study.
References
Terzis, V., Economides, A.A.: The acceptance and use of computer based assessment. Comput. Educ. 4(56), 1032–1044 (2011)
Hillier, M.: e-Exams with student owned devices: student voices. In: Proceedings of the International Mobile Learning Festival 2015: Mobile Learning, MOOCs and 21st Century Learning, Hong Kong SAR, China, pp. 582–608 (2015)
Alsadoon, H.: Students’ perceptions of e-assessment at Saudi Electronic University. Turkish Online J. Educ. Technol. 16(1), 147–153 (2017)
Jawaid, M., Moosa, F.A., Jaleel, F., Ashraf, J.: Computer based assessment (CBA): perception of residents at Dow University of Health Sciences. Pakistan J. Med. Sci. 30(4), 688–691 (2014)
Babo, R., Azevedo, A., Suhonen, J.: Students’ perceptions about assessment using an e-learning platform. In: Sampson, D.G., Huang, R., Hwang, G.-J., Liu, T.-C., Chen, N.-S., Kinshuk, C.-C.T. (eds.) Proceedings of the IEEE 15th International Conference on Advanced Learning Technologies, Hualien, Taiwan, pp. 244–246 (2015)
Sorensen, E.: Implementation and student perceptions of e-assessment in a chemical engineering module. Eur. J. Eng. Educ. 38(2), 172–185 (2013)
Hodgson, P., Pang, M.Y.C.: Effective formative e-assessment of student learning: a study on a statistics course. Assess. Eval. High. Educ. 37(2), 215–225 (2012)
Özden, M.Y., Ertürk, I., Sanli, R.: Students’ perceptions of online assessment: a case study. J. Distance Educ. 19(2), 77–92 (2004)
Karrer, K., Glaser, C., Clemens, C., Bruder, C.: Technikaffinität erfassen – der Fragebogen TA-EG. Der Mensch im Mittelpunkt technischer Systeme. 8. Berliner Werkstatt Mensch-Maschine-Systeme, pp. 196–201 (2009)
Hintze, J.L., Nelson, R.D.: Violin plots: a box plot-density trace synergism. Am. Stat. 52(2), 181–184 (1998)
Upton, G.J.G.: Fisher’s exact test. J. Roy. Stat. Soc. 155(3), 395–402 (1992)
Küppers, B., Politze, M., Schroeder, U.: Reliable e-assessment with GIT - practical considerations and implementation. In: EUNIS 2017 Book of Proceedings, Münster, Germany, pp. 253–262 (2017)
Küppers, B., Kerber, F., Meyer, U., Schroeder, U.: Beyond lockdown: towards reliable e-assessment. In: GI-Edition - Lecture Notes in Informatics, vol. P273, pp. 191–196 (2017)
Küppers, B., Politze, M., Zameitat, R., Kerber, F., Schroeder, U.: Practical security for electronic examinations on students’ devices. In: Arai, K., Kapoor, S., Bhatia, R. (eds.) SAI 2018. AISC, vol. 857, pp. 290–306. Springer, Cham (2019). https://doi.org/10.1007/978-3-030-01177-2_21
Prensky, M.: Digital natives, digital immigrants. On the Horizon 9(5), 1–6 (2001)
White, S.: The generation Z effect (2018). https://www.theglobeandmail.com/news/national/education/canadian-university-report/the-genz-effect/article26898388/. Accessed 15 Jan 2019
Kadijevich, D.: Gender differences in computer attitude among ninth-grade students. J. Educ. Comput. Res. 22(2), 145–154 (2000)
Kahveci, M.: Students’ perceptions to use technology for learning: measurement integrity of the modified Fennema-Sherman attitudes scales. Turkish Online J. Educ. Technol. 9(1), 185–201 (2010)
Yau, H.K., Cheng, A.L.F.: Gender difference of confidence in using technology for learning. J. Technol. Stud. 38(2), 74–79 (2012)
© 2019 IFIP International Federation for Information Processing
Küppers, B., Schroeder, U. (2019). Students’ Perceptions of e-Assessment. In: Passey, D., Bottino, R., Lewin, C., Sanchez, E. (eds) Empowering Learners for Life in the Digital Age. OCCE 2018. IFIP Advances in Information and Communication Technology, vol 524. Springer, Cham. https://doi.org/10.1007/978-3-030-23513-0_27