Intelligent On-line Exam Management and Evaluation System

  • Tsegaye Misikir Tashu
  • Julius P. Esclamado
  • Tomas Horvath
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11528)


Educational assessment plays a central role in the teaching-learning process as a tool for evaluating students’ knowledge of the concepts associated with the learning objectives. The evaluation and scoring of essay answers, besides being costly in terms of teachers’ time, may lead to inequities because it is difficult to apply the same evaluation criteria to all answers. In this work, we present a system for online essay exam evaluation and scoring which is composed of different modules and helps teachers create, evaluate and give textual feedback on essay exam solutions provided by students. The system scores essays automatically and semantically using a pair-wise approach. Using the system, the teacher can also give an unlimited amount of textual feedback by selecting a phrase, a sentence or a paragraph in a given student’s essay. We conducted a survey to assess the usability of the system with regard to the time saved during grading, the overall level of satisfaction, fairness in grading and the simplification of essay evaluation. Around 80% of the respondents stated that the system helps them grade essays more fairly and easily.


Keywords: Automatic essay evaluation · Automatic feedback · Intelligent tutoring · Learning assessment
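The pair-wise semantic scoring mentioned in the abstract compares each student essay against a reference answer using Word Mover’s Distance over word embeddings (see reference 4). As an illustration only, the sketch below implements the *relaxed* WMD lower bound (each word is matched to its nearest neighbour in the other document) with a tiny hand-made embedding table; the vocabulary, vectors and the distance-to-score mapping are invented for this example and are not taken from the paper.

```python
import numpy as np

# Toy word embeddings, invented for illustration. A real system would use
# pretrained vectors (e.g. word2vec), as in the WMD literature.
EMB = {
    "assessment": np.array([1.0, 0.1, 0.0]),
    "evaluation": np.array([0.9, 0.2, 0.1]),
    "student":    np.array([0.1, 1.0, 0.2]),
    "learner":    np.array([0.2, 0.9, 0.3]),
    "essay":      np.array([0.0, 0.2, 1.0]),
    "answer":     np.array([0.1, 0.3, 0.9]),
}

def relaxed_wmd(doc_a, doc_b):
    """Relaxed Word Mover's Distance: mean distance from each word in
    doc_a to its nearest word in doc_b (a lower bound on full WMD)."""
    return float(np.mean([
        min(np.linalg.norm(EMB[wa] - EMB[wb]) for wb in doc_b)
        for wa in doc_a
    ]))

def pairwise_score(student_essay, reference_answer):
    """Symmetrise the relaxed distance and map it into a (0, 1] score."""
    d = 0.5 * (relaxed_wmd(student_essay, reference_answer)
               + relaxed_wmd(reference_answer, student_essay))
    return 1.0 / (1.0 + d)

reference = ["assessment", "student", "essay"]
close = ["evaluation", "learner", "answer"]   # paraphrase of the reference
far = ["essay", "essay", "essay"]             # off-topic repetition

# A paraphrased answer scores higher than an off-topic one.
print(pairwise_score(close, reference) > pairwise_score(far, reference))  # True
```

Note that symmetrising the relaxed distance matters here: repeating a single on-topic word gives a zero one-way distance, and only the reverse direction penalises the missing content.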


References

  1. Leckie, G., Baird, J.-A.: Rater effects on essay scoring: a multilevel analysis of severity drift, central tendency, and rater experience. J. Educ. Meas. 48, 399–418 (2011)
  2. Kaya, B.Y., Kaya, G., Dagdeviren, M.: A sample application of web based examination system for distance and formal education. Procedia - Soc. Behav. Sci. 141, 1357–1362 (2014)
  3. Yağci, M., Ünal, M.: Designing and implementing an adaptive online examination system. Procedia - Soc. Behav. Sci. 116, 3079–3083 (2014)
  4. Tashu, T.M., Horvath, T.: Pair-wise: automatic essay evaluation using word mover’s distance. In: The 10th International Conference on Computer Supported Education - Volume 2: CSEDU, pp. 59–66. SciTePress (2018)
  5. Attali, Y.: A differential word use measure for content analysis in automated essay scoring. ETS Res. Rep. Ser. 36, 1–19 (2011)
  6. Page, E.B.: Grading essays by computer: progress report. In: Invitational Conference on Testing Problems (1966)
  7. Thomas, P., Haley, D., deRoeck, A., Petre, M.: e-Assessment using latent semantic analysis in the computer science domain: a pilot study. In: Proceedings of the Workshop on eLearning for Computational Linguistics and Computational Linguistics for eLearning, pp. 38–44. ACL (2004)
  8. Foltz, P.W., Laham, D., Landauer, T.K.: Automated essay scoring: applications to educational technology. In: World Conference on Educational Multimedia, Hypermedia and Telecommunications (ED-MEDIA) (1999)
  9. Kusner, M.J., Sun, Y., Kolkin, N.I., Weinberger, K.Q.: From word embeddings to document distances. In: International Conference on Machine Learning (2015)
  10. Mikolov, T., Sutskever, I., Chen, K., Corrado, G., Dean, J.: Distributed representations of words and phrases and their compositionality. In: NIPS (2013)

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Faculty of Informatics, Department of Data Science and Engineering, Telekom Innovation Laboratories, ELTE-Eötvös Loránd University, Budapest, Hungary
  2. Faculty of Informatics, 3in Research Group, ELTE-Eötvös Loránd University, Martonvásár, Hungary
  3. Faculty of Science, Institute of Computer Science, Pavol Jozef Šafárik University, Košice, Slovakia
  4. Cagayan de Oro, Philippines