Digital Exams in Engineering Education
Digital exams remain uncommon in engineering education, largely because general e-assessment platforms lack advanced item types that can mimic engineering problem-solving processes and award partial credit. Such advanced items can, however, be developed in Maple T.A. We describe how these items are structured into scenarios and how they were developed for a second-year bachelor's-level materials science course that ran three times at Delft University of Technology. We evaluate how the items function in practice, how they are scored, and how they perform from an educational measurement perspective. The paper discusses the results of the study and future directions for the development of digital exams in engineering courses.
Keywords: Digital exam · e-Assessment · Question partitioning · Partial credit · Scenario · Engineering education · Maple T.A.