Optimising Moodle quizzes for online assessments
Computer-aided learning management systems (LMSs) are widely used in higher education and are viewed as beneficial when transitioning from conventional face-to-face teaching to fully online courses. While LMSs have unique tools for transferring and assessing knowledge, their ability to engage and assess learners needs further investigation. This paper focuses on a study examining the LMS “Moodle” to ascertain the effectiveness of “Moodle quizzes” to improve, assess and distinguish knowledge in a civil engineering course at an Australian university. The course has a database comprising 62 formative and 61 summative quiz questions with embedded text, images, audio and video. This study investigates the use of these quiz questions with four course cohorts and 169 students. The quizzes assessed competencies of students during various stages of a study period through automated marking. The suitability of questions to assess and distinguish student knowledge levels was determined using a psychometric analysis based on facility index (FI) and the discrimination index (DI) statistics embedded within the Moodle quizzes.
This study highlights strategies used to set and review quiz questions for formative and summative assessments. Results indicated that students were engaged with and satisfied by the formative assessment: they viewed the interactive videos between two and six times, and 65% of students attempted all the formative questions. The FI indicated the student pass rate for the summative questions and the DI indicated how well these questions discriminated between students, while the combination of FI and DI results separated students with different knowledge levels. Using these Moodle statistics provided the information needed to make effective decisions on how to improve the summative quizzes.
The multimodal quizzes were effective in teaching and assessing a theoretical engineering course and provided efficient methods to replace conventional assessments. The FI and DI are useful statistical tools for redesigning appropriate sets of questions. Time-poor academics will benefit from using these easily attainable Moodle statistics to inform decisions when revising quizzes and making assessments more autonomous.
Keywords: Online teaching, Online assessments, Moodle, Online quizzes, Psychometric analysis, Facility index, Discrimination index, Learning management system
The growth of online learning in recent decades has resulted in higher education institutes offering more online courses, with up to 30% of American college and university students participating in at least one online course (Broadbent & Fuller-Tyszkiewicz, 2018; Liagkou & Stylios, 2018). Despite significant technological advances in online education tools, developing online course materials can still be challenging (Jackson, 2017). New technologies and educational design methodologies continuously redefine the role of professionals (Philipsen, Tondeur, Roblin, Vanslambrouck, & Zhu, 2019); therefore, education institutes must ensure that the educational needs of those professionals remain a primary concern. A main concern educators face is identifying the appropriate technology to develop online course materials for various disciplines (Salamon, Ali, Miskon, & Ahmad, 2016), as the requirements for digital resources can vary significantly between disciplines (Martins, 2017; Sancho-Vinuesa, Masià, Fuertes-Alpiste, & Molas-Castells, 2018). Alongside learning and adjusting to new technologies, educators face challenges in developing resources which successfully engage online users, working with students' different knowledge levels, and assessing the required course objectives, all while maintaining the quality of an institute's graduates.
Various learning management systems (LMSs) and tools are available to develop digital resources for courses which were previously based solely on traditional face-to-face teaching. Research has identified Moodle as a complete and adequate platform for implementation in higher education (Aydin & Tirkes, 2010; Williams van Rooij, 2012). Moodle provides user-friendly tools such as "quizzes", "forums", "databases" and "workshops" to develop various digital resources for teaching and assessment purposes. Blended learning supported by such tools is viewed as a best-practice instructional mode, and students who do not engage with blended learning are academically disadvantaged (Francis & Shannon, 2013). Moodle was ranked among the top 20 LMSs based on user experiences in 2018 and 2019 (Andre, 2019; eLearning Industry, 2018).
Online quizzes in LMSs have been implemented for summative assignments, formative assessments and instructional design methods in diverse disciplines such as engineering, biology, medicine and the social sciences (Jaeger & Adair, 2017; Krause et al., 2017; Sullivan, 2016). Recent studies highlight the benefits of online quizzes and students’ positive attitude towards them (Cohen & Sasson, 2016; Wallihan et al., 2018). Such benefits include improving student motivation, enhancing understanding and active learning, and deterring cheating, as long as the quiz questions are not too easy (Cook & Babon, 2017). Carefully designed online quizzes can be one of many solutions to pre-empt student plagiarism by randomising questions, shuffling responses, providing timestamps and logs of multiple quiz attempts with systematic evaluation processes (Sullivan, 2016).
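The anti-plagiarism measures mentioned above, randomising question selection and shuffling responses, can be sketched in a few lines of Python. This is a minimal illustration for intuition only, not Moodle's implementation; the function and field names are hypothetical:

```python
import random

def build_quiz(question_bank, n_questions, seed):
    """Draw a per-attempt random subset of questions and shuffle each
    question's answer options, a simple anti-plagiarism measure."""
    rng = random.Random(seed)  # seed per attempt so logs are reproducible
    questions = rng.sample(question_bank, n_questions)
    quiz = []
    for q in questions:
        options = q["options"][:]  # copy before shuffling
        rng.shuffle(options)
        quiz.append({"prompt": q["prompt"], "options": options})
    return quiz

# Toy bank of ten multiple-choice questions
bank = [
    {"prompt": f"Question {i}", "options": ["A", "B", "C", "D"]}
    for i in range(10)
]
quiz = build_quiz(bank, n_questions=5, seed=42)
print(len(quiz))  # 5
```

Because the draw is seeded per attempt, the same attempt can be regenerated when reviewing logs, while different students see different question orders and option orders.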
Furthermore, online quizzes can address student failure to correctly solve word problems. Word problems aim to connect mathematical problem-solving activities to real-world examples (Geary, 2017). Students, nevertheless, can have difficulties in constructing a mental model of such situations (Thevenot, 2010). One solution to this issue is to utilise mathematical word problems which allow students to make real sense of the situations described in the problems (Vicente, Orrantia, & Verschaffel, 2008). Placing the real-world situation before the mathematical problem is vital to understanding the mathematical representation of the problem and integrating the information into real-world problems (Thevenot, Devidal, Barrouillet, & Fayol, 2007). Therefore, educators need to develop resources that assist students to build a mental model for solving real-world problems represented by mathematical formulae. This process can be challenging, yet it can potentially be addressed by using video-enhanced explanations of problems and feedback in the form of online quizzes (West & Turner, 2016). Quizzes with interactive videos enhance student performance by giving students an opportunity to view examples and analyse scenarios (Heo & Chow, 2005; Maarek, 2018).
One benefit of online tests is that feedback can be automatic and timely, providing students with immediate feedback on their learning and opportunities to improve their understanding. Research on the outcomes of instant feedback to quizzes within a variety of teaching courses reported that instant feedback opens new avenues of communication between educators and students (Rinaldi, Lorr, & Williams, 2017). Immediate feedback is useful for students who may be struggling to understand the subject matter or who are reticent to ask questions in a conventional classroom setting. Instant feedback also provides a quick snapshot of the cohort's understanding of the subject matter. Numerous studies identify that instant feedback recognises students' level of understanding across a cohort, providing a positive and interactive course for students (Fales-Williams, Kramer, Heer, & Danielson, 2005; James, 2016; Root-Kustritz, 2014). When immediate feedback is provided with online tests, an active learning environment is created, and students are more likely to engage with the feedback than to avoid it, as they might if it were provided later (Schneider, Ruder, & Bauer, 2018). The quality and detail of the feedback also determine the level of learning: student learning is enhanced and reinforced when detailed feedback to online tests is provided (Wojcikowski & Kirk, 2013). Platforms such as Moodle are being used to embed quizzes and instant feedback into teaching courses within universities. Moodle quizzes, with their facility to use numerous multimedia options such as audio and video, can support this interaction and provide the immediate feedback which positively gauges students' level of understanding.
Student evaluation techniques and their relationship to grades have been a discussion topic for many decades in various academic disciplines (Huang & Fang, 2013; Ransdell, 2001; Ting, 2001). Relating evaluation to grades enables educators to take proactive measures in the classroom, for example, changing instruction, reviewing lecture materials and assignments, providing extra resources to students, and setting up prerequisites courses (Huang & Fang, 2013). Automated evaluation processes used in some online assessment tools such as Moodle quizzes enable instructors to identify patterns between a student’s response to a question and overall course performance using inbuilt statistical features of the platform.
This study addresses three research questions:
1. Can online quizzes enhance student engagement and performance?
2. How can Moodle statistics be used to determine the effectiveness of online quiz questions?
3. What benefits do online quizzes have for academics?
The engineering course Hydraulics and Hydrology is a compulsory course for third-year undergraduate students of the civil engineering programme at the University of South Australia. The course was developed to meet Engineers Australia standards for an accredited civil engineering programme. The course content focuses on hydrological processes, measurement and interpretation of hydro-meteorological data, design flood estimations, open channel systems design and modelling river channel systems. The course consists of complex mathematical calculations, usually based on Excel formulae and hydrological data analysis.
Moodle quizzes were developed for this course to introduce and develop students’ declarative knowledge in engineering hydrology and open-channel hydraulics, when many university courses moved from the conventional face-to-face learning environment to online teaching. The conventional Hydraulics and Hydrology course had a 2-h lecture followed by a 2-h tutorial per week for 12 weeks (with one week set aside for revision). Seven of these weekly tutorials focused on structured problem-solving activities. When developing the online course, these seven tutorials were turned into 123 sequenced Moodle quiz questions with integrated stages. Each Moodle question consisted of text, images, audio or video explanations and feedback. Face-to-face lectures were reproduced using the Camtasia software as a series of short videos, highlighting key concepts and key terminology. These videos were then linked to the online quizzes as additional feedback to some questions. Instead of attending face-to-face lectures and tutorials, online students engage with lecturers through a real-time virtual classroom to discuss their questions. Methods used in preparing lecture videos and real-time virtual classes in the online course are not discussed in this paper.
Methods used to develop online quizzes
The process of appropriately structuring and selecting questions is vital in transferring and assessing knowledge in learning environments. In online learning environments, extra care must be taken to engage online learners. The quizzes for this online course were developed to achieve two objectives: (1) develop student knowledge by providing an opportunity to practise and apply the new concepts learnt in the course, and (2) assess student knowledge and assign an appropriate grade that accurately distinguishes between student competency levels (novice, competent, proficient and expert). To achieve these objectives, "formative" and "summative" Moodle quizzes were developed.
In this course, online quizzes were developed using various multimodal resources to explain complicated mathematical solutions and course-specific terminology. The quizzes provide an approach to construct detailed and coherent mental models of key course concepts. For example, an instructional video was prepared to teach the concept of a sluice gate (sliding gate) to control water flow where a hydraulic jump (abrupt rise in water surface) could occur in different opening positions of the sluice gate. The solution to this problem requires performing numerous calculations and drawing flow profiles. Visual representations of all scenarios aimed to help students understand the importance of each calculation when designing a sluice gate because miscalculations can lead to serious consequences. Attention was given to repetitive solutions of similar problems as research shows this technique strengthens the memory traces of appropriate schemata (Thevenot, 2010). Student prior knowledge and appropriate use of technical keywords were also considered by defining and repeating terminology.
Evaluation of the quiz questions
The quizzes have been implemented with four cohorts of students enrolled in the course (n = 169). Formative quizzes were analysed according to the number of times each student attempted each question. For the summative quiz questions, the suitability of each online question was based on psychometric analysis.
Psychometric analysis is a statistical procedure to determine the suitability of the proposed questions for grading purposes based on student responses and their relationship with the rest of the responses (Gómez-Soberón et al., 2013). As formative questions could have unlimited attempts, psychometric analysis was not suitable for analysing these questions; instead, they were analysed according to the number of times each student attempted a question. Psychometric analysis was only used to evaluate summative quizzes as it is an effective tool to assess the suitability of the questions to discriminate between competent and less competent students. The psychometric analysis in this study addressed: (1) Were the quiz questions well-constructed and did they have an appropriate level of difficulty? and (2) Did the questions discriminate between the higher and lower knowledge levels of students? The analysis of the summative quiz was carried out using the Moodle statistics of the facility index (FI) and the discrimination index (DI).
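As a rough illustration of the two statistics, the sketch below computes a facility index and a classic upper/lower-group discrimination index in plain Python. This is a simplified classical-test-theory version for intuition only; Moodle's own FI and DI calculations (Moodle Statistics, 2019b) use different formulae:

```python
def facility_index(item_scores, max_score):
    """FI: average score on a question as a percentage of its maximum.
    High FI = easy question, low FI = difficult question."""
    return 100.0 * sum(item_scores) / (len(item_scores) * max_score)

def discrimination_index(item_scores, total_scores):
    """Classic upper/lower-group DI: difference in average item performance
    between the top and bottom thirds of students ranked by total quiz score.
    Assumes at least three students."""
    ranked = sorted(range(len(total_scores)), key=lambda i: total_scores[i])
    k = len(ranked) // 3
    lower, upper = ranked[:k], ranked[-k:]
    mean = lambda idx: sum(item_scores[i] for i in idx) / len(idx)
    return 100.0 * (mean(upper) - mean(lower))

# Toy data: six students, one question scored out of 1, and total quiz scores
item = [1, 1, 1, 0, 0, 1]
totals = [90, 85, 80, 40, 35, 60]
print(round(facility_index(item, 1), 1))   # 66.7
print(discrimination_index(item, totals))  # 100.0
```

In this toy example the question is answered correctly by the top-scoring students and missed by the bottom-scoring ones, so it discriminates perfectly even though its facility is only moderate.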
Table: Interpretation of FI (facility index) values of each question (adapted from Moodle Statistics, 2019a). Very low FI values flag a question that is extremely difficult or that has something wrong with it; mid-range FI values indicate a question that is about right for the average student.
Table: Interpretation of DI (discrimination index) values of each question (adapted from Moodle Statistics, 2019a). Values of 50 and above indicate very good discrimination; values near zero indicate very weak discrimination; negative values suggest the question is probably invalid.
To categorise a question as having "very good discrimination", students who scored high in other sections of the quiz should also have scored high on this question; similarly, students who scored low on other sections of the quiz should also have scored low on this question. Hence, the score for the question and the score for the test should be well correlated.
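The correlation idea above can be made concrete with a small sketch that computes the Pearson correlation between a question's scores and the rest-of-test score (total minus the question itself). This is an illustrative approximation in the spirit of a discrimination statistic, not Moodle's exact calculation:

```python
from math import sqrt

def item_rest_correlation(item_scores, total_scores):
    """Pearson correlation between a question's scores and the rest-of-test
    score (total minus the item). Values near 1 mean the question ranks
    students the same way as the rest of the quiz."""
    rest = [t - i for t, i in zip(total_scores, item_scores)]
    n = len(item_scores)
    mx = sum(item_scores) / n
    my = sum(rest) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(item_scores, rest))
    vx = sum((x - mx) ** 2 for x in item_scores)
    vy = sum((y - my) ** 2 for y in rest)
    return cov / sqrt(vx * vy)

# Toy data: the question tracks the rest of the quiz fairly closely
item = [1, 1, 1, 0, 0, 1]
totals = [9, 8, 8, 4, 3, 6]
print(round(item_rest_correlation(item, totals), 2))  # 0.85
```

Subtracting the item from the total before correlating avoids the item inflating its own correlation, a standard precaution in item analysis.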
In this study, both FI and DI are used as tools to measure the power of a question to distinguish proficient from weak learners.
Features of online quizzes which improve academics' productivity include:
- A database of resources to teach and assess course content, such as complicated mathematical calculations and course-specific terminology, which can be modified with minimal academic effort
- Customised and automated feedback which provides adequate and prompt responses to students
- Automated marking which reduces academics' workload
- Randomising and shuffling quiz questions, together with monitoring Moodle logs, which enable academics to address plagiarism more effectively and efficiently.
These features collectively benefit time-poor academics and improve the quality of teaching.
Results and Discussion
Can online quizzes enhance student engagement and performance?
How can Moodle statistics be used to determine the effectiveness of online quiz questions?
Psychometric analysis was performed to evaluate the summative questions to identify which questions needed to be revised to achieve appropriate targets, for example, increase the overall pass rate or limit the number of students who received high distinctions (HDs). It was imperative to assess a student’s knowledge level as well as distinguish the difference in knowledge levels between students.
Further analysis of FI and DI showed no significant relationship between question types (shown in Fig. 7a) and their DI values, except for the "reflective or descriptive" type of questions. These questions mostly had DI values higher than 50%, indicating the importance of including such questions in online quizzes. Investigating FI together with DI provided useful data to evaluate each question and correct or modify it as necessary to provide all students with a fair grade, thus optimising the effectiveness of the quizzes.
What benefits do online quizzes have for academics?
The online quizzes have numerous practical benefits for academics. The first benefit is that the quiz questions were effective in the teaching and learning of a complex applied mathematics course. The quiz questions were designed using various multimodal resources, with embedded videos enabling students to visualise real-life scenarios as a preliminary to the necessary problem-solving. Carefully staging questions, with necessary repetition, ensured the development of course concepts and skills. Once the questions have been designed, they can be refined in an ongoing manner with minimal academic effort.
Another benefit of the online quizzes was the customised and automated feedback. The relevant immediate automated feedback for each question explained common mistakes made by students, and significant time was saved by not repeatedly providing feedback on those common mistakes. The staged online activities significantly reduced the time that academics spent explaining concepts. Instead, they could spend the saved time with students during weekly helpdesk sessions via a virtual classroom, focusing on students' questions rather than explaining basic concepts. This change in approach from face-to-face teaching to online learning provided lecturers with vital one-on-one time with students, focusing on content that needed more clarification. By saving academics' time, it was therefore also cost-effective.
Similarly, automated marking in online quizzes saved significant time for academics, particularly for the Excel-based assignments. Before the introduction of these online quizzes, students used Excel formulae for calculations and tutors used onerous answer sheets as a guide when grading. With the online Moodle quiz questions, students still have to practise using Excel for iterative types of questions, but markers no longer have to work through lengthy Excel sheets. The Moodle quizzes reduced marking time as the results were calculated instantaneously for numerical and short text-type answers. This approach allowed time to be redirected into thorough marking of reflective and descriptive questions. Tutors marked these questions and provided customised feedback, ensuring that students did not feel isolated or disenfranchised from the University.
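Automated marking of numerical answers typically accepts any response within a tolerance of the model answer, which is what removes the need to check students' intermediate Excel working. The sketch below is a hypothetical illustration of that idea, not the course's actual marking code:

```python
def mark_numerical(response, correct, tolerance=0.01):
    """Mark a numerical answer as correct if it falls within a relative
    tolerance (default 1%) of the model answer, mimicking automated quiz
    marking for calculation questions (e.g. iterative hydraulics solutions)."""
    return abs(response - correct) <= tolerance * abs(correct)

# e.g. a calculation question whose (hypothetical) model answer is 1.274
print(mark_numerical(1.27, 1.274))   # True (within 1%)
print(mark_numerical(1.40, 1.274))   # False
```

A relative tolerance is usually preferable to an absolute one for engineering calculations, since answers can span several orders of magnitude across questions.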
The online quizzes also supported academics in combating plagiarism. Under the previous marking practices, especially in large classes, Excel formulae could be copied without being detected by text-matching software. This issue was overcome by developing the online quizzes using Moodle's advanced features such as randomising questions, shuffling answers and checking logs. Therefore, online quizzes have reduced academics' workloads while creating an active and engaged learning environment for students.
This paper discusses the comprehensive ability of Moodle quizzes to transfer and assess engineering knowledge based on a study conducted in an undergraduate civil engineering course at the University of South Australia. The paper explains approaches used when preparing online materials to enhance student engagement and performance. The paper highlights how Moodle statistics can be used to measure the effectiveness of quiz questions to award students a fair grade by calculating FI and DI. It also discusses the benefits for academics when using features within the Moodle quizzes.
The study found that the Moodle quizzes assessed the competencies of students during the various stages of a study period through automated marking and easily extractable statistics. These carefully prepared Moodle quizzes catered to students with different levels of knowledge within the discipline. The study identified FI and DI as control indices from which students' educational performance can be inferred, by detecting whether the proposed questions are appropriate for the level of knowledge to be assessed, their degree of difficulty, and their degree of discrimination between different types of conceptual skills. The study showed that the FI and DI must be used together to obtain a reliable indication of which questions are beneficial in achieving ultimate course success. By investigating FI and DI, educators can make decisions about question selection based on their intended goal (e.g. whether they want to discriminate between students or to increase the pass rate without decreasing the quality of the course). If educators want to increase the pass rate without sacrificing the quality of the course, then questions with high FI (≥ 66%) and high DI (≥ 30%) should be selected. Questions that require modification and additional instruction can be identified quickly by this type of analysis.
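The selection rule above (keep questions with FI ≥ 66% and DI ≥ 30%, review the rest) can be expressed as a simple filter. The function below is a hypothetical helper, assuming per-question (name, FI, DI) tuples exported from the quiz statistics:

```python
def review_questions(stats, fi_min=66.0, di_min=30.0):
    """Split questions into those meeting the FI/DI thresholds (keep)
    and those flagged for revision. `stats` is a list of
    (question_name, facility_index, discrimination_index) tuples."""
    keep, revise = [], []
    for name, fi, di in stats:
        (keep if fi >= fi_min and di >= di_min else revise).append(name)
    return keep, revise

# Hypothetical export: Q1 is fine, Q2 barely discriminates, Q3 is too hard
stats = [("Q1", 82.0, 45.0), ("Q2", 70.0, 12.0), ("Q3", 40.0, 55.0)]
keep, revise = review_questions(stats)
print(keep)    # ['Q1']
print(revise)  # ['Q2', 'Q3']
```

Different thresholds can be passed in when the goal changes, for example raising `di_min` when the priority is discriminating between knowledge levels rather than increasing the pass rate.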
The quality of these quizzes was enhanced by using a variety of question formats with various multimodal instructions, followed by prompt feedback. The variety and combination of quiz questions increased student engagement and satisfaction as they catered to different knowledge levels of students. The variety of question formats also helped educators to balance the time required for constructing and grading the quiz (e.g. the balance between multiple choice and descriptive type of questions). This paper highlights ways to reduce grading time (e.g. not having to manually mark Excel-based assignments) and reallocate time for marking other assessments (e.g. reflective and descriptive assignments).
One limitation of this study is that the results are based on only one course, although with four cohorts. Analysing results from a few similar applied mathematics courses needs to be investigated to generalise the results. Of the various question types used in this study, most reflective and descriptive questions had a high DI, indicating the importance of including these types of questions in an applied mathematical course. All the other question types had a range of DI. However, the reasons why some questions in the same category had different DI values and the features of these questions were not investigated in this study. Further analysis of individual questions and their DI values would be a worthwhile research focus for future studies.
This study provides numerous methods that can be applied to develop effective and interactive online quizzes. The principles discussed in this paper are not limited to Moodle LMS and could be applied to other LMS systems which support online quizzes. In addition, some methods presented in this paper (e.g. how to calculate and interpret FI and DI), enable educators to use these methods in any online or offline quiz. Overall, the approach detailed in this paper is concise and flexible for online resource development. This paper will enhance mandated and non-mandated digital teaching approaches and will lead to optimising assessments to provide a dynamic and valued student learning experience.
The authors would like to thank Dr. Guna Hewa for her initial work on this study and Andrea Duff for her editorial work.
All authors contributed to the writing of the paper. SHPWG planned the research, conducted the statistical analysis and designed the online tools. JRA, MB and EJS conducted the content analysis and provided pedagogical data. Authors are listed in order of contribution. All authors read and approved the final manuscript.
This research received no specific grant from any funding agency.
Ethics approval and consent to participate
This research was approved by the Human Ethics Committee, University of South Australia, under reference number 201281.
Competing interests
The authors declare that they have no competing interests.
- Andre, L. (2019). 20 best LMS software solutions of 2019. FinancesOnline. https://financesonline.com/top-20-lms-software-solutions/. Accessed 9 June 2019.
- Aydin, C. C., & Tirkes, G. (2010). Open source learning management systems in distance learning. Turkish Online Journal of Educational Technology, 9(2), 175–184.
- Blanco, M., & Ginovart, M. (2010). Moodle quizzes for assessing statistical topics in engineering studies. Proceedings of the Joint International IGIP-SEFI Annual Conference, September 19–22, 2010. https://upcommons.upc.edu/bitstream/handle/2117/9992/blanco_ginovart_igip_sefi_2010.pdf. Accessed 8 June 2019.
- Broadbent, J., & Fuller-Tyszkiewicz, M. (2018). Profiles in self-regulated learning and their correlates for online and blended learning students. Educational Technology Research and Development, 66(6), 1435–1455. https://doi.org/10.1007/s11423-018-9595-9.
- Butcher, P. (2010). Quiz report statistics. Moodle. https://docs.moodle.org/dev/Quiz_report_statistics. Accessed 26 Feb 2019.
- eLearning Industry (2018). Top LMS software based on customer experience. https://elearningindustry.com/directory/software-categories/learning-management-systems/best/customer-experience. Accessed 9 June 2019.
- Geary, D. C. (2017). Acquisition of complex arithmetic skills and higher order mathematics concepts. London: Elsevier.
- Gómez-Soberón, J. M., Gómez-Soberón, M. C., Corral-Higuera, R., Arredondo-Rea, S. P., Almaral-Sánchez, J. L., & Cabrera-Covarrubias, F. G. (2013). Calibrating questionnaires by psychometric analysis to evaluate knowledge. SAGE Open, 3(3), 1–14. https://doi.org/10.1177/2158244013499159.
- Heo, M., & Chow, A. (2005). The impact of computer augmented online learning and assessment tool. Educational Technology & Society, 8(1), 113–125.
- Jackson, B. L. (2017). Higher education faculty utilization of online technological tools: A multilevel analysis. Journal of Educational Multimedia and Hypermedia, 26(3), 271–283.
- Krause, C., Krause, R., Krause, R., Gomez, N., Jafry, Z., & Dinh, V. A. (2017). Effectiveness of a 1-hour extended focused assessment with sonography in trauma session in the medical student surgery clerkship. Journal of Surgical Education, 74(6), 968–974. https://doi.org/10.1016/j.jsurg.2017.03.007.
- Maarek, J. (2018). Benefits of active learning embedded in online content material supporting a flipped classroom. Proceedings of the ASEE Annual Conference & Exposition, Salt Lake City. https://peer.asee.org/29845. Accessed 14 June 2019.
- Moodle Statistics. (2019a). Moodle. https://docs.moodle.org/dev/Quiz_report_statistics. Accessed 26 Feb 2019.
- Moodle Statistics. (2019b). Moodle. https://docs.moodle.org/dev/Quiz_statistics_calculations#Introduction. Accessed 21 July 2019.
- Philipsen, B., Tondeur, J., Roblin, N. P., Vanslambrouck, S., & Zhu, C. (2019). Improving teacher professional development for online and blended learning: A systematic meta-aggregative review. Educational Technology Research and Development, 1–30. https://doi.org/10.1007/s11423-019-09645-8.
- Salamon, H., Ali, N., Miskon, S., & Ahmad, N. (2016). Initial recommendations of MOOCs characteristics for academic discipline clusters. Journal of Theoretical and Applied Information Technology, 87(2), 204–213.
- Thevenot, C., Devidal, M., Barrouillet, P., & Fayol, M. (2007). Why does placing the question before an arithmetic word problem improve performance? A situation model account. Quarterly Journal of Experimental Psychology, 60(1), 43–56. https://doi.org/10.1080/17470210600587927.
- Ting, S. M. R. (2001). Predicting academic success of first-year engineering students from standardized test scores and psychosocial variables. International Journal of Engineering Education, 17(1), 75–80.
- Wallihan, R., Smith, K. G., Hormann, M. D., Donthi, R. R., Boland, K., & Mahan, J. D. (2018). Utility of intermittent online quizzes as an early warning for residents at risk of failing the pediatric board certification examination. BMC Medical Education, 18, 1. https://doi.org/10.1186/s12909-018-1366-0.
Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.