Designing Authentic Assessments: Engaging Business Students in Flow Experience with Digital Technologies
Embracing technology in higher education has become a necessity, rather than a desired value-added addition to educational delivery. Previously, innovative technological adoption was the result of academic dissatisfaction with the status quo (Ely, 1990), and/or the push from institutional leaders and the financial benefits that technology can bring through increased student volumes, enhanced reputation and competitive advantage (Price & Kirkwood, 2013). More recently, particularly with the COVID-19 pandemic, academics are being forced to engage with digital technologies and consider the transformational benefits that technology can potentially bring to the online classroom. More than ever, academics are faced with addressing the long-standing calls for innovative student-centric and project-based teaching practices rather than using technology to merely sustain, ‘replicate or supplement traditional activities’ (Price & Kirkwood, 2013, p. 8). To date, there is still a void in the uptake of digital innovations in education, which is arguably compounded by performance pressures, competing priorities, and financial and time constraints (Brimble, 2016; Harper et al., 2019). As a result, issues relating to poor curriculum design and academic integrity continue to exist.
The contention of this chapter is that innovative authentic digital assessment designs can contribute to relieving key pressure points such as last-minute assessment preparation, poor-quality assessment designs and submitted work, minimal opportunities for regular academic engagement and feedback, and the growing impact of plagiarism and contract cheating in higher education (Bretag et al., 2016; Harper et al., 2019). It is argued that contract cheating can be minimised by providing students with a learning environment that motivates them not to cheat, through personalised and sequential assessment designs that encourage them to realise the intrinsic (skills) and extrinsic (work-ready) motivation that engaging in the assessment will provide (Bretag et al., 2016; Harper et al., 2019; Walker & Townley, 2012). Most importantly, digital innovations allow for more ‘individualisation of learning’ and enhance the development of twenty-first century skills of ‘independent learning, initiative, communication, teamwork, adaptability, collaboration, networking, and thinking skills within a particular professional or subject domain’ (Bates & Sangrà, 2011, p. xxi). Individual learning can be better managed through coaching and scaffolding, with assessment appearing to be seamlessly integrated with the learning activity (Herrington & Standen, 1999; Lameras et al., 2017). If the innovative environment is engaging enough, success can be measured by the extent to which the immersive experience is described by students in terms of their flow experience, that is, the ‘holistic sensation that people feel when they act with total involvement’ and lose a sense of time (Csikszentmihalyi, 1990, p. 477). The flow experience occurs when learners experience cognitive efficiency, are intrinsically motivated and happy (Csikszentmihalyi, 1975).
The broad research question investigated in this chapter is the extent to which digital assessment design features can reduce the cognitive load burden of students and accomplish flow experience.
In the sections that follow, contributions to the authentic assessment literature, underpinned by cognitive load and flow theories, are provided. This is followed by the design and pilot testing of a digital, scaffolded assessment tool intended to provide an immersive learning environment and reduce the cognitive burden of higher education business students. Survey data is used to determine the topics students find most challenging. The survey results guide the direction taken in the report-writing journey, whereby students are required to propose a ‘big idea’ which is linked to improved performance and includes the achievement of the United Nations Sustainable Development Goals (UN, 2018). In the design sections, we explain the educator’s role in this technology-based environment, including the ability to provide individualised, real-time feedback to students. The platform designs ensure that academics and students engage in regular conversations through a system that is dynamic and adaptable to new queries, topics and assessment formats (including written, graphic and numerical). The system can be used to motivate students across different educational settings and time zones. In the findings section, we discuss the analysis of the secondary data from undergraduate business students undertaking a large core business course and use this data to evaluate the impact of the digital platform on student flow experience. We conclude the chapter with a discussion of limitations and insights for further research in this area.
Innovative Scaffolded Digital Designs to Achieve Flow Experience: Cognitive Load Theory
According to cognitive load theory, a scaffolded approach to learning results in cognitive efficiency and reduces the cognitive load burden (Sweller, 1988). These factors are particularly important for students facing challenging assessment topics when they could easily give up. As such, the digital environment provides a perfect setting for scaffolding learning and examining flow experience in education (Annetta, 2010; Giasiranis & Sofos, 2017; Shin, 2006), including distance education (Liao, 2006; Pearce, Ainley & Howard, 2005). The ability for students to comprehend the individual schema in scaffolded designs is an important part of pedagogical designs (Sweller, Van Merriënboer & Paas, 1998). These important design features provide educators with more informed learning analytics. They also contribute to the observable flow experience associated with cognitive efficiency (Annetta, 2010).
Cognitive load theory comprises three parts: intrinsic, extraneous and germane cognitive loads. Intrinsic cognitive load is the inherent difficulty level of the specific topic or the complexity that emerges from dealing with a number of elements that must be processed at the same time in a learner’s working memory (Gerjets & Scheiter, 2003). Importantly, the inherent difficulty level of a specific topic cannot be changed (e.g., higher education mathematics compared with primary school mathematics). As such, the ability of the learner to break down the components into manageable schema depends on the topic and the learner’s expertise (Sweller et al., 1998). Educational designs can support learning by breaking down the complex topic into schema or subtopics before combining them back together for final, holistic understanding (Sweller et al., 1998).
Extraneous cognitive load relates to the knowledge seeking of individual learners due to ineffective instructional techniques. Learners are required to tap into their cognitive resources or working memory for additional information to support learning (Sweller, 1994). Extraneous cognitive load is brought into play when learners are required to expend their cognitive resources by searching for information (i.e., the internet, other resources, guides and instructions) that is needed to complete a learning task (Paas, Renkl & Sweller, 2003). This may fail if learners have limited cognitive resources to utilise. It also means that the more extraneous cognitive resources are utilised, the fewer cognitive resources are available for schema construction or automation. Hence, learning the topic becomes more difficult, and intrinsic load schema construction less possible. Nevertheless, if learning materials are suitably designed, extraneous cognitive load is reduced, and more resources can subsequently be allocated to processing the intrinsic cognitive load. Students may feel baffled if the schema has not been constructed and/or made available for them to access (Sweller, 1994).
However, when sufficient working memory resources remain after the intrinsic and extraneous cognitive load processing, learners may expend additional effort in value-added processes that are related to learning, such as schema construction (knowledge formation). This is referred to as the germane cognitive load. Germane cognitive load is the desired or effective cognitive load, the result of beneficial cognitive processes such as abstractions and elaborations that are promoted by the instructional presentation (Gerjets & Scheiter, 2003). These processes also increase cognitive load, but it is only germane cognitive load that will contribute to, instead of interfering with, learning (Sweller et al., 1998). Germane load is described as the mental resource learners use to learn and conceptualise ideas (schemata). Promoting germane load may enhance learning performance.
Cognitive feedback is facilitated by digital technologies and plays an important role in capturing learners’ attention and focusing it on the essential schema (Ketamo & Kiili, 2010). Instructional designers can support the working memory of learners by reducing extraneous cognitive load and wasteful effort. This leaves learners with the capacity to invest their own resources in constructing mental maps, or other advanced cognitive processing techniques associated with germane cognitive load (Gerjets & Scheiter, 2003). Debue and Leemput (2014) confirm that when the extraneous load is reduced, such as through animation and pictures, the germane load increases and learner performance improves. Early feedback has also been found to enable reflexive development and validation of mental models, along with the effective formation of new, pedagogically informed strategies (Ketamo & Kiili, 2010; Bellotti et al., 2011).
The flow experience is characterised by eight elements (Annetta, 2010):
1. Feeling that the activity can be successfully completed,
2. The player can concentrate fully on the activity,
3. The activity has clear goals,
4. The activity provides fast feedback,
5. The player is deeply involved in the activity,
6. The player has a sense of control over the actions necessary to perform the activity,
7. Self-awareness disappears during flow, and
8. There is an altered sense of time.
Flow experience in digital designs has been studied, for example, in immersive technologies such as virtual reality (Giasiranis & Sofos, 2017) and distance learning (Liao, 2006; Shin, 2006). Liao (2006) found positive relationships between learner–instructor and learner–interface interactions and flow experience. Shin (2006) similarly examined the flow effect in an online virtual course and found that student perceptions of their levels of skills and challenges are critical to determining the level of flow. These authors confirm that the teacher’s role and the designed learning pedagogy are important in reducing the cognitive burden of students, thus contributing to improved flow effects. Shin (2006) also found a relationship between flow and student satisfaction scores. Together these findings provide the impetus for this study to examine whether our individual authentic assessment design can similarly achieve flow experience for our business accounting students. Furthermore, Pearce et al.’s (2005) flow process model helps in understanding the alternative paths students take when dealing with challenging concepts that require specific skills. We address their calls for examining the interactions required so students can navigate the ‘challenge-skill space’ on the way towards the flow experience.
The built ‘assessment’ artefact is designed according to the cognitive load and flow theories with attention to immersion, interactivity, increasing complexity, informed teaching and instructional design. These elements are achieved by attention to the assessment storyline, developed as a result of a short survey of students about the course topics they found cognitively challenging. Python coding contributed to scaffolding the assessment design.
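The staged release and locking mechanics that underpin this scaffolding can be sketched in a few lines of Python. The following is a minimal, hypothetical illustration (the class, stage names and dates are ours, not the production system's code): each assessment stage opens on a release date and locks after its deadline, so students see and edit questions only in the designated window.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Stage:
    """One scaffolded stage of the staged assessment (illustrative only)."""
    name: str
    release: datetime   # questions hidden before this date
    deadline: datetime  # inputs frozen after this date

    def status(self, now: datetime) -> str:
        if now < self.release:
            return "hidden"   # blocked from view until the designated time
        if now <= self.deadline:
            return "open"     # student may enter and edit responses
        return "locked"       # frozen pending staff feedback/intervention

# Hypothetical stages mirroring the report-writing journey
stages = [
    Stage("Big idea pitch", datetime(2021, 3, 1), datetime(2021, 3, 14)),
    Stage("SDG performance measures", datetime(2021, 3, 15), datetime(2021, 4, 4)),
    Stage("CVP and ratio analysis", datetime(2021, 4, 5), datetime(2021, 4, 25)),
]

now = datetime(2021, 3, 20)
print([(s.name, s.status(now)) for s in stages])
# → [('Big idea pitch', 'locked'), ('SDG performance measures', 'open'),
#    ('CVP and ratio analysis', 'hidden')]
```

Gating visibility this way is what prevents students from anticipating later questions or retrofitting an earlier idea once its stage has closed.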
Survey to Inform Storyline Design
Organisational Strategy and Management Accounting Control Systems;
Organisational Structure and Responsibility Centre;
Budgeting and the Strategic Management of Costs and Revenues;
Behavioural Approaches to Budgeting;
Static and Flexible Budgets;
Budgeting and Market Responses;
Performance Measurement and Reward Systems;
Non-Financial Performance-Strategy Maps and the Balanced Scorecard;
From 858 students, we received 283 responses, representing a 33% response rate. Results indicated that students found the qualitative topics of the balanced scorecard and non-financial performance evaluation (topic 9) and risk management (topic 10) challenging, along with the quantitatively challenging topic of transfer pricing (topic 3). We used this final-year student data to redesign the first-year course, to ensure our immersive digital designs would contribute to an overall scaffolded approach to learning. By breaking down the complexity of the balanced scorecard to a practical ‘immersive’ experience, we were hoping that by the time our students faced third year, they would find this topic less cognitively challenging. We also added quantitative items in the digital design that would later build to the more complex transfer pricing topics experienced in their third-year courses.
The tailored storyline is designed to promote authentic assessment and foster individual creativity by engaging business students in developing their own business plan for an idea that would not only improve business performance but also address the broader impacts on the United Nations Sustainable Development Goals (SDGs). Students were required to select their own avatar (or perspective) that they take when they pitch their business case to generate interest and potential funding to help realise the idea in practice. They might be an owner of a company, marketing manager, supply chain manager, CEO, CFO, etc., giving them more autonomy in the project itself and contributing to the flow experience and level of ‘immersion’ in the topic. By encouraging them to be innovative with their idea, this also contributes to achieving flow experience as measured by Annetta (2010).
The extent of interactivity was limited to the interactions between the student and their teacher at each designated stage of the project. The system administrator has the capacity to intervene, provide more instructions, change questions, set deadlines as well as hide and lock cells. In this example, the idea section was ‘locked’ after a designated deadline to ensure the rest of the business report remains unique to the idea, and students cannot deviate from their original plan. This was decided as we wanted students to engage with and ‘own’ their own idea from the beginning to end. Locking the template ensured students could not adjust their idea to make it ‘easier’ for themselves or provide an easier avenue for plagiarism or contract cheating. We considered the major project as an exam equivalent, hence posed these stricter conditions.
Increasing complexity of the project was an important part of the design with the complexity increasing as the students learned the topics in class. They were required to begin to pitch their idea in terms of explaining how it would contribute to society (economically, environmentally and socially). Then they were required to use the SDGs to determine how performance would be measured and evaluated. Next, they had to consider the costs associated with bringing the idea to fruition and the template was coded to use randomly generated numbers for individual students to calculate cost-volume-profit (CVP) and financial performance in terms of ratio analysis. Because of the student-entered qualitative data and the randomly assigned input information, requiring different calculations, every assessment piece was unique, requiring individualised written interpretations. Students are unable to anticipate questions, as these are blocked from view until the designated time. We did not allow students to change their previous inputs without staff feedback and intervention. The entire course content is matched with topic content, so learning and assessment can be managed in staged developments. Informed teaching and regular feedback are designed to contribute to the immersive learning experience.
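The randomly generated, per-student inputs can be sketched as follows. This is a hedged illustration rather than the production template's code (the function names and value ranges are our assumptions): seeding a generator with the student identifier gives each student unique but reproducible numbers for the CVP calculation, where break-even units equal fixed costs divided by the contribution margin per unit.

```python
import random

def cvp_inputs(student_id: str):
    """Hypothetical per-student CVP inputs, reproducible for marking."""
    rng = random.Random(student_id)          # deterministic per student ID
    price = rng.randrange(20, 51)            # selling price per unit ($)
    variable_cost = rng.randrange(5, price)  # variable cost per unit ($)
    fixed_costs = rng.randrange(10_000, 50_001, 500)  # total fixed costs ($)
    return price, variable_cost, fixed_costs

def break_even_units(price, variable_cost, fixed_costs):
    # Break-even point = fixed costs / contribution margin per unit
    return fixed_costs / (price - variable_cost)

p, v, f = cvp_inputs("s1234567")
print(f"price={p}, variable={v}, fixed={f}, "
      f"break-even={break_even_units(p, v, f):.1f} units")
```

Because the inputs differ per student, the resulting calculations and written interpretations are necessarily individual, which is what makes each submitted assessment piece unique.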
Digital Artefact Design
This greater flexibility was designed to support work–life balance and other challenges students face getting their work done in a timely manner. The dynamic interface embeds instructions for students. As soon as similar queries emerge through email queries, this can be answered by the administrator (course coordinator) entering direct edits in the interface to ensure all students are clear about the instructions and goals to be achieved. This dynamic feedback and adjustment are essential in large course delivery and also contribute to the well-being of the teaching staff, who do not have to repeat instructions over and over. Even if students are provided updated details on the Learning Management System (LMS), our experience is that students tend not to search for clarification updates on the LMS but go directly to their instructors. By updating the system, the students avoid the cognitive extraneous effort or ‘search’ for additional information to support and confirm understanding.
Figure 3.6 shows how the feedback is displayed to students. In the WritePal system, the student will be able to review instant feedback for each question and the score for the question. Teachers have the option to set a designated feedback release date or result release date. The multiple deadlines are clearly shown on the right-hand side with upcoming deadlines shown in red as a reminder. Recall that students cannot change their previous inputs without staff feedback and intervention, unlike the traditional assessment which could be easily sent to a contract writer to complete.
Figure 3.7 provides an overview of the teacher’s assignment management portal. In this portal, teachers can add courses, semesters and seminars; set up questions; add users; give extensions; and review students’ responses. Figure 3.8 displays how a teacher reviews, provides feedback and checks the marking memo. On the left side of the figure is the student response. The system is set for teachers to select a question type. For example, refer to Q9 (Fig. 3.7). This question requires students to upload an image of their company value chain. In the middle of the figure are the teacher comments and scores. On the right of the figure is a marking guideline memo providing instructions to staff. Figure 3.9 displays the interface for markers to review how many papers/questions are unfinished. Given the assignment is staged, it is important to show how many questions have been answered and how many remain ungraded so teaching staff have clear instructions. As indicated in Fig. 3.8, the interface clearly displays student information and the marked and unmarked questions.
The system also offers both teachers and students opportunities to generate and download the full report by clicking the ‘report’ button when necessary. For students, this button is made visible once the report is completed.
Data Collection and Analysis
Data was collected in accordance with RMIT ethics guidelines. Survey data was collected from final-year students on the topics they found challenging. This initial data was used to inform the digital artefact design. We then pilot tested the digital artefact in our first-year accounting course, which comprises students undertaking business degrees across a number of major programs including accounting, finance, economics, law, management, marketing, supply chain and logistics. This is a large course with enrolments ranging from 1200 to 1900 per semester.
Secondary data from the course experience survey and comments from teachers are used to evaluate the flow experience of our students undertaking assessment through the digital artefact. We used Annetta’s (2010) eight elements of flow for immersive digital designs to analyse the data. The standard questions on the course experience survey ask students to comment on what is best about the course as well as what they think should be improved. Because the digital artefact was being used as a formal assessment piece, we did not want the potential for biased feedback; hence the decision to evaluate what emerged ‘unsolicited’ from the students at the end of the semester. We did not ask any specific or additional questions about this assessment piece.
We also used data in relation to the number of student assignment extension requests, along with direct evidence from the teacher interface, to determine whether students believe the task can be successfully completed. The other items are gathered from themed analysis of the qualitative responses and evidence presented through the unsolicited student feedback.
We received positive feedback from both students and staff. We had 979 students and 13 educators teaching in the course. Throughout the semester and the staged use of the digital artefact, both staff and students agreed that the interface is easy to use. Students engaged with the template and enjoyed the continuous feedback and the ability to adjust their responses and build on them based on weekly topic content. Of the 979 students, we received qualitative comments from 194 students (a 20% response rate) for the question ‘what is the best part of this course’. Of the responses, 16% explicitly mentioned that the digital assessment was what they enjoyed the most, and 28% indicated that they really liked the staged assignment. The course received the highest overall satisfaction rating and good teaching scores on record for a common core course. We believe the following discussion provides evidence of the flow experience of our large student cohort. While we cannot give a definitive measure of flow experience for every student, the following qualitative evidence helps to support that flow was achieved.
I found the assignment very approachable (individual business report), it gave a step by step guideline of what it is expected and what is needed giving a feeling that I am being provided assistance along the way. I did not find any difficulty doing the assignment, which makes me feel motivated in this course. (student CES response)
I love the assessment layout for the report. Easy to use, easy to meet deadlines, not too much to stress over and it encourages me to get it all done early rather than procrastinating and leaving doing the whole report till the last minute and stressing out the day before it’s due. I wish every course could adopt this style. Unlike all my other courses, I’ve never had a panic attack when working on this assignment, especially after procrastinating, thank you so much. (student CES response)
…Courage to try new ideas in order to help and improve student engagement. The new system used to incrementally complete our individual business report is carefully designed for students in mind, and I believe it is much better than a standard assignment of completing everything by a due date. It is also fairly user friendly. (student CES response)
In evidencing element (4), ‘the activity provides fast feedback’, at this stage of the project design we can only provide evidence of teacher interaction. If the digital elements are further enhanced to provide automated feedback to students (for example, through algorithms, bots and AI), we can address this area further. At this stage, we are working with a simplified digital design. During one stage of the question release, we received two emails simultaneously from students asking the same question. We immediately responded to the students directly and updated the instructions in the digital template, and this stopped all further questions in relation to clarifying the goals of that specific activity.
It is kind of interesting. I like assignment 2 because it allows for freedom and creativity (student CES response) and,
The assignment where students are able to use their interests in making a business plan, which helps with motivation and connection to the content (student CES response).
I really liked how the report was a staged submission. It allowed me to focus on a part at a time and do my best work. This also ensured I didn’t leave things to the last minute. (student CES response)
…, the individual assignment was a good idea to have going through the semester, with learning content so you can practise the stuff you learn while moving through. (student CES response)
It’s great that the Business Plan assignment was done in stages as it made it less stressful to complete. (student CES response)
The business report was the best aspect because it allowed us to sequentially submit parts of our assignment. This meant that we weren’t stressed about completing the whole assignment by the due date, but rather focus on certain aspects and spend time on each part. I was really motivated to do this business report and it’s the first time I enjoyed doing a report. (student CES response)
…Digital learning and assessments particularly the online system that was developed for the business report is making students in-control as they feel self-confident and independent… (academic peer evaluation response)
The final elements of flow—(7) self-awareness disappears during flow and (8) there is an altered sense of time—are difficult to evidence without directly asking or observing the students. The data visualisations from all positive and negative CES responses indicate that the most dominant word is ‘engaging’, suggesting that overall the students were satisfied with their course experience. While the other dominant words, ‘structured’ and ‘deadlines’, which were clearly aligned with the digital artefact, could also be viewed as negative, a more detailed analysis of the comments around these words indicates that the students enjoyed the formal way they were navigated through the system. Nevertheless, this also suggests that the scaffolded, layered deadlines evident in the course design are potentially a trade-off against the latter two flow elements.
The game aspect was really engaging. The teaching team was extremely helpful and friendly. The course content itself was not something I thought I would enjoy however it was super interesting and taught me skills and concepts that seem useful for my future as a possible employee or employer. (student CES response)
The findings contributed to confirming that the implementation of a scaffolded, staged approach not only contributed to the flow experience of students but also reduced their cognitive load burden.
Discussion, Conclusion and Limitations
The digital artefact was a relatively successful pilot experiment built on the cognitive load and flow experience literature. We consider success in terms of meeting the flow experience criteria designated by the psychology literature dealing with immersive technologies (Annetta, 2010; Pearce et al., 2005) and contribute to the emerging but minimal literature in this area (Bitrián et al., 2020). While we were able to describe many of the eight flow criteria, some of the elements were harder to directly evidence, thus requiring further exploration in future research initiatives. We also relied on secondary data and unprompted qualitative responses to measure student flow experience. Recognising this as a potential limitation of the study, further evidence, through targeted surveys and interviews, is recommended.
To date, the system appears to minimise plagiarism and contract cheating problems identified by Bretag et al. (2019). This is arguably due to the personalised, unique and progressively released questions that build on the previous inputs. While we cannot be definitive, the ability for contract cheating is harder when assessments are staged and not all questions are made available at the outset. The most important aspect of the system is that it can continually be adapted to new queries, topics and new assessment formats (written, graphics, numerical, etc.). This means the system is transferable across semesters, courses and educational disciplines.
The flow experience is an important part of understanding the degree of engagement and immersion with the digital artefact. A benefit of this dynamic digital artefact is that other aspects, such as identity and interactivity, can be developed further and tested in new iterations of this digital assessment design. The elements of increasing complexity are important in addressing the ability of the digital design to address the cognitive loads of students when they are being introduced to challenging topics (Sweller, 1988). Likewise, the pedagogical design is in accordance with informed teaching, whereby the teacher can play an active role in engaging in the student journey. The findings indicated that this was made possible. We focused more on the student experience, and further research would provide more insights, particularly from the teacher perspective. The instructional design was evident in the student responses, which directly links to the formality of the scaffolded approach identified as important in the cognitive load theory literature. However, this was not exploited as much as the emerging flow literature in digital pedagogy would expect. Further iterations of the digital artefact could also engage with more gamified elements such as rewards, badges and leaderboards. We also recommend further longitudinal research that follows the first-year students through to their third year, to determine whether this first-year assessment experience contributed to long-term germane cognitive benefit. We could also give the same digital assignment to different student groups, test different design features, such as staged deadlines, and address the cognitive load differences between the experimental groups.
In conclusion, we consider that the digital artefact has contributed to relieving key pressure points for both academics and students including last-minute assessment preparation, poor-quality assessment designs and submitted work, plagiarism and contract cheating, minimal opportunities for regular academic engagement, and feedback and overall well-being concerns.
- Annetta, L. A. (2010). The “I’s” have it: A framework for serious educational game design. Review of General Psychology, 14(2), 105–112.Google Scholar
- Bates, A., & Sangrà, A. (2011). Managing technology in higher education: Strategies for transforming teaching and learning. San Francisco: Jossey-Bass/John Wiley & Co.Google Scholar
- Bellotti, F., Ott, M., Arnab, S., Berta, R., de Freitas, S., Kiili, K., … De Gloria, A. (2011). Designing serious games for education: from pedagogical principles to game mechanisms. In The 5th European Conference on Games Based Learning (pp. 26–34). Greece: University of Athens.Google Scholar
- Bitrián, P., Buil, I., & Catalán, S. (2020). Flow and business simulation games: A typology of students. The International Journal of Management Education, 18(1), 100–365.Google Scholar
- Bretag, T., Mahmud, S., Wallace, M., Walker, R., James, C., Green, M., et al. (2016). Core elements of exemplar academic integrity policy in Australian higher education. International Journal for Educational Integrity, 7(2), 3–12.
- Bretag, T., Harper, R., Burton, M., Ellis, C., Newton, P., van Haeringen, K., et al. (2019). Contract cheating and assessment design: Exploring the relationship. Assessment & Evaluation in Higher Education, 44(5), 676–691.
- Brimble, M. (2016). Why students cheat: An exploration of the motivators of student academic dishonesty in higher education. In Handbook of Academic Integrity (pp. 1–14).
- Csikszentmihalyi, M. (1975). Beyond boredom and anxiety. San Francisco: Jossey-Bass.
- Csikszentmihalyi, M. (1990). Flow: The psychology of optimal experience. New York: Harper & Row.
- Debue, N., & van de Leemput, C. (2014). What does germane load mean? An empirical contribution to the cognitive load theory. Frontiers in Psychology, 5.
- Ely, D. P. (1990). Conditions that facilitate the implementation of educational technology innovations. Journal of Research on Computing in Education, 23(2), 298–305.
- Gerjets, P., & Scheiter, K. (2003). Goal configurations and processing strategies as moderators between instructional design and cognitive load: Evidence from hypertext-based instruction. Educational Psychologist, 38(1), 33–41.
- Giasiranis, S., & Sofos, L. (2017). Flow experience and educational effectiveness of teaching informatics using AR. Educational Technology & Society, 20(4), 78–88.
- Harper, R., Bretag, T., Ellis, C., Newton, P., Rozenberg, P., Saddiqui, S., et al. (2019). Contract cheating: A survey of Australian university staff. Studies in Higher Education, 44(11), 1857–1873.
- Herrington, J., & Standen, P. (1999). Moving from an instructivist to a constructivist multimedia learning environment. In B. Collis & R. Oliver (Eds.), ED-MEDIA 1999–World Conference on Educational Multimedia, Hypermedia & Telecommunications (pp. 132–137). Seattle, WA: Association for the Advancement of Computing in Education (AACE).
- Jackson, S. A., & Marsh, H. (1996). Development and validation of a scale to measure optimal experience: The flow state scale. Journal of Sport & Exercise Psychology, 18(1), 17–35.
- Ketamo, H., & Kiili, K. (2010). Conceptual change takes time: Game-based learning cannot be only supplementary amusement. Journal of Educational Multimedia and Hypermedia, 19(4), 399–419.
- Lameras, P., Arnab, S., Dunwell, I., Stewart, C., Clarke, S., & Petridis, P. (2017). Essential features of serious game design in higher education: Linking learning attributes to game mechanics. British Journal of Educational Technology, 48(4), 972–994.
- Liao, L. F. (2006). A flow theory perspective on learner motivation and behavior in distance education. Distance Education, 27, 45–62.
- Paas, F., Renkl, A., & Sweller, J. (2003). Cognitive load theory and instructional design: Recent developments. Educational Psychologist, 38(1), 1–4.
- Pearce, J. M., Ainley, M., & Howard, S. (2005). The ebb and flow of online learning. Computers in Human Behavior, 21(5), 745–771.
- Price, L., & Kirkwood, A. (2013). Using technology for teaching and learning in higher education: A critical review of the role of evidence in informing practice. Higher Education Research and Development, 33(3), 549–564.
- Shin, N. (2006). Online learner’s “flow” experience: An empirical study. British Journal of Educational Technology, 37(5), 705–720.
- Sweller, J. (1988). Cognitive load during problem solving: Effects on learning. Cognitive Science, 12(2), 257–285.
- Sweller, J. (1994). Cognitive load theory, learning difficulty and instructional design. Learning and Instruction, 4(4), 295–312.
- Sweller, J., van Merriënboer, J. J. G., & Paas, F. G. W. C. (1998). Cognitive architecture and instructional design. Educational Psychology Review, 10(3), 251–296.
- United Nations (UN). (2018). Sustainable Development Goals. United Nations Department of Public Information. https://sustainabledevelopment.un.org/?menu=1300.
- van Schaik, P., Martin, S., & Vallance, M. (2012). Measuring flow experience in an immersive virtual environment for collaborative learning. Journal of Computer Assisted Learning, 28(4), 350–365.
- Walker, M., & Townley, C. (2012). Contract cheating: A new challenge for academic honesty. Journal of Academic Ethics, 10(1), 27–44.