Designing Authentic Assessments: Engaging Business Students in Flow Experience with Digital Technologies

  • Viktor Arity
  • Gillian Vesty
Chapter

Abstract

Embracing technology in higher education has become a necessity, rather than a desired value-added addition to educational delivery.

Introduction

Embracing technology in higher education has become a necessity, rather than a desired value-added addition to educational delivery. Previously, innovative technological adoption was the result of academic dissatisfaction with the status quo (Ely, 1990), and/or the push from institutional leaders and the financial benefits that technology can bring through increased student volumes, enhanced reputation and competitive advantage (Price & Kirkwood, 2013). More recently, particularly with the COVID-19 pandemic, academics are being forced to engage with digital technologies and consider the transformational benefits that technology can potentially bring to the online classroom. More than ever, academics are faced with addressing the long-standing calls for innovative student-centric and project-based teaching practices rather than using technology to merely sustain, ‘replicate or supplement traditional activities’ (Price & Kirkwood, 2013, p. 8). To date, there is still a void in the uptake of digital innovations in education, which is arguably compounded by performance pressures, competing priorities, and financial and time constraints (Brimble, 2016; Harper et al., 2019). As a result, issues relating to poor curriculum design and academic integrity continue to exist.

The contention of this chapter is that innovative authentic digital assessment designs can help relieve key pressure points such as last-minute assessment preparation, poor-quality assessment designs and submitted work, and minimal opportunities for regular academic engagement and feedback, while also minimising the growing impact of plagiarism and contract cheating in higher education (Bretag et al., 2016; Harper et al., 2019). It is argued that contract cheating can be minimised by providing students with a learning environment that motivates them not to cheat, through personalised and sequential assessment designs that encourage them to realise the intrinsic (skills) and extrinsic (work ready) motivation that engaging in the assessment will provide (Bretag et al., 2016; Harper et al., 2019; Walker & Townley, 2012). Most importantly, digital innovations allow for more ‘individualisation of learning’ and enhance the development of twenty-first century skills of ‘independent learning, initiative, communication, teamwork, adaptability, collaboration, networking, and thinking skills within a particular professional or subject domain’ (Bates & Sangrà, 2011, p. xxi). Individual learning can be better managed through coaching and scaffolding, with assessment appearing to be seamlessly integrated with the learning activity (Herrington & Standen, 1999; Lameras et al., 2017). If the innovative environment is engaging enough, success can be measured by the extent to which the immersive experience is described by students in terms of their flow experience, that is, the ‘holistic sensation that people feel when they act with total involvement’ and lose a sense of time (Csikszentmihalyi, 1990, p. 477). The flow experience occurs when learners experience cognitive efficiency, are intrinsically motivated and happy (Csikszentmihalyi, 1975).

The broad research question investigated in this chapter is the extent to which digital assessment design features can reduce the cognitive load burden of students and accomplish flow experience.

In the sections that follow, contributions to the authentic assessment literature, underpinned by cognitive load and flow theories, are outlined. This is followed by the design and pilot testing of a digital, scaffolded assessment tool intended to provide an immersive learning environment and reduce the cognitive burden of higher education business students. Survey data is used to determine the topics students find most challenging. The survey results guide the direction taken in the report-writing journey, whereby students are required to propose a ‘big idea’ which is linked to improved performance and includes the achievement of United Nations Sustainable Development Goals (UN, 2018). In the design sections, we explain the educator’s role in this technology-based learning environment, including the ability to provide individualised, real-time feedback to students. The platform designs ensure that academics and students engage in regular conversations through a system that is dynamic and adaptable to new queries, topics and assessment formats (including written, graphic and numerical). The system can be used to motivate students across different educational settings and time zones. In the findings section, we discuss the analysis of the secondary data from undergraduate business students undertaking a large core business course and use this data to evaluate the impact of the digital platform on student flow experience. We conclude the chapter with a discussion of limitations and insights for further research in this area.

Innovative Scaffolded Digital Designs to Achieve Flow Experience: Cognitive Load Theory

According to cognitive load theory, a scaffolded approach to learning results in cognitive efficiency and reduces the cognitive load burden (Sweller, 1988). These factors are particularly important for students facing challenging assessment topics, when they could easily give up. As such, the digital environment provides a perfect setting for scaffolding learning and examining flow experience in education (Annetta, 2010; Giasiranis & Sofos, 2017; Shin, 2006), including distance education (Liao, 2006; Pearce, Ainley & Howard, 2005). The ability of students to comprehend individual schemas in scaffolded designs is an important part of pedagogical design (Sweller, Van Merriënboer & Paas, 1998). These design features provide educators with more informed learning analytics. They also contribute to the observable flow experience associated with cognitive efficiency (Annetta, 2010).

Cognitive load theory comprises three parts: intrinsic, extraneous and germane cognitive loads. Intrinsic cognitive load is the inherent difficulty level of the specific topic, or the complexity that emerges from dealing with a number of elements that must be processed at the same time in a learner’s working memory (Gerjets & Scheiter, 2003). Importantly, the inherent difficulty level of a specific topic cannot be changed (e.g., higher education mathematics compared with primary school mathematics). As such, the ability of the learner to break down the components into manageable schemas depends on the topic and the learner’s expertise (Sweller et al., 1998). Educational designs can support learning by breaking down the complex topic into schemas or subtopics before combining them back together for a final, holistic understanding (Sweller et al., 1998).

Extraneous cognitive load relates to the knowledge seeking of individual learners due to ineffective instructional techniques. Learners are required to tap into their cognitive resources or working memory for additional information to support learning (Sweller, 1994). Extraneous cognitive load is brought into play when learners are required to expend their cognitive resources by searching for information (i.e., the internet, other resources, guides and instructions) that is needed to complete a learning task (Paas, Renkl & Sweller, 2003). This may fail if learners have limited cognitive resources to utilise. It also means that the more extraneous cognitive resources are utilised, the fewer cognitive resources are available for schema construction or automation. Hence, learning the topic becomes more difficult, and intrinsic load schema construction less possible. Nevertheless, if learning materials are suitably designed, extraneous cognitive load is reduced, and more resources can subsequently be allocated to processing the intrinsic cognitive load. Students may feel baffled if the schema has not been constructed and/or is not available for them to access (Sweller, 1994).

However, when sufficient working memory resources remain after the intrinsic and extraneous cognitive load processing, learners may expend additional effort in value-added processes related to learning, such as schema construction (knowledge formation). This is referred to as germane cognitive load. Germane cognitive load is the desired or effective cognitive load, the result of beneficial cognitive processes such as abstractions and elaborations that are promoted by the instructional presentation (Gerjets & Scheiter, 2003). These processes also increase cognitive load, but it is only germane cognitive load that will contribute to, instead of interfering with, learning (Sweller et al., 1998). Germane load is described as the mental resource learners use to learn and conceptualise ideas (schemata). Promoting germane load may enhance learning performance.

Cognitive feedback is facilitated by digital technologies and plays an important role in capturing learners’ attention and focusing it on the essential schema (Ketamo & Kiili, 2010). Instructional designers can support the working memory of learners by reducing extraneous cognitive load and wasteful effort. This leaves learners with the capacity to invest their own resources in, for example, constructing mental maps or other advanced cognitive processing techniques associated with germane cognitive load (Gerjets & Scheiter, 2003). Debue and van de Leemput (2014) confirm that when extraneous load is reduced, such as through animation and pictures, germane load increases and learner performance improves. Early feedback has also been found to enable reflexive development and validation of mental models, along with the effective formation of new pedagogically informed strategies (Ketamo & Kiili, 2010; Bellotti et al., 2011).

As learners become immersed in the project, their active participation transfers to passive participation as they become completely absorbed in the task at hand. The flow experience concept has been measured (Csikszentmihalyi, 1975, 1990) and used to test immersive learning experiences (Annetta, 2010; Bitrián, Buil & Catalan, 2020; Giasiranis & Sofos, 2017; Jackson & Marsh, 1996; Liao, 2006; Pearce et al., 2005; Shin, 2006; van Schaik, Martin & Vallance, 2012). Drawing on the earlier flow theories (Csikszentmihalyi, 1975; Jackson & Marsh, 1996) and adapting them to digital immersive learning designs, Annetta (2010, p. 107) defines flow as
  1. Feeling the activity can be successfully completed,
  2. The player can concentrate fully on the activity,
  3. The activity has clear goals,
  4. The activity provides fast feedback,
  5. The player is deeply involved in the activity,
  6. A sense of control over the actions is necessary to perform the activity,
  7. Self-awareness disappears during flow, and
  8. There is an altered sense of time.
Flow experience in digital designs has been studied, for example, in immersive technologies such as virtual reality (Giasiranis & Sofos, 2017) and distance learning (Liao, 2006; Shin, 2006). Liao (2006) found positive relationships between the learner, the instructor and the digital interface in flow experience. Shin (2006) similarly examined the flow effect in an online virtual course and found that student perceptions of levels of skills and challenges are critical to determining the level of flow. These authors confirm that the teacher’s role and the designed learning pedagogy are important in shaping the cognitive burden of students, thus contributing to improved flow effects. Shin (2006) also found a relationship between flow and student satisfaction scores. Together these findings provide the impetus for this study to examine whether our individual authentic assessment design can similarly achieve flow experience for our business accounting students. Furthermore, Pearce et al.’s (2005) flow process model helps in examining the alternative paths students take when dealing with challenging concepts that require specific skills. We address their calls for examining the interactions required so students can navigate the ‘challenge-skill space’ on the way towards the flow experience.

The theoretical framework that underpins the design of the artefact and the data collection approach is outlined in Fig. 3.1. It is argued that the scaffolded approach to learning design will lead to a reduction in the cognitive load burden for students: the scaffolding reduces intrinsic load, while supportive design features minimise extraneous load. We contend that because students feel more comfortable with the task at hand, and engage more readily with the design features, this will contribute to their flow experience. That is, there will be greater immersive engagement with the task and communication with their teacher through the built platform. In addition, we expect fewer requests for extensions, fewer queries about the specificities of each of the tasks and better-quality assignment submissions, because work is not left to the last minute.
Fig. 3.1

Conceptual framework

The built ‘assessment’ artefact is designed according to the cognitive load and flow theories with attention to immersion, interactivity, increasing complexity, informed teaching and instructional design. These elements are achieved by attention to the assessment storyline, developed as a result of a short survey of students about the course topics they found cognitively challenging. Python coding contributed to scaffolding the assessment design.

Artefact Design

Survey to Inform Storyline Design

A survey was conducted with final year management accounting students across two campuses (Singapore and Melbourne) over three semesters, asking them about topics they found most cognitively challenging. The question asked students to list their top three (3) challenging topics from the following 10 items.
  1. Organisational Strategy and Management Accounting Control Systems;
  2. Organisational Structure and Responsibility Centres;
  3. Transfer Pricing;
  4. Budgeting and the Strategic Management of Costs and Revenues;
  5. Behavioural Approaches to Budgeting;
  6. Static and Flexible Budgets;
  7. Budgeting and Market Responses;
  8. Performance Measurement and Reward Systems;
  9. Non-Financial Performance-Strategy Maps and the Balanced Scorecard;
  10. Risk Management.

From 858 students, we received 283 responses, representing a 33% response rate. Results indicated that students found the qualitative topics of the balanced scorecard and non-financial performance evaluation (topic 9) and risk management (topic 10) most challenging, along with the quantitatively challenging topic of transfer pricing (topic 3). We used this final year student data to redesign the first-year course, to ensure our immersive digital designs would contribute to an overall scaffolded approach to learning. By breaking down the complexity of the balanced scorecard into a practical ‘immersive’ experience, we hoped that by the time our students reached third year, they would find this topic less cognitively challenging. We also added quantitative items in the digital design that would later build to the more complex transfer pricing topics experienced in their third-year courses.
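The tallying of the ‘top three challenging topics’ responses can be sketched as follows. The response data below is illustrative only, not the study’s actual dataset; topic numbers follow the 10-item list above.

```python
from collections import Counter

# Each student lists their three most challenging topics (illustrative data).
responses = [
    [9, 10, 3],
    [3, 9, 10],
    [9, 3, 5],
]

# Count how often each topic is nominated across all responses.
tally = Counter(topic for response in responses for topic in response)
most_challenging = [topic for topic, _ in tally.most_common(3)]
print(most_challenging)  # topics 9, 3 and 10 dominate this toy sample
```

Because `Counter.most_common` is stable, ties are resolved in the order topics were first encountered, which keeps the ranking reproducible.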

Coded Storyline

The tailored storyline is designed to promote authentic assessment and foster individual creativity by engaging business students in developing their own business plan for an idea that would not only improve business performance but also address the broader impacts on the United Nations Sustainable Development Goals (SDGs). Students were required to select their own avatar (or perspective) that they take when they pitch their business case to generate interest and potential funding to help realise the idea in practice. They might be an owner of a company, marketing manager, supply chain manager, CEO, CFO, etc., giving them more autonomy in the project itself and contributing to the flow experience and level of ‘immersion’ in the topic. By encouraging them to be innovative with their idea, this also contributes to achieving flow experience as measured by Annetta (2010).

The extent of interactivity was limited to the interactions between the student and their teacher at each designated stage of the project. The system administrator has the capacity to intervene, provide more instructions, change questions, set deadlines as well as hide and lock cells. In this example, the idea section was ‘locked’ after a designated deadline to ensure the rest of the business report remains unique to the idea, and students cannot deviate from their original plan. This was decided as we wanted students to engage with and ‘own’ their own idea from the beginning to end. Locking the template ensured students could not adjust their idea to make it ‘easier’ for themselves or provide an easier avenue for plagiarism or contract cheating. We considered the major project as an exam equivalent, hence posed these stricter conditions.
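The deadline-based locking described above can be sketched in a few lines. This is a minimal illustration, not WritePal’s actual code; the field name, deadline and `save_input` helper are assumptions.

```python
from datetime import datetime, timezone

# Assumed deadline for the 'idea' section (illustrative).
IDEA_DEADLINE = datetime(2020, 3, 15, 23, 59, tzinfo=timezone.utc)

def is_locked(field: str, now: datetime) -> bool:
    """The 'idea' field becomes read-only once its deadline has passed."""
    return field == "idea" and now >= IDEA_DEADLINE

def save_input(field: str, value: str, store: dict, now: datetime) -> bool:
    """Reject edits to locked fields; otherwise persist the student input."""
    if is_locked(field, now):
        return False  # the student can no longer change the original idea
    store[field] = value
    return True
```

After the deadline, any attempt to resave the idea section is rejected, so the rest of the report must stay consistent with the original pitch.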

Increasing complexity of the project was an important part of the design with the complexity increasing as the students learned the topics in class. They were required to begin to pitch their idea in terms of explaining how it would contribute to society (economically, environmentally and socially). Then they were required to use the SDGs to determine how performance would be measured and evaluated. Next, they had to consider the costs associated with bringing the idea to fruition and the template was coded to use randomly generated numbers for individual students to calculate cost-volume-profit (CVP) and financial performance in terms of ratio analysis. Because of the student-entered qualitative data and the randomly assigned input information, requiring different calculations, every assessment piece was unique, requiring individualised written interpretations. Students are unable to anticipate questions, as these are blocked from view until the designated time. We did not allow students to change their previous inputs without staff feedback and intervention. The entire course content is matched with topic content, so learning and assessment can be managed in staged developments. Informed teaching and regular feedback are designed to contribute to the immersive learning experience.
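The individualised quantitative inputs described above can be generated with a per-student seed: numbers then differ across students but are reproducible for marking. This is a hedged sketch under assumed value ranges, not the template’s actual code.

```python
import random

def cvp_inputs(student_id: str) -> dict:
    """Generate CVP inputs that are unique per student but reproducible."""
    rng = random.Random(student_id)          # deterministic per student ID
    price = rng.randint(20, 50)              # selling price per unit
    variable_cost = rng.randint(5, price - 5)
    fixed_costs = rng.randint(10_000, 50_000)
    return {"price": price, "variable_cost": variable_cost,
            "fixed_costs": fixed_costs}

def break_even_units(inputs: dict) -> float:
    """Standard CVP break-even: fixed costs / contribution margin per unit."""
    margin = inputs["price"] - inputs["variable_cost"]
    return inputs["fixed_costs"] / margin
```

Seeding with the student ID means every assessment piece is numerically unique, yet markers can regenerate any student’s inputs on demand.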

Digital Artefact Design

The digital artefact was coded in Python and hosted on Amazon Web Services (AWS). Through a Microsoft API, students and teachers use their RMIT login credentials to enter their learner or teacher interface, respectively. The digital platform can be accessed on computers, tablets or mobile phones. Students advised us that they used their mobiles to enter thoughts and ideas, even while commuting, and would continue developing these until the deadlines. The current server structure and flow diagram of the digital artefact design is indicated in Fig. 3.2, with plans to improve this structure further with autoscaling techniques.
Fig. 3.2

Current server structure

This greater flexibility was designed to support work–life balance and the other challenges students face in getting their work done in a timely manner. The dynamic interface embeds instructions for students. As soon as similar queries emerge via email, they can be answered by the administrator (course coordinator) entering direct edits in the interface, ensuring all students are clear about the instructions and the goals to be achieved. This dynamic feedback and adjustment is essential in large course delivery and also contributes to the well-being of the teaching staff, who do not have to repeat instructions over and over. Even if students are provided updated details on the Learning Management System (LMS), our experience is that students tend not to search for clarification updates on the LMS but go directly to their instructors. By updating the system, students avoid the extraneous cognitive effort, or ‘search’ for additional information, needed to support and confirm understanding.

An example of the digital learner interface is provided in Figs. 3.3, 3.4, 3.5 and 3.6, and the teacher interface in Figs. 3.7, 3.8 and 3.9. Figure 3.3 shows the authentication process, which students complete using their RMIT student email and password. No extra registration is required to access the system. Once logged in, all active assignments are displayed to students. Students select which project they would like to work on by clicking the blue button ‘Business Plan’ (or another identifying label). This screenshot indicates how we use WritePal across different courses and jurisdictions. In Fig. 3.5, note that the due date is clearly identifiable for students and the tasks that follow appear blurred until the designated release time. Students cannot see the questions but know how many they must complete to finish the assignment. The instruction for each question is clearly displayed. Input can be written, numerical or in picture format. In one question, students are asked to present an organisational diagram and draw a value-chain activity diagram. Students can hand draw and upload a photo, or they can design it graphically in a word document and upload it to the platform. The teacher platform enables tracking of student performance and the time at which each task is completed. Further graphical design features can be included, such as badges or tokens recognising each completed task.
Fig. 3.3

Learner interface-login page (WritePal screenshot)

Fig. 3.4

Learner interface-assignment selection page (WritePal screenshot)

Fig. 3.5

Learner interface-blocked questions (WritePal screenshot)

Fig. 3.6

Learner interface-feedback and multiple deadline display (WritePal screenshot)

Fig. 3.7

Teacher interface-course management portal (WritePal screenshot)

Fig. 3.8

Teacher interface-assignment marking display (WritePal screenshot)

Fig. 3.9

Teacher interface-marking process display (WritePal screenshot)

Figure 3.6 shows how feedback is displayed to students. In the WritePal system, the student is able to review instant feedback for each question along with the score for the question. Teachers have the option to set a designated feedback release date or result release date. The multiple deadlines are clearly shown on the right-hand side, with upcoming deadlines shown in red as a reminder. Recall that students cannot change their previous inputs without staff feedback and intervention, unlike a traditional assessment, which could easily be sent to a contract writer to complete.

Figure 3.7 provides an overview of the teacher’s assignment management portal. In this portal, teachers can add courses, semesters and seminars; set up questions; add users; give extensions; and review students’ responses. Figure 3.8 displays how a teacher reviews, provides feedback and checks the marking memo. On the left side of the figure is the student response. The system is set for teachers to select a question type; for example, refer to Q9 (Fig. 3.7), which requires students to upload an image of their company value chain. In the middle of the figure are the teacher’s comments and scores. On the right of the figure is a marking guideline memo providing instructions to staff. Figure 3.9 displays the interface for markers to review how many papers/questions are unfinished. Given the assignment is staged, it is important to show how many questions have been answered and how many remain ungraded, so teaching staff have clear instructions. As indicated in Fig. 3.8, the interface clearly displays student information and the marked and unmarked questions.

The system also offers both teachers and students opportunities to generate and download the full report by clicking the ‘report’ button when necessary. For students, this button is made visible once the report is completed.

Data Collection and Analysis

Data was collected in accordance with RMIT ethics guidelines. Survey data was first collected from final year students on the topics they found challenging. This initial data was used to inform the digital artefact design. We then pilot tested the digital artefact in our first-year accounting course, which comprises students undertaking business degrees across a number of major programs including accounting, finance, economics, law, management, marketing, and supply chain and logistics. This is a large course with enrolments ranging from 1200 to 1900 per semester.

Secondary data from the course experience survey and comments from teachers are used to evaluate the flow experience of our students undertaking assessment through the digital artefact. We used Annetta’s (2010) eight elements of flow for immersive digital designs to analyse the data. The standard questions on the course experience survey ask students to comment on what is best about the course as well as what they think should be improved. Because the digital artefact was being used as a formal assessment piece, we did not want the potential for biased feedback; hence the decision to evaluate what emerged ‘unsolicited’ from the students at the end of the semester. We did not ask any specific or additional questions about this assessment piece.

We also used data in relation to the number of student assignment extension requests, along with direct evidence from the teacher interface, to determine whether students believe the task can be successfully completed. The other items are gathered from themed analysis of the qualitative responses and evidence presented through the unsolicited student feedback.

Findings

We received positive feedback from both students and staff. We had 979 students and 13 educators teaching in the course. Throughout the semester and the staged use of the digital artefact, both staff and students agreed that the interface is easy to use. Students engaged with the template and enjoyed the continuous feedback and the ability to adjust their responses and build on them based on weekly topic content. Of the 979 students, we received qualitative comments from 194 (a 20% response rate) for the question ‘what is the best part of this course’. Of the responses, 16% explicitly mentioned that the digital assessment was what they enjoyed most, and 28% indicated that they really liked the staged assignment. The course received the highest overall satisfaction rating and good teaching scores on record for a common core course. We believe the following discussion provides evidence of the flow experience of our large student cohort. While we cannot give a definitive measure of flow experience for every student, the following qualitative evidence helps to support that flow was achieved.

As evidence for flow element (1) Feeling the activity can be successfully completed, we used data on the number of student assignment extension requests, which indicated that extension requests dropped by 50%. We also generated a graph of the system data (Fig. 3.10), which demonstrates that students remain active before and after the due date for each of the subsequent stages.
Fig. 3.10

System data indicating due date and activity in the system (WritePal screenshot)
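An activity curve of this kind can be derived by counting student save events per day around a stage due date. The timestamps below are illustrative; the real system logs every student interaction.

```python
from collections import Counter
from datetime import date

# Assumed stage due date and per-save event log (illustrative data).
DUE_DATE = date(2020, 4, 10)
events = [date(2020, 4, 8), date(2020, 4, 9), date(2020, 4, 9),
          date(2020, 4, 10), date(2020, 4, 11)]

# Aggregate saves into a daily activity count, then check post-deadline activity.
daily_activity = Counter(events)
after_due = sum(n for day, n in daily_activity.items() if day > DUE_DATE)
print(after_due)  # activity continuing past the due date
```

Plotting `daily_activity` over the semester yields the kind of before-and-after-deadline profile shown in Fig. 3.10.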

The graph is also useful in demonstrating the extent to which (2) The player can concentrate fully on the activity. Because there is evidence of ongoing activity during the semester, we can claim there must be ongoing concentration and an ability to move in and out of the activity, picking it up when required. We also found several qualitative responses from the Course Experience Survey (CES) indicating that students were able to engage with the assessment as an ongoing activity:

I found the assignment very approachable (individual business report), it gave a step by step guideline of what it is expected and what is needed giving a feeling that I am being provided assistance along the way. I did not find any difficulty doing the assignment, which makes me feel motivated in this course. (student CES response)

In further analysing the entire qualitative dataset, data visualisation techniques were utilised. Figure 3.11 provides an overview of the keywords that emerged from the data.
Fig. 3.11

CES data visualisations.

Source adapted from CES qualitative responses
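A keyword analysis of this kind can be sketched as below: tokenise the free-text CES responses, drop common stop words, and count what remains. The stop-word list and sample responses are illustrative assumptions, not the study’s actual data or pipeline.

```python
import re
from collections import Counter

# A small, assumed stop-word list for illustration.
STOP_WORDS = {"the", "a", "an", "and", "it", "is", "to", "of", "i", "was"}

def keyword_counts(responses: list[str]) -> Counter:
    """Count non-stop-word tokens across all free-text responses."""
    words = []
    for text in responses:
        words += [w for w in re.findall(r"[a-z']+", text.lower())
                  if w not in STOP_WORDS]
    return Counter(words)

sample = ["The staged deadlines made it engaging",
          "Engaging and structured, clear deadlines"]
print(keyword_counts(sample).most_common(2))
```

The resulting counts can then feed a word cloud or bar chart of the dominant terms, as in Fig. 3.11.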

Figure 3.11 also helped to indicate that students felt they were able to accomplish the task, as the data visualisation indirectly demonstrated that (3) The activity has clear goals. We did not find glaring evidence that students were unsure of the assessment requirements, and we conducted a more detailed examination of the use of some of the apparently negative words. We found that the word ‘deadlines’ appeared in positive responses to the staged approach and to the tasks being ‘broken’ into manageable parts. Furthermore, the CES results also indicated that 92% of student respondents agreed that they met assessment deadlines. Most importantly, the word ‘stressful’ indicated that students felt the digital tool made the experience less stressful. This was evidenced in one of the students’ comments:

I love the assessment layout for the report. Easy to use, easy to meet deadlines, not too much to stress over and it encourages me to get it all done early rather than procrastinating and leaving doing the whole report till the last minute and stressing out the day before it’s due. I wish every course could adopt this style. Unlike all my other courses, I’ve never had a panic attack when working on this assignment, especially after procrastinating, thank you so much. (student CES response)

Another student liked our experimentation and expressed the following sentiment:

......Courage to try new ideas in order to help and improve student engagement. The new system used to incrementally complete our individual business report is carefully designed for students in mind, and I believe it is much better than a standard assignment of completing everything by a due date. It is also fairly user friendly. (student CES response)

In evidencing that (4) The activity provides fast feedback, at this stage of the project design we can only provide evidence of teacher interaction. If the digital elements are further enhanced to provide automated feedback to students (for example, through algorithms, bots and AI), we can address this area further. At this stage, we are working with a simplified digital design. During one stage of the question release, we received two emails simultaneously from students asking the same question. We immediately responded to the students directly and updated the instructions in the digital template, and this stopped all further questions in relation to clarifying the goals of that specific activity.

In terms of flow element (5) The player is deeply involved in the activity, student engagement was evidenced in their novel business case ideas. Many came and explained their innovative approaches to us, largely because they were extremely proud of their big ideas. The entrepreneurial spirit was evident in the teacher feedback too. While we cannot determine the level of involvement in the activity for every individual student, we can surmise from the CES comments and the interactive feedback provided to us during the semester that a large percentage of the student cohort was willing to engage. The feedback acknowledged the creative freedom as well as the pragmatic gains associated with this activity:

It is kind of interesting. I like assignment 2 because it allows for freedom and creativity (student CES response) and,

The assignment where students are able to use their interests in making a business plan, which helps with motivation and connection to the content (student CES response).

In determining the flow experience of this digital artefact in response to element (6), A sense of control over the actions is necessary to perform the activity, the qualitative evidence from students demonstrated their comfort with the task at hand: they felt they would not lose control of meeting deadlines, or of conceptual knowledge development, throughout the 12-week semester:

I really liked how the report was a staged submission. It allowed me to focus on a part at a time and do my best work. This also ensured I didn’t leave things to the last minute. (student CES response)

…, the individual assignment was a good idea to have going through the semester, with learning content so you can practise the stuff you learn while moving through. (student CES response)

It’s great that the Business Plan assignment was done in stages as it made it less stressful to complete. (student CES response)

The business report was the best aspect because it allowed us to sequentially submit parts of our assignment. This meant that we weren’t stressed about completing the whole assignment by the due date, but rather focus on certain aspects and spend time on each part. I was really motivated to do this business report and it’s the first time I enjoyed doing a report. (student CES response)

…Digital learning and assessments particularly the online system that was developed for the business report is making students in-control as they feel self-confident and independent… (academic peer evaluation response)

The final elements of flow—(7) Self-awareness disappears during flow and (8) There is an altered sense of time—are difficult to evidence without directly asking or observing the students. The data visualisations of all positive and negative CES responses indicate that the most dominant word is ‘engaging’, suggesting that overall the students were satisfied with their course experience. While the other dominant words, ‘structured’ and ‘deadlines’, which were clearly aligned with the digital artefact, could also be viewed negatively, a more detailed analysis of the comments around these words indicates that the students enjoyed the formal way they were navigated through the system. Nevertheless, this also indicates that the scaffolded, layered deadlines evident in the course design are potentially a trade-off against the latter two flow elements.
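The kind of word-frequency analysis underlying such data visualisations can be sketched as follows. This is an illustrative reconstruction only, not the authors’ actual tooling, and the sample comments and stopword list are invented placeholders standing in for the real CES responses.

```python
from collections import Counter
import re

# Placeholder comments; the real data are the course CES responses.
comments = [
    "The staged report was engaging and well structured.",
    "Engaging course, but the deadlines kept me on track.",
    "Structured deadlines made the assignment less stressful and engaging.",
]

# Common words to ignore when identifying dominant terms (illustrative list).
stopwords = {"the", "and", "but", "was", "were", "me", "on", "a",
             "an", "made", "kept", "less", "well"}

tokens = []
for comment in comments:
    # Lowercase, split into alphabetic words, drop stopwords.
    tokens += [w for w in re.findall(r"[a-z]+", comment.lower())
               if w not in stopwords]

# The most dominant words across all responses.
for word, count in Counter(tokens).most_common(3):
    print(word, count)
```

On this toy input the dominant word is ‘engaging’, mirroring the pattern the authors report for the real CES data.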

Nevertheless, in handing control back to the students, we explained that the ‘print’ button at the end of the task ensured that the template they used would print to a formatted document they could be proud to take to job interviews. We also demonstrated that the SDG topic is relatively new to businesses and that expertise and understanding in this area are valuable to future employers. Part of the sense of involvement in the activity relates directly to the goals of authentic assessment: for students to develop twenty-first century skills and be work ready.

The game aspect was really engaging. The teaching team was extremely helpful and friendly. The course content itself was not something I thought I would enjoy however it was super interesting and taught me skills and concepts that seem useful for my future as a possible employee or employer. (student CES response)

The findings helped confirm that the implementation of a scaffolded, staged approach not only contributed to the flow experience of students but also helped to manage their cognitive load.

Discussion, Conclusion and Limitations

The digital artefact was a relatively successful pilot experiment built on the cognitive load and flow experience literature. We consider success in terms of meeting the flow experience criteria designated by the psychology literature dealing with immersive technologies (Annetta, 2010; Pearce et al., 2005) and in contributing to the emerging but limited literature in this area (Bitrián et al., 2020). While we were able to describe many of the eight flow criteria, some of the elements were harder to evidence directly, thus requiring further exploration in future research. We also relied on secondary data and unprompted qualitative responses to measure student flow experience. Recognising this as a potential limitation of the study, we recommend gathering further evidence through targeted surveys and interviews.

To date, the system appears to minimise the plagiarism and contract cheating problems identified by Bretag et al. (2019). This is arguably due to the personalised, unique and progressively released questions that build on students’ previous inputs. While we cannot be definitive, contract cheating becomes harder when assessments are staged and not all questions are made available at the outset. The most important aspect of the system is that it can continually be adapted to new queries, topics and new assessment formats (written, graphical, numerical, etc.). This means the system is transferable across semesters, courses and educational disciplines.
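The progressive-release logic described above can be sketched minimally as follows. The class, prompts and method names are hypothetical illustrations of the staging principle; the authors’ actual system is not published in code form.

```python
from dataclasses import dataclass, field

@dataclass
class StagedAssessment:
    """Releases each question only after the previous answer is submitted,
    so the full assessment is never visible at the outset."""
    questions: list
    answers: dict = field(default_factory=dict)

    def current_stage(self) -> int:
        # The next unanswered stage is the only one visible to the student.
        return len(self.answers)

    def visible_question(self):
        stage = self.current_stage()
        if stage >= len(self.questions):
            return None  # all stages complete
        # Later prompts interpolate earlier inputs, personalising the task.
        return self.questions[stage].format(**self.answers)

    def submit(self, answer: str) -> None:
        # Record the answer and thereby unlock the next stage.
        self.answers[f"q{self.current_stage() + 1}"] = answer

# Hypothetical example: the second prompt builds on the first submission.
plan = StagedAssessment([
    "Describe your business idea.",
    "Which SDG does '{q1}' address, and how?",
])
plan.submit("a solar-powered water purifier")
print(plan.visible_question())
```

Because each prompt is generated from the student’s own prior inputs, no two students see an identical full assessment in advance, which is the property argued to make contract cheating harder.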

The flow experience is an important part of understanding the degree of engagement and immersion with the digital artefact. A benefit of this dynamic artefact is that other aspects, such as identity and interactivity, can be developed further and tested in new iterations of this digital assessment design. The elements of increasing complexity are important to how well the digital design manages students’ cognitive load when they are introduced to challenging topics (Sweller, 1988). Likewise, the pedagogical design is in accordance with informed teaching, whereby the teacher can play an active role in the student journey; the findings indicated that this was made possible. We focused more on the student experience, and further research would provide more insights, particularly from the teacher perspective. The instructional design was evident in the student responses, which directly links to the formality of the scaffolded approach identified as important in the cognitive load theory literature. However, this was not exploited as much as the emerging flow literature in digital pedagogy would expect. Further iterations of the digital artefact could also engage with more gamified elements such as rewards, badges and leaderboards. We also recommend longitudinal research that follows the first-year students through to their third year, to determine whether this first-year assessment experience contributed to long-term germane cognitive benefit. We could also give the same digital assignment to different student groups, testing different design features, such as staged deadlines, and comparing the cognitive load benefits between the experimental groups.

In conclusion, we consider that the digital artefact has contributed to relieving key pressure points for both academics and students, including last-minute assessment preparation, poor-quality assessment designs and submitted work, plagiarism and contract cheating, minimal opportunities for regular academic engagement and feedback, and overall well-being concerns.

References

  1. Annetta, L. A. (2010). The “I’s” have it: A framework for serious educational game design. Review of General Psychology, 14(2), 105–112.
  2. Bates, A., & Sangrà, A. (2011). Managing technology in higher education: Strategies for transforming teaching and learning. San Francisco: Jossey-Bass/John Wiley & Co.
  3. Bellotti, F., Ott, M., Arnab, S., Berta, R., de Freitas, S., Kiili, K., … De Gloria, A. (2011). Designing serious games for education: From pedagogical principles to game mechanisms. In The 5th European Conference on Games Based Learning (pp. 26–34). Greece: University of Athens.
  4. Bitrián, P., Buil, I., & Catalán, S. (2020). Flow and business simulation games: A typology of students. The International Journal of Management Education, 18(1), 100365.
  5. Bretag, T., Mahmud, S., Wallace, M., Walker, R., James, C., Green, M., et al. (2016). Core elements of exemplar academic integrity policy in Australian higher education. International Journal for Educational Integrity, 7(2), 3–12.
  6. Bretag, T., Harper, R., Burton, M., Ellis, C., Newton, P., van Haeringen, K., et al. (2019). Contract cheating and assessment design: Exploring the relationship. Assessment & Evaluation in Higher Education, 44(5), 676–691.
  7. Brimble, M. (2016). Why students cheat: An exploration of the motivators of student academic dishonesty in higher education. In Handbook of academic integrity (pp. 1–14).
  8. Csikszentmihalyi, M. (1975). Beyond boredom and anxiety. San Francisco: Jossey-Bass.
  9. Csikszentmihalyi, M. (1990). Flow: The psychology of optimal experience. New York: Harper & Row.
  10. Debue, N., & van de Leemput, C. (2014). What does germane load mean? An empirical contribution to the cognitive load theory. Frontiers in Psychology, 5.
  11. Ely, D. P. (1990). Conditions that facilitate the implementation of educational technology innovations. Journal of Research on Computing in Education, 23(2), 298–305.
  12. Gerjets, P., & Scheiter, K. (2003). Goal configurations and processing strategies as moderators between instructional design and cognitive load: Evidence from hypertext-based instruction. Educational Psychologist, 38(1), 33–41.
  13. Giasiranis, S., & Sofos, L. (2017). Flow experience and educational effectiveness of teaching informatics using AR. Educational Technology & Society, 20(4), 78–88.
  14. Harper, R., Bretag, T., Ellis, C., Newton, P., Rozenberg, P., Saddiqui, S., et al. (2019). Contract cheating: A survey of Australian university staff. Studies in Higher Education, 44(11), 1857–1873.
  15. Herrington, J., & Standen, P. (1999). Moving from an instructivist to a constructivist multimedia learning environment. In B. Collis & R. Oliver (Eds.), ED-MEDIA 1999—World Conference on Educational Multimedia, Hypermedia & Telecommunications (pp. 132–137). Seattle, WA: Association for the Advancement of Computing in Education (AACE).
  16. Jackson, S. A., & Marsh, H. (1996). Development and validation of a scale to measure optimal experience: The Flow State Scale. Journal of Sport & Exercise Psychology, 18(1), 17–35.
  17. Ketamo, H., & Kiili, K. (2010). Conceptual change takes time: Game based learning cannot be only supplementary amusement. Journal of Educational Multimedia and Hypermedia, 19(4), 399–419.
  18. Lameras, P., Arnab, S., Dunwell, I., Stewart, C., Clarke, S., & Petridis, P. (2017). Essential features of serious game design in higher education: Linking learning attributes to game mechanics. British Journal of Educational Technology, 48(4), 972–994.
  19. Liao, L. F. (2006). A flow theory perspective on learner motivation and behavior in distance education. Distance Education, 27, 45–62.
  20. Paas, F., Renkl, A., & Sweller, J. (2003). Cognitive load theory and instructional design: Recent developments. Educational Psychologist, 38(1), 1–4.
  21. Pearce, J. M., Ainley, M., & Howard, S. (2005). The ebb and flow of online learning. Computers in Human Behavior, 21(5), 745–771.
  22. Price, L., & Kirkwood, A. (2013). Using technology for teaching and learning in higher education: A critical review of the role of evidence in informing practice. Higher Education Research and Development, 33(3), 549–564.
  23. Shin, N. (2006). Online learner’s “flow” experience: An empirical study. British Journal of Educational Technology, 37(5), 705–720.
  24. Sweller, J. (1988). Cognitive load during problem solving: Effects on learning. Cognitive Science, 12(2), 257–285.
  25. Sweller, J. (1994). Cognitive load theory, learning difficulty and instructional design. Learning and Instruction, 4(4), 295–312.
  26. Sweller, J., van Merriënboer, J. J. G., & Paas, F. G. W. C. (1998). Cognitive architecture and instructional design. Educational Psychology Review, 10(3), 251–296.
  27. United Nations (UN). (2018). Sustainable Development Goals, United Nations Department of Public Information. https://sustainabledevelopment.un.org/?menu=1300.
  28. van Schaik, P., Martin, S., & Vallance, M. (2012). Measuring flow experience in an immersive virtual environment for collaborative learning. Journal of Computer Assisted Learning, 28(4), 350–365.
  29. Walker, M., & Townley, C. (2012). Contract cheating: A new challenge for academic honesty. Journal of Academic Ethics, 10(1), 27–44.

Copyright information

© The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2020

Authors and Affiliations

  1. RMIT University, Melbourne, Australia