
1 Introduction

Technologies have long been used to transform the students’ learning experience in the lecture hall. For example, instructors adopted television and radio to capture students’ attention in the early days [1]. Later, the computer and projector became necessary tools in every lecture hall. However, these technologies are not designed to encourage student participation in a traditional lecture hall, and students are likely to sit passively for the entire lecture session. The mobile classroom response system has emerged as a promising technology to engage large audiences using mobile devices [2].

The mobile classroom response system, formerly known as the clicker [3], is a management tool that helps instructors deliver an interactive lecture, especially in a large classroom setting. This system is known under various names such as personal response system [4], student response system [5], audience response system [6], electronic voting system [7], wireless keypad [8], and classroom communication system [9]. Throughout this paper, the term mobile classroom response system refers to an evolution of the clicker that runs on mobile platforms and extends the capabilities of a typical clicker.

There are three main components of a mobile classroom response system, namely (1) questioning and presentation, (2) response and display, and (3) data management and analysis [10]. In other words, the system allows an instructor to post a question; the students submit their responses using their mobile devices, and an overview of the students’ answers is instantly displayed on the main projector screen for whole-class discussion. After the session ends, the instructor can save every student response for future analysis, especially to review each student’s performance.

A considerable amount of literature has been published on the positive impacts of clickers [3, 11, 12]. Examples include the acquisition of advanced reasoning skills [13], improved student attendance [14], and greater enjoyment among students [15]. From an education perspective, these impacts can be categorized into three types of engagement: cognitive, behavioral, and emotional [16]. In short, cognitive engagement can be defined as a student’s psychological investment in learning, behavioral engagement refers to a student’s participation in classroom activities, and emotional engagement describes a student’s affective reactions in the classroom.

On the other hand, user engagement can be viewed both as an outcome of experience and as a process during an interaction [17]. Several studies have shown the temporal dynamics of user engagement in different contexts, such as reading online news [18], writing documents [19], viewing television [20], learning in a blended classroom [21], and participating in a face-to-face classroom [22]. Most studies, as far as the authors are aware, have focused only on patterns of cohort transitions when using clickers [23], and those studies apply mainly to peer-instruction activities [24]. This study seeks to explore alternative patterns of user engagement during mobile classroom response system sessions over a prolonged period of time, and intends to unravel the dynamic interaction between students and the mobile classroom response system. Thus, this study employs a mixed-methods approach to gain an in-depth understanding of user engagement across time.

The remainder of this paper is organized as follows: Sect. 2 reviews user engagement. In Sect. 3, the methodology used for this study is described. Section 4 presents the findings of the study. The paper concludes in Sect. 5.

2 User Engagement

One of the challenges for user experience researchers is to engage a person using a particular system in a specific context [25]. Different contexts require different measurements of user engagement. For example, attendance is one of the common measurements of user engagement with mobile classroom response systems. However, attendance may not be a good indicator, because certain educational institutions bar students with poor attendance from examinations. Such students are therefore likely to maintain a good attendance record with or without a mobile classroom response system.

In this paper, user engagement is defined as a quality of user experience in which a person is pleased with a particular technology and desires to use it more frequently. User experience can be divided into four types of time span, namely anticipated, momentary, episodic, and cumulative user experience [26]. In other words, a user experience may refer to a user’s imagination before first use (anticipated user experience), a user’s feelings during an interaction (momentary user experience), a user’s appraisal after a particular usage (episodic user experience), or a user’s view of a technology as a whole over multiple periods of usage (cumulative user experience). Hence, different methods are required to uncover user experience at different time spans. This paper reviews longitudinal methods from a human-computer interaction perspective because this study attempts to discover changes in user engagement over time.

2.1 Interaction Log

Interaction logging involves automatically tracking a student’s behavior while he/she uses a mobile classroom response system. The interaction log can reveal students’ usage patterns throughout a whole lecture session and even across an entire trimester. This technique is convenient because the instructor can retrieve the interaction log directly from the mobile classroom response system. In addition, this method is more scalable than a human observer, because a human may not be able to track all behavior changes manually in a lecture hall [22]. However, the interaction log is considered implicit feedback because inferences are made from user actions, and it may not reveal whether a student is truly engaged. Thus, a diary study can be used to complement the findings of the interaction log.

2.2 Diary Study

The diary study is another form of logging method in which participants are asked to write entries about their personal experiences using a particular technology. A diary may contain facts as well as subjective assessments such as feelings and impressions. Diaries can help reveal real student issues and needs in the context of a mobile classroom response system. There are three types of diary entry protocol: interval-contingent, signal-contingent, and event-contingent [27]. Students are required either to document their experiences at fixed intervals (interval-contingent protocol), to make an entry when prompted by a signaling device (signal-contingent protocol), or to report each time a particular event occurs (event-contingent protocol).

Due to the nature of the lecture hall setting, this study employs the interaction log of the mobile classroom response system together with a diary study to measure user engagement over time. Both techniques allow researchers to track students’ activities indirectly, so students can participate in lecture activities without distraction.

3 Methodology

A case study was conducted at Multimedia University, Malaysia. Based on the university database, ninety-five undergraduate students officially registered for software requirements engineering in Trimester 2 (2015/2016). Software requirements engineering is considered one of the more challenging computing subjects; it covers multidisciplinary fields ranging from the social sciences to computer science concepts [28].

Various mobile classroom response systems (such as Formative, Pear Deck, Unitag, and Kahoot) were introduced and used during lectures. Each mobile classroom response system has its own unique features. For example, Formative enables students to pick the correct option in multiple-choice and true/false questions; students can also type short answers and show their drawings in Formative. Pear Deck allows students to respond by dragging an icon toward the answer area and enables the instructor to see the pattern of all students’ responses on the projector view. Kahoot plays music and displays a countdown timer to encourage students to compete with one another. Unitag can generate free quick response (QR) codes and provides the instructor with the ability to track student participation in lottery and scratch-card games.

For each lecture week, the instructor carried out three short clicker sessions (beginning of lecture, middle of lecture, and end of lecture) using a mobile classroom response system. Different types of questions were posed to students during lectures, such as probing students’ pre-existing level of understanding, assessing students’ ability to apply lecture material to a new situation, and polling student opinions. Only three questions were posed in each clicker session. After each lecture ended, students were encouraged to post their learning experiences within three days on a social networking website. There was no limit on the number of words. Students could write their diary entries in English or create multimedia diaries such as images or videos. Figure 1 presents the overview of data collection for each lecture week.

Fig. 1. Overview of data collection for each lecture week

Students were rewarded with additional experience scores for participating via the mobile classroom response system and submitting diaries. These additional experience scores accounted for five marks of their coursework grade. Thus, students were asked to enter their real name and student identification number with every submission.

The interaction log of the mobile classroom response system was used to profile each student and their level of engagement (see Table 1). A user refers to a distinct user who participated during a particular lecture week and can be divided into two sub-types, namely registered student and visitor. A registered student is one who officially registered for the software requirements engineering subject in the university database; users with non-registered status are classified as visitors.

Table 1. Metrics used in this study

An active participation is counted when a student made complete submissions in all three consecutive clicker sessions of a particular lecture week. An inconsistent participation is recorded when a student participated inconsistently in certain clicker sessions due to partial submission or non-submission. A passive participation is recorded when a student did not participate in any of the three consecutive clicker sessions due to non-submission.

A complete submission is counted when a student submitted answers to all questions posed during a particular clicker session. A partial submission is recorded when a student failed to submit answers to certain question(s) during a particular clicker session. A non-submission is recorded when a student submitted no answers to any question posed during a particular clicker session.
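The submission- and participation-type rules above can be sketched as a small classification routine. This is a minimal illustration only; the function names and data layout are our assumptions, not the actual log format of the systems used in the study.

```python
# Classify one clicker session: each session poses three questions,
# and `answered` counts how many of them a student answered.
def classify_submission(answered: int, total_questions: int = 3) -> str:
    if answered == total_questions:
        return "complete"       # answered every question in the session
    if answered == 0:
        return "non-submission" # answered no question in the session
    return "partial"            # answered some, but not all, questions

# Classify one lecture week from its three sessions
# (beginning, middle, and end of lecture).
def classify_participation(session_types: list) -> str:
    if all(s == "complete" for s in session_types):
        return "active"         # complete submission in all three sessions
    if all(s == "non-submission" for s in session_types):
        return "passive"        # no submission in any session
    return "inconsistent"       # any mix of partial/non-/complete submissions

week = [classify_submission(a) for a in (3, 1, 0)]
print(week)                          # ['complete', 'partial', 'non-submission']
print(classify_participation(week))  # inconsistent
```

Per-week labels produced this way could then be aggregated across the trimester to recover the participation trends discussed in Sect. 4.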

4 Result

Some lecture weeks were postponed due to public holidays, and other lecture weeks required a paper-and-pencil approach because the majority of students had difficulty drawing requirements models using the mobile classroom response system. Thus, this paper reports only the seven lecture weeks in which the mobile classroom response system was used.

4.1 User Types

There were 102 distinct users across the seven lecture weeks. Ninety-five of the 102 users were registered students, and the remaining seven were visitors. Table 2 summarizes the number of distinct users who participated in lectures using the mobile classroom response system. The trend shows that the number of distinct participants was higher at the beginning of the trimester (Week 1, Week 2, Week 3) but decreased toward the end of the trimester (Week 13).

Table 2. Summary of user types

Although the number of distinct users never exceeded the number of registered students in any week, a small number of visitors participated in the clicker sessions (except in Week 3), and further analysis revealed that one visitor participated in four of the seven lecture weeks.

4.2 Participation Types

In this section, we report findings on the ninety-five registered students only and exclude all visitors from the analysis. Figure 2 shows the participation types of registered students over the seven lecture weeks. The trend shows that registered students participated more actively using the mobile classroom response system early in the trimester (74% in Week 1 and 86% in Week 2) than at the end of the trimester (42% in Week 12 and 37% in Week 13). In other words, more than 50% of registered students showed either inconsistent or passive participation during clicker sessions at the end of the trimester. However, we cannot assume that a registered student had the same participation pattern throughout the whole trimester.

Fig. 2. Overview of participation types over seven lecture weeks

Thus, a further analysis was performed at the individual level and revealed that only around 18% of the ninety-five registered students participated actively across all seven lecture weeks. The remaining registered students (82%) had inconsistent participation. From the interaction log analysis, we selected two students who participated actively (Learner#45) and inconsistently (Learner#85) during clicker sessions. Figure 3 compares the participation types of Learner#45 and Learner#85. Learner#85 had one active participation (Week 3), five inconsistent participations (Weeks 1, 2, 4, 6, and 12), and one passive participation (Week 13). In contrast, Learner#45 participated actively and consistently across all seven lecture weeks.

Fig. 3. Comparison of Learner#45’s and Learner#85’s participation types

4.3 Submission Types

In this section, only the active and inconsistent participations of registered students were analyzed, to reconstruct the sequence of behavioral actions that occurred during clicker sessions. Figure 4 shows the submission types of registered students for each clicker session (beginning of lecture, middle of lecture, and end of lecture).

Fig. 4. Overall submission types over seven lecture weeks

Of a total of 1740 submissions over the seven lecture weeks, the analysis showed that the number of complete submissions gradually increased from session to session (27% at the beginning of lecture, 30% in the middle of lecture, and 31% at the end of lecture). Non-submissions occurred most often at the beginning of lecture (5%) compared with the middle of lecture (1%) and the end of lecture (1%). Partial submissions happened most frequently in the middle of lecture (3%) compared with the beginning of lecture (1%) and the end of lecture (1%).
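The session-level percentages above are expressed against the overall submission total, which is why the nine figures sum to 100%. A minimal tally sketch of this breakdown follows; the records here are a tiny hypothetical excerpt, not the study’s actual 1740-row log.

```python
from collections import Counter

# Each record pairs a session slot with the submission type a student
# produced in that slot (hypothetical excerpt for illustration).
records = [
    ("beginning", "complete"), ("beginning", "non-submission"),
    ("middle", "complete"), ("middle", "partial"),
    ("end", "complete"), ("end", "complete"),
]

counts = Counter(records)   # tally (slot, type) pairs
total = len(records)        # percentages are taken over ALL submissions

for (slot, kind), n in sorted(counts.items()):
    print(f"{slot:9s} {kind:15s} {100 * n / total:.0f}%")
```

Because every percentage shares the same denominator, the per-slot figures can be compared directly across the beginning, middle, and end of lecture.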

We performed a granular analysis of Learner#85 by examining her submission types for each clicker session (see Fig. 5). Learner#85 made complete submissions consistently at the end of lecture across the seven lecture weeks, except in Week 13. However, she tended to make either partial submissions or non-submissions at the beginning of lecture, except in Week 3 and Week 6.

Fig. 5. Submission types for Learner#85

4.4 Diary Entries

The students’ diaries went through two iterative examinations. The first pass through the data focused on identifying emotional responses related to the mobile classroom response system. Then, each excerpt was analyzed using the user engagement attribute categorization scheme derived from [29]. This model of user engagement consists of six factors, namely aesthetic appeal, focused attention, perceived usability, novelty, felt involvement, and endurability. However, aesthetic appeal was not found in the student diary entries. Following this systematic analysis, we describe the findings according to the remaining five attributes of user engagement.

Focused Attention.

Focused attention refers to a student’s concentration on one stimulus only while ignoring others. Sometimes the student becomes absorbed in the learning activities and is surprised at how much time has passed. The diary entries revealed that some students paid full attention to the lecture so that they could answer the questions posed by the instructor via the mobile classroom response system.

“Then, coming to the lecture class, same old boring lecture class is not the same anymore. During the first session I almost fell asleep.. sorry. But surprise surprise, 3 super simple questions to answer after every session, when I say super simple I mean provided that you pay attention in class. Well, it’s a really good way to help me focus in class honestly, especially by using goformative, it makes it so simple and easy for us to access and answer the questions.” (Learner#28)

“With every segments of the lecture, we are provided with a set of questions based on what was taught on Formative. This kind of new learning environment made me somewhat excited towards the lecture and what was taught, I began to listen more attentively instead of staring down the table to my phone screen like I used to do… As we transitioned from lecture to answering question in Formative from time to time, I didn’t even realized that two full hours have passed. With this new learning experience that I had, I hope that it will continue enriching me throughout this semester.” (Learner#26)

Novelty.

Novelty can be defined as a new, interesting, or unexpected situation that increases students’ curiosity and rouses their inquisitive behavior. In other words, sudden and surprising changes evoke a positive reaction from the student, such as excitement and joy. The students highlighted the uniqueness of each mobile classroom response system introduced in different lectures and how effectively these systems were blended into the lecture.

“Class activity in this week we are using a website call “Peardeck”. “Peardeck” are different from goformative, the way to answer some of the question in “Peardeck” are different, we using the red-dot to point to the answer instead of typing the text to answer.” (Learner#72)

“In the end of this lecture, the lecturer asked us to vote for the date and duration for the replacement class and for the midterm test and presentation by using Kahoot. The Kahoot is good as it is new to us and voting will show more democratic results.” (Learner#88)

Perceived Usability.

Perceived usability is defined as the cognitive and affective aspects of dealing with the mobile classroom response system. Students may experience frustration when they cannot complete certain tasks using the system. The students shared negative experiences such as difficulty drawing a requirements model on their mobile devices due to small screen size. Some students noted that they could not participate because their mobile phones had run out of battery or had exceeded their mobile data quota. A high rate of inconsistent participation in a particular lecture supported these students’ comments.

“Although we are using back the GoFormative but the method that used to answer the question seriously difficult for the big-hand person. The mobile phone’s screen so small and my hand so big (T^T) I have try to write the answer on the blank box for more than twice. And the most important is the system did not provide the erase to users which mean if you write wrongly then you have to “re-draw” it.” (Learner#68)

“The goFormative activity today caused some problems for a few students. (Phone cannot snap photo, cannot upload photo, phone no battery) … Not sure if goFormative or their phone is at fault.” (Learner#87)

Felt Involvement.

Felt involvement can be described as a student’s feeling that the overall learning experience is fun. In other words, the students enjoyed the learning experience of using the mobile classroom response system during lectures.

“Software Requirement Engineering (TSE2451) it sounds like a dull and boring subject, but it is quite fun when I attended the first class. First, this is not the first subject that use online assessment platform to evaluate students, but this is a first class that give me a chance to get bonus marks by earning experience marks from the online assessments. Second, I like the teaching style of the lecture. Lecture will prepare an online assessment about the topic after each lecture.” (Learner#37)

“Waw….I-nigma, the QR code scanner is the best. It just take less than one second to bring me to the web page. Fantastic! With all the interesting activities in the class, I have never falling asleep in the class. I’m so scare that I will miss the fun games.” (Learner#35)

Endurability.

Endurability refers to the likelihood that students will remember things they have enjoyed, and to their desire to repeat a fun activity. Thus, a positive past experience with the mobile classroom response system would encourage students to do the same activity again in the future. Some students indicated in their diaries that they were excited and eager to come to the next lecture.

“The past two week class for this subject was quite fun with the use of goformative.com. I really love the way the lecturer conducting the class by providing us short quizzes every time we finish a subtopic. This method is actually quite efficient because it helps me to understand better on what the lecturer teaching us. So far I scored well in the quizzes and that really helps my confidence level and make me feel that I can do really well for this subject for this semester. I’m looking forward for next week lecture class.” (Learner#81)

4.5 Holistic View on User Engagement Patterns

By connecting the two students’ diary entries with their interaction logs, we were able to see that both students (Learner#45 and Learner#85) engaged positively with the mobile classroom response system. As shown in Table 3, Learner#45 shared six diary entries. In Week 1 and Week 2, Learner#45 highlighted that she was actively thinking when answering the questions posed via the mobile classroom response system (Formative). She also had a fun time with her friends during Week 3. In Week 4, she had a new experience using a different mobile classroom response system (Unitag) and was excited to attend the next lecture. She highlighted another new mobile classroom response system (Kahoot) in Week 6 and shared her overall positive learning experience using the various mobile classroom response systems in Week 13.

Table 3. Diary entries of Learner#45

Learner#85 also submitted diary entries in this study (see Table 4). In Week 2, she indicated her reason for being late (inconsistent participation) but was satisfied with her own performance using the mobile classroom response system (Formative). She also shared her positive engagement with new friends in Week 3. She highlighted a new mobile classroom response system (Unitag) in Week 4. In Week 6 and Week 12, she faced challenges such as wireless local area network problems and drawing on a mobile device (inconsistent participation), but she was happy with the voting results (Kahoot) and the electronic lucky draw (Unitag) conducted via the mobile classroom response systems.

Table 4. Diary entries of Learner#85

5 Conclusion

In general, this paper has offered a different insight into user engagement by employing both interaction log and diary study methods. This mixed-methods approach unravels the temporal dynamics of user engagement with a mobile classroom response system. The results of this study allow researchers to understand why students were engaged during lectures. These student behaviors may not be obvious in a large classroom setting, because the lecturer’s goal is to deliver and share knowledge in an engaging setting. This information could help lecturers deliver engaging lectures by using an appropriate mobile classroom response system in the future.