There is a continuing search for ways to improve the quality of higher education so that students are able to achieve the intended learning goals. Research has shown that active learning is a promising means to this end (Freeman et al. 2014; Prince 2004). An increasingly popular method that aims to actively engage students is known as flipping the classroom. The organization of flipped classroom courses requires students to prepare for in-class meetings, often facilitated through supportive online video lectures, and demands involvement during lectures by means of problem solving and peer instruction (Abeysekera and Dawson 2015). A key feature that enables the change in time and place of various learning activities in the flipped classroom is the systematic use of technology during both pre-class and in-class activities (Strayer 2012).

Previous research showed mixed effects of flipped classroom implementation on student performance (Davies et al. 2013; Mason et al. 2013; McLaughlin et al. 2013; Pierce and Fox 2012; Street et al. 2015). Tomes et al. (2011) argued that in order to better support learning, educators need more insight into student perceptions of effective study strategies and how students go about studying as they prepare to demonstrate their understanding of the material in course assessment. This is especially important in the context of implementing the flipped classroom since students need to be willing to change their study behaviour. As it is unclear to what extent students comply with the changes expected from them, the present study aims to fill this gap in the literature by exploring the study behaviour of students throughout both a flipped and a non-flipped (henceforth referred to as regular) college statistics course.

The benefits of active learning

Lectures where students passively receive information seem to be less effective than lectures where students actively participate (Prince 2004). In a meta-analysis spanning the Science, Technology, Engineering, and Mathematics (STEM) disciplines, Freeman et al. (2014) found an effect size of .47 in favour of active learning over lecture-based courses. The effect was smaller for large classes (>110 students) compared to small (<50) and medium (50–110) classes, and somewhat smaller for studies in psychology compared to other disciplines. Furthermore, the effect sizes did not differ substantially with the methodological rigour of the included studies, which ranged from quasi-experimental designs to randomized controlled trials. In active learning research, however, the implementation focus is clearly on change in the lecture setting, without explicit consideration of what happens outside of it. Therefore, it is unclear whether the benefits of active learning found by Freeman et al. (2014) can be expected when the flipped classroom is implemented.

A recent study in a small class found no substantial difference in student performance between a flipped active learning course and a regular active learning course (Jensen et al. 2015). Other research on the benefits of implementing the flipped classroom has shown varying results with different methodologies. In studies where cohorts of subsequent years are compared with medium to large groups of students (as defined by Freeman et al. 2014), flipped classroom cohorts were found to outperform regular cohorts (Pierce and Fox 2012; Street et al. 2015). In studies with smaller numbers of students (10–50), differences in student performance were not statistically significant (Davies et al. 2013; Mason et al. 2013; McLaughlin et al. 2013), which could be due to a power problem. The advantage of cohort comparison studies is that the material and instructor may remain constant. A disadvantage, however, is that exams may be different from year to year, and in that case, difference in performance between cohorts may be the result of differences in exams rather than the implementation of the flipped classroom. In the present study, it was not feasible to conduct a cohort comparison study since university policy prohibits identical exams in subsequent cohorts.

Using a different design, in which two groups of students followed the same course in a flipped and a regular format simultaneously, Tune et al. (2013) found that students in the flipped course outperformed those in the regular course. The class sizes in this study, however, were very small, and insufficient information was reported for effect sizes to be computed. Although there is increasing research on student performance in the flipped classroom, there is very little research on how students study. Tune et al. (2013) found that more than half of the students indicated watching 75–100% of the video lectures. This does not, however, reveal whether students used the video lectures as intended, to prepare for class, or as an additional resource to study prior to the final exam. Rather than compare the performance of students in a regular and a flipped course, therefore, the present study explored study behaviour and the extent to which it is related to student performance in these two different contexts.

Research on student engagement

A common definition of student engagement used in the National Survey of Student Engagement (NSSE) is the time and energy students invest in educationally purposeful activities (Kuh et al. 2008). However, defining student engagement and how to study it is a continuing debate (Ashwin and McVitty 2015; Kahu 2013). In the NSSE, time spent studying is operationalized as the average number of hours each week, with a choice of three categories (<5, 5–21, >21 h per week). Educationally purposeful activities are operationalized as a list of activities with a Likert-type response scale ranging from very often to never (Kuh et al. 2008). The advantage of this approach to measuring how much time and energy students spend studying is that it can be compared across institutions. A limitation, however, is that it does not give insight into the varied nature of how students study throughout specific courses. Therefore, the operationalization of study behaviour as student engagement in the NSSE is of limited use in the study of change in a specific educational context like the implementation of a flipped classroom.

Another instrument measuring student engagement is the Course Student Engagement Questionnaire (Handelsman et al. 2005). While this instrument is helpful in understanding the multifaceted nature of student engagement, it approaches study behaviour as an individual trait that does not depend on or change across different courses. The same applies to commonly used instruments from other theoretical approaches, such as Vermunt's Inventory of Learning Styles (Vermunt and Vermetten 2004), the Study Process Questionnaire (Biggs et al. 2001; Fox et al. 2001), and the Motivated Strategies for Learning Questionnaire (Pintrich et al. 1993; Credé and Phillips 2011). There are no validated instruments for systematically studying the actual behaviour of students throughout a course in a specific context. When behaviour is the focus of change in innovations such as the flipped classroom, however, it is very important to gain insight into the mechanism that is targeted. For this reason, the present study used a purpose-designed diary-type instrument, similar to the approach of Tomes et al. (2011).

In the 1980s, students’ study behaviour and curriculum characteristics were studied on a large scale using behaviour diaries in the Netherlands (e.g., Van der Drift and Vos 1987). These studies showed that, in lecture-based courses, students concentrated their study time in the days before the exam and did not spend much time studying throughout the course unless specific deadlines required them to do so. Research on the relationship between student performance and time spent studying has yielded mixed results (Credé et al. 2010; Dollinger et al. 2008; Nonis and Hudson 2006; Schuman et al. 1985). A meta-analysis on class attendance and student performance in higher education in the USA has shown that class attendance predicts almost 20% of unique variance in college grade point average (GPA) over standardized achievement scores and 13% of unique variance over high school GPA (Credé et al. 2010). It is unclear, however, to what extent prior research on the flipped classroom was conducted with mandatory class attendance (Dove 2013; Mason et al. 2013; McLaughlin et al. 2013; Tune et al. 2013). Therefore, lecture attendance, which was not mandatory, was also taken into account in the present study.

Student regulation of learning

The motivation of students is considered an important prerequisite for their ability to regulate their learning process. It is beyond the scope of this paper to extensively review theories of motivation and self-determination (Deci et al. 1991; Niemiec and Ryan 2009) and self-regulation (Zimmerman 1990). Traditionally, self-regulated learning has been seen as a trait that some students possess: those who score high on self-regulation perform well in any educational context, regardless of how it is designed. Other research has focused on the study skills employed by students and the potential of training study skills in order to improve student performance (Hattie et al. 1996). From this perspective, students who master study skills will be able to regulate their learning because they possess the skills to learn in an effective manner. The demands of varying learning environments, however, are disregarded.

A line of research in which the learning environment has been included is that of the learning orientations (Vermunt and Vermetten 2004). From this perspective, students’ regulation of learning can take on different typologies that are to some degree stable as an individual trait but also subject to change in different educational contexts. Furthermore, Vermunt and Verloop (1999) recognized that the degree of student regulation and varying degrees of external regulation depending on specific educational contexts can affect the learning process. Vermunt and Vermetten (2004) noted that ‘especially when students enter a new type of education, there may be a temporary misfit, or friction, between the students’ learning conceptions, orientations, and strategies, and the demands of the new learning environment’ (p. 280). This seems especially relevant in the context of implementing the flipped classroom, where what normally happens in the lecture must now be done by students and what students normally do at home is done during the lecture.

Research questions

The primary goal of the present study was to explore student study behaviour throughout a flipped and a regular course. Based on the literature discussed above, two main research questions were formulated: (1) How do students study throughout a flipped and a regular course? and (2) To what extent is study behaviour in a flipped and a regular course related to student performance? These main questions were investigated using quantitative data, and to complement these findings with insights from qualitative data, a third exploratory question was formulated: To what extent did students in the flipped course refer to regulating their learning in the course evaluations?



Method

The students’ study behaviour was investigated in both a flipped and a regular course on introductory statistics. The students in the flipped course were enrolled in the pedagogical science major, and the students in the regular course were enrolled in the psychology major. While these are different groups of students, they are the most similar groups possible in terms of size and composition. Since a cohort-comparison design was not feasible, the present design was the fairest comparison possible in an ecologically valid setting. A total of 205 students completed the flipped course, and 295 completed the regular course.

Course design

For students in both the flipped and the regular course, this was their second introductory course on descriptive and inferential statistics. The courses were taught in the first half of the second semester, February–April 2014, and covered the same material using the same book. The two courses had different instructors, each of whom had taught the first introductory statistics course to the same group of students in the previous semester. In prior years, the instructors had worked together to develop the curriculum.

Students in both courses were required to participate in 7 mandatory practical meetings (in groups of about 20 students), for which they had to complete homework. The content of the practical meetings and that of the homework assignments was almost identical, with document analysis revealing an average of 80% exact overlap across weeks. Sufficient practical meeting attendance and handing in the homework on time were prerequisites for being allowed to take the final exam. The score on the final exam determined 100% of the students’ final grade. Lecture attendance was not mandatory in either course. The regular course consisted of 7 lectures, whereas the flipped course consisted of 13 lectures, a difference that was also present between the courses before the flipped classroom was implemented.

As part of the flipped classroom design, students in this course had the opportunity to view a 15-min lecture-preview video. Furthermore, before each lecture, every student was required to hand in at least one question about the material to be covered, for at least 8 out of the 13 lectures. During the lecture, the students were presented with problems in the form of multiple choice questions. First, the students answered each question individually, using their smartphone or laptop. Next, they were given time to discuss their answers with peers and then answered the question again in the same way. Subsequently, the lecturer discussed the answers, also referring to the questions the students had sent in prior to the lecture.

Materials and procedure

Study behaviour

The students were invited to fill out an online diary of their study behaviour on Mondays and Fridays throughout the course. On Fridays, the students were asked to report on their study activities from the previous Monday through Thursday, and on Mondays, on their study activities from the previous Friday through Sunday. For each day, the online diary contained three questions. The first was ‘Did you study for statistics last {Monday, Tuesday, Wednesday, Thursday}?’ If the answer was no, the diary skipped to the next day. If the answer was yes, the following question was ‘Which of the following activities did you conduct on Monday … ?’ and the final question asked the students to indicate how much time they had spent on the activities selected in the prior step.

In collaboration with the course lecturers, the following study activities were included in the diary for the regular course: reading the material, summarizing the material, working on homework, completing practice (exam) questions, receiving extra tutoring, and ‘other’ (which could be specified by the students). For the flipped course, the topics were identical, with one extra topic: watching the online video lectures. For all the activities, the students could select time slots of 15 min, ranging from 15 min to 5 h (20 options). The diary was designed using Qualtrics (see Online Resource 1 for an example of the behaviour diary used in the flipped course). As an incentive for consistent participation in the online diaries, 20 gift vouchers were raffled among the students who had responded to at least 80% of the diaries both halfway through and at the end of the course.

Lecture attendance

During each lecture, the students of both courses were invited to place a check next to their name on a list to indicate presence. It was made clear to the students that checking their attendance was for the purpose of research and was in no way related to assessment in the course. Halfway through the course and at the end of the course, lecture attendance by student number was published on the course website inviting the students to check its accuracy and contact the researcher with corrections. Across both courses, 14 students notified the researcher with corrections in the lecture attendance list.

Student performance

The final exam consisted of 30 multiple choice questions for the regular course and 36 multiple choice questions for the flipped course. Of these, 28 questions were identical across the two exams. Therefore, the number correct out of these 28 overlapping questions was used as the measure of student performance in this study.

Student evaluations of the flipped course

In accordance with university policy, anonymous course evaluation forms were handed out during the final exam and collected as the students left. The institutional course evaluations contained three open questions that were formulated as follows: (1) How could this course be improved? (2) What were you most satisfied with in the course? and (3) What did you learn most by following this course?


Analysis

In order to answer the first research question (How did students study throughout a flipped and a regular course?), the number of days studied, the total time studied, and the total time spent on specific learning activities were computed separately for each week of the course. For these measures, respondents who had not completed both diaries for a particular week were excluded for that week. The specific activity tutoring and the open category ‘other’ were excluded from the analyses because they occurred too rarely.

In order to answer the second research question (To what extent is study behaviour in a flipped and a regular course related to student performance?), the total number of days studied, the total time studied, and the total time spent on different learning activities were computed over the entire course for every respondent. No respondents were excluded from these analyses, and the totals were divided by the number of days a respondent had participated to put them on a comparable scale. Multiple regression was used to investigate whether the amount of time spent studying, the number of days studied, and the number of lectures attended explained variance in student performance. Weighted least squares regression (cf. Draper and Smith 2014, Ch. 9) was used, with the weight attached to each individual’s responses proportional to the number of diaries completed. This way, respondents who completed more diaries contributed more information to the regression model.
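The weighted least squares step described above can be sketched as follows. This is an illustrative reconstruction, not the authors’ code: the variable names are hypothetical and the data are simulated, with weights given by the number of diaries completed, as in the analysis.

```python
import numpy as np

# Simulated stand-ins for the diary data (hypothetical, for illustration only)
rng = np.random.default_rng(0)
n = 80
days = rng.integers(5, 30, n).astype(float)       # days studied over the course
hours = rng.uniform(10, 60, n)                    # total hours studied
lectures = rng.integers(0, 14, n).astype(float)   # lectures attended
weights = rng.integers(1, 20, n).astype(float)    # diaries completed (of 19)
score = 14 + 0.2 * lectures + rng.normal(0, 4, n) # exam items correct (of 28)

# Design matrix with intercept
X = np.column_stack([np.ones(n), days, hours, lectures])

# WLS is OLS on rows scaled by the square root of each weight
w = np.sqrt(weights)
beta, *_ = np.linalg.lstsq(X * w[:, None], score * w, rcond=None)

# Weighted R^2: 1 - weighted RSS / weighted TSS about the weighted mean
fitted = X @ beta
ybar = np.average(score, weights=weights)
rss = np.sum(weights * (score - fitted) ** 2)
tss = np.sum(weights * (score - ybar) ** 2)
r2 = 1 - rss / tss
```

Scaling each row by the root of its weight makes an ordinary least squares solver minimize the weighted residual sum of squares, so respondents with more completed diaries pull the fit more strongly.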

In order to answer the third research question (To what extent did students in the flipped course refer to the process of regulating their learning in the course evaluations?), analysis of the course evaluations began by reading and re-reading student responses, in search of elaborations relating to the regulation of the learning process. Examining the course evaluations in this way can be considered a deductive approach to thematic content analysis (Burnard et al. 2008; Elo and Kyngäs 2008). Due to the focus of the research question in the present study, evaluations with affective statements (like or dislike X) or opinions that were explained without reference to the learning process were excluded from the analysis.


Results

Response rates

Response rates for the bi-weekly diaries for both courses are depicted in Fig. 1. This figure shows that the response rate, both initially and throughout the course, was larger for the students in the flipped course. A total of 78 students (26%) in the regular course and 98 students (48%) in the flipped course completed at least one of the 19 bi-weekly diaries. For respondents to the first diary in the regular course, the mean age was 19.5 (SD = 1.3), with 16% male; for respondents to the first diary in the flipped course, the mean age was 19.4 (SD = 1.6), with 2% male. An average of 11 diaries was completed by respondents in both the flipped and the regular course. However, the distribution of the number of completed diaries differed: 24% of respondents in the regular course completed one diary and 8% completed all of them, whereas in the flipped course, 3% of the respondents completed one diary and 17% completed all of them. Students who completed at least one study behaviour diary had, on average, one more question correct on the final exam than the students who never completed a diary (see Table 1).

Fig. 1
figure 1

Response rates on the bi-weekly diaries for a flipped and a regular course

Table 1 Student performance in the flipped and the regular course compared between respondents and non-respondents

How did the students study throughout a flipped and a regular course?

Figure 2 shows how many days and how much time the students spent studying each week throughout the course. The students spread their studying for statistics over 1 to 3 days each week, rising to 4–5 days in the last week before the exam. Furthermore, Fig. 2 shows that the students spent no more than about 2 to 4 h studying per week throughout the course, and 12–16 h on average in the last week. In the first 2 weeks, the students in the flipped course spent more time studying, whereas in weeks 8 and 9, they spent less time studying than the students in the regular course. Overall, the students’ study behaviour in terms of days and hours spent studying was rather similar in both courses. The students in the regular course who responded to the study behaviour diaries attended about 4 out of 7 (57%) lectures on average, and those in the flipped course attended 8 out of 13 (61%) lectures on average.

Fig. 2
figure 2

Total amount of time and number of days studied each week throughout the course with 95% confidence intervals. Note: weeks 9 and 10 were combined for the number of days studied

The students did not spend more than 2 h each week on average reading the course material but spent 4–5 h on average reading the course material in the week and a half before the exam (see Fig. 3). The students in both the regular and the flipped classroom spent less than 1 h per week summarizing the material and studying the lecture slides. In the week and a half before the exam, the students spent about 4–6 h studying the lecture slides and about 7 h studying or making a summary. Throughout the course, the students spent less than 1 h per week practising the material, but in the week and a half leading up to the exam, the students spent about 12 h on average practising the material. For the amount of time spent on homework, Fig. 3 shows a dip in week 4 which can be explained by the fact that there was no required homework that week. Figure 3 shows that respondents in the regular course spent more time reading and practising in about week 8 of the course. Overall, there do not appear to be many clear differences between the flipped and regular courses in how the students studied.

Fig. 3
figure 3

Amount of time spent on different study activities throughout the course with 95% confidence intervals

How was study behaviour in a flipped and a regular course related to student performance?

Table 2 shows the correlations between student performance, the spread of study behaviour in number of days, the total amount of time spent studying, and lecture attendance. In both courses, the correlation between student performance and lecture attendance was strongest, but still fairly small (r = .23). A multiple regression model with the predictors number of days, total time, and number of lectures attended did not explain a substantial amount of variance in student performance in either the flipped or the regular course (flipped course: R² = .07, adjusted R² = .04, F(3, 94) = 2.5, p = .07; regular course: R² = .02, adjusted R² < .01, F(3, 73) = 0.33, p = .81). The variance inflation factors did not indicate problems with multicollinearity (see Table 3 for more details).

Table 2 Correlations between student performance and study behaviour (with 95% confidence intervals computed using Fisher’s Z transformation)
Table 3 Multiple regression results for student performance in the flipped and the regular course weighted by the number of days a diary was completed
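The confidence intervals in Tables 2 and 4 rest on Fisher’s Z transformation, which maps a correlation to an approximately normal scale before forming the interval. A minimal sketch, in which the sample size used in the example call is hypothetical:

```python
import math

def fisher_z_ci(r, n, z_crit=1.96):
    """Approximate 95% confidence interval for a Pearson correlation
    of r from a sample of size n, via Fisher's Z transformation."""
    z = math.atanh(r)                # z = 0.5 * ln((1 + r) / (1 - r))
    se = 1.0 / math.sqrt(n - 3)      # standard error on the z scale
    lo = math.tanh(z - z_crit * se)  # back-transform the bounds
    hi = math.tanh(z + z_crit * se)  # to the correlation scale
    return lo, hi

# e.g. a correlation of r = .23 (as for lecture attendance and performance),
# with a hypothetical sample of n = 98 respondents
lo, hi = fisher_z_ci(0.23, 98)
```

Because the transform is monotone, the back-transformed interval is asymmetric around r, which is why such intervals are preferred over a naive symmetric interval for correlations.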

The relationship between student performance and specific study activities was also explored (see Table 4). With the exception of practising in the flipped course (r = .25), none of the study activities showed a statistically significant (α = .05) relationship with student performance. The correlations between study activities and student performance in the regular course were small and not statistically significant. Furthermore, the correlations between study activities did not appear to indicate multicollinearity in either course; the largest correlation was found between practising and making a summary in the regular course (r = .41).

Table 4 Correlations between student performance and study activities (with 95% confidence intervals computed using Fisher’s Z transformation)

To what extent did the students refer to regulating their learning in course evaluations?

The course evaluations were completed by 173 (84%) of the students in the flipped course. Of the students who responded, 58 provided evaluations containing elaborations that referred to the regulation of the learning process. Three of these evaluations contained elaborations referring to different aspects of the regulation process, leading to 61 comments, which were split into six themes to best reflect their content. Table 5 shows that themes reflecting a positive experience in the regulation of learning (video lecture supports learning, participation in lecture supports learning, and procrastination prevented) were outnumbered by themes reflecting negative experiences (more student regulation desired, more passive explanation desired, and student regulation necessary to benefit from lecture). See Table 5 for an example comment for each theme; the implications of these themes for implementing the flipped classroom are taken up in the discussion.

Table 5 Themes that emerged from student evaluations that referred to the regulation of the learning process

In reading and re-reading the course evaluations in search of references to learning regulation, two other themes also emerged pertaining to the students’ experience with the flipped environment as a whole. While these did not directly answer the research question, they do contribute to understanding how students coped with the change in the learning environment as a result of implementing the flipped course. The first additional theme was that students (n = 32) referred back to the design of the previous introductory statistics course to indicate how they felt the course should be designed. For example, one student remarked, ‘I liked the way it was in Statistics 1 because it gave me a better understanding of the material’. Secondly, several students (n = 6) demonstrated particular beliefs about who would benefit from the flipped design. A typical remark was ‘I think this method works well for students who are able to learn statistics easily’.


Discussion

The aim of this study was to explore study behaviour throughout a course, the extent to which study behaviour was related to student performance, and how students evaluated their ability to regulate their learning in the flipped course. By studying the time and activities of students throughout a course, student engagement was operationalized differently in the present study than in other research on student engagement (Kahu 2013). There was no clear evidence that the students in the flipped course had a different study pattern from those in the regular course. The general pattern in both courses showed some study behaviour throughout, with a strong peak in the last week before the exam. This was also the case for the flipped course, where the students mostly reported watching video lectures right before the exam. In contrast, Tune et al. (2013) found that students reported watching 75–100% of the video lectures. With this type of retrospective behavioural question, practitioners may incorrectly conclude that students complied with the change in study behaviour asked of them in a flipped classroom. Thus, the approach used to study the pattern of students’ study behaviour in the present study may be promising for further research into the success of implementing flipped classrooms.

The meta-analysis of Freeman et al. (2014) and much research on the flipped classroom have compared student performance in different learning environments (Davies et al. 2013; Jensen et al. 2015; Mason et al. 2013; McLaughlin et al. 2013; Pierce and Fox 2012; Street et al. 2015; Tune et al. 2013). While student performance can be compared between the flipped and the regular course for the present study using the information in Table 1, the fairness of this comparison can be disputed due to differences in student populations, lecturers, and course design. This is a common problem for studies conducted in the real world instead of in a controlled lab setting. Instead, the focus in the present study was on the relationship between study behaviour and student performance in a flipped and a regular course. Tomes et al. (2011) found that active learning activities such as self-testing and practising behaviour were more related to student performance than passive strategies such as reading. Although there was also a statistically significant relationship for the flipped course between practising and student performance in the present study, other correlations between study behaviour and the final exam were very small. An important difference between the present study and that of Tomes et al. (2011), however, is that they only examined study behaviour in the 10 days right before the exam. Given the expected change in students’ study behaviour when implementing the flipped classroom, it is especially important to investigate the study behaviour throughout a course rather than only at the end of a course.

The present study is one of few to use qualitative methods to investigate student experiences with the flipped classroom, in line with the call for more qualitative research on student behaviour in the flipped classroom by Abeysekera and Dawson (2015). While the institutional evaluations did not contain questions about the regulation of the learning process, or even about the flipped design of the course specifically, they gave some insight into how students thought about their ability to regulate their learning in the flipped course. Six themes emerged that showed both congruence and friction in how students were able to regulate their learning in relation to the specific demands of the flipped course (Vermunt and Vermetten 2004). The first three themes in Table 5, video lecture supports learning, participation in lecture supports learning, and procrastination prevented, can be interpreted as evidence for congruence. With the video lectures, the students were able to take in the information at their own pace, with the ability to pause and rewind. This can be very beneficial to the learning process compared to a lecture where all students are subject to the same pace, determined by the lecturer. The interactive nature of the lectures in the flipped course was recognized to contribute to better understanding, and several students also recognized how the flipped course as a whole helped them stay engaged throughout. These themes demonstrate the powerful potential of the flipped classroom to contribute to the learning process.

Themes also emerged, however, that showed friction between student regulation of learning and the design of the flipped course. Three themes, more student regulation desired, more passive explanation desired, and student regulation necessary to benefit from the lecture, showed how students struggled with regulating their learning in the flipped course. Students expected the freedom to regulate their learning in their own way and time outside the lecture. Thus, having to hand in questions about the material and prepare for the lectures each week did not always fit with their own agenda. Furthermore, a number of students indicated that they felt that the lack of passive explanation in the lectures impaired their learning process. The number of students who mentioned wanting more passive explanation was three times the number who mentioned appreciating active participation in the lecture. Given that active-learning lectures have been found to lead to better student performance (Freeman et al. 2014), this shows a conflict between what students want and what works. This demonstrates a serious challenge to educators aiming to implement not only the flipped classroom but also other forms of active lectures. Thirdly, a number of students recognized that it was necessary to prepare in advance of the lecture in order to benefit from it. The friction in this theme is that while students recognized the potential of the lecture to support their learning, their inability to regulate their learning accordingly prevented the lecture from doing so.

The mix of themes showing both congruence and friction may explain why the pattern of study behaviour in the flipped and the regular course, as explored in the first research question, did not appear to differ. The two additional themes that emerged also corroborate this. Students found it hard to deal with the fact that the learning environment had changed compared to their prior experience. This could have led students to refuse to comply with the change in study behaviour asked of them in a flipped course. The second additional theme showed that some students did not believe they could benefit from the flipped course design. These beliefs may also have prevented students from trying to change their study behaviour in such a way that they could benefit from the flipped course.

Although the student evaluations did not refer to the mandatory practical meetings, these meetings could be an alternative explanation for the similarities between the study patterns in the flipped and the regular course. Because of these additional required practical meetings, the courses may already have contained sufficient active learning elements, limiting the added value that, based on Freeman et al. (2014), could be expected from implementing the flipped classroom. Though Jensen et al. (2015) did not investigate study behaviour, they found no differences in student performance between students of a flipped active learning course and those of a non-flipped active learning course.


The present study and that of Tomes et al. (2011) are rare cases in which students’ study behaviour was investigated using diaries. A limitation of this approach is that behavioural diaries can take on many different forms, which makes it harder to use a validated instrument. The advantage of this approach, however, is that a diary can be tailored to specific contexts and research questions. As such, it was an appropriate method to investigate how students studied throughout a course in the present study.

A second limitation of using behavioural diaries was the burden on respondents. Daily measurement as in Tomes et al. (2011) would have been an extraordinary burden on students; therefore, the present study used bi-weekly diaries. Nevertheless, the response rate was rather low despite the participation incentive. While dropout is not uncommon in longitudinal research (Lugtig 2014), it may harm the representativeness of the conclusions. More research is necessary on how to measure student study behaviour in an optimal way while keeping the burden of participation low. It is important for this methodological research to continue to take the measurement error of self-reported behaviour into account (Lugtig et al. 2015). Despite these limitations, using diaries is a promising way to move away from general questionnaires with Likert scales that measure trait-like attributes towards longitudinal studies that measure actual behaviour in the context of different learning environments.

A limitation of the use of institutional course evaluations was that student perspectives on regulating their learning in a flipped classroom environment were not directly targeted in the questions. Further qualitative research focused on students’ regulation of the learning process in a flipped environment may yield additional or different themes than those found in the present study. Since the course evaluations were anonymous, they could not be linked to students’ reported study behaviour throughout the course. Therefore, further research needs to be conducted on how student perceptions about their ability to regulate learning impact their study behaviour. Such research could help determine whether the relationship between study behaviour and student performance differed between students who showed evidence of congruence and those who showed evidence of friction.

Practical implications

This study reports the results of a localized intervention, as proposed by Abeysekera and Dawson (2015), but several recommendations for implementing the flipped classroom in large courses may be helpful for practitioners in other institutions:

Consider prior history between the lecturer and the students

In this study, students following the flipped course already had a history with the lecturer. This led to expectations about how the course would be taught and how students would need to study during the course. Therefore, when implementing a flipped course for a large group of students already familiar with the lecturer, more effort may be needed to help students adapt to change. Prior history between a lecturer and students, however, may also be conducive to adapting to change, particularly in courses with small numbers of students, as the distance between the lecturer and the student may be perceived as much smaller.

Consider the broader academic context in which the flipped course is implemented

Depending on how other courses in the higher education curriculum are designed, it may be more or less evident to students what type of study behaviour is expected in a flipped classroom and what the advantages of a flipped design are. In the present study, no other courses in the curriculum were implemented in a flipped format, and while some students clearly saw the benefits and intentions of the design, it was not so evident for many others.

Communicate expectations

Especially when teaching large groups of students, it may be difficult to gauge student beliefs about the effectiveness of a particular course design and to pinpoint when students experience friction that is destructive to the learning process. A teacher can, however, address why the flipped course can be beneficial for students’ learning process and how students can benefit most. In courses with large numbers of students where attendance is not mandatory, the class composition may differ each lecture. In such cases, it may be necessary to address the benefits and potential of the flipped classroom on a regular basis, to help more students recognize whether they need to change their study behaviour.


It is important to recognize that when the flipped classroom is implemented, the demands of the learning environment change for students. The present research is among the first to consider students’ study behaviour throughout a course in which the flipped classroom is implemented. The results do not suggest that implementing a flipped classroom is a quick fix that leads to improved student performance. Some students may benefit from the flipped design, but it may also be a source of frustration for others. More research is necessary to understand when and why implementing the flipped classroom is successful.