1 Introduction

The notion of learning analytics can be defined as the collection, analysis and reporting of data on students and their contexts for purposes of optimising learning and the environments in which it occurs [1]. This broad definition covers the majority of approaches to generating and analysing educational data that, of course, have various origins and scopes.

First, data on students’ learning outcomes and wellbeing comprise large sets of machine-readable data amenable to statistical analyses. Since it would not be practical to handle these data manually, automated analysis is often conducted by external stakeholders. For example, such learning analytics are organised by school authorities at national, regional and municipal levels, by national statistical agencies and data warehouses, or by international organisations (e.g. the OECD, IEA and WHO).

The emergence of very large datasets, driven by the ongoing explosion in the availability of many different types of data, is often referred to as big data. A culture of using big data to improve student learning is still in its infancy. However, Danish stakeholders will be “reporting about the performance of the school system to the public” more frequently in the future and, intentionally, “the reporting framework could form the basis for the periodic publication of key national analytical reports” [2].

Typically, the publication of such results and reports takes too long to provide information about current students in time to fully serve teachers in primary and secondary education. For example, statistical analyses based on summative evaluations and dropout rates from previous years do not really support current learning activities. From the teachers’ point of view, analytics based on exams, tests or surveys are less relevant when they regard previous students rather than the students currently in the teachers’ classrooms. For formative purposes, teachers need real-time analytics regarding the effort, persistence and behaviour of present students rather than of previous generations of students. To be proactive in their classrooms, teachers also need up-to-date analytics regarding current learning environments instead of previous environments.

Moreover, big data sets are often considered overwhelming by teachers, and it is a challenge for them to make use of the patterns and trends [3]. For example, Danish teachers in primary and lower secondary education are encouraged by the Ministry of Education to make use of the results of national mandatory computer-based summative tests of all students. The tests are adaptive and items vary from student to student. Since many teachers have difficulties interpreting the tests’ results and applying them to future learning plans, these results are seldom used systematically for formative purposes [4]. In addition, teachers in youth education do not consider it fair to search and use this type of data on student performance in lower secondary education because learning requirements and activities are different at the upper secondary level [5].

Another example concerns sets of data on student wellbeing, which are currently generated at all public schools in Denmark. Student wellbeing is not defined or operationalised properly [6], but data on this issue are analysed under the auspices of the Ministry of Education for accountability reasons. Even though the Ministry arranges courses for school representatives in this field, the learning analytics are seldom used for formative purposes and the results of this type of assessment seldom influence teachers’ planning of future teaching and learning activities [5].

The only exception to this general picture concerns students’ literacy, i.e. their ability to understand and process oral and written language [7]. In Denmark, most public schools employ reading advisors who support literacy development and the use of text-to-speech software [8].

To make proactive advising possible at scale, results of previous tests are used at the upper secondary schools in combination with current reading and writing tests. For example, contemporary data from language, reading and writing tests are analysed for proactive advising purposes.

As already mentioned, educational data have various origins. Besides big data, teachers can generate and analyse smaller sets of data on their own. This teacher-driven learning analytics implies a transformation of the teachers’ role in relation to learning analytics, i.e. a shift from end-user to analyst of data. The latter role is the main topic of this paper, which examines teacher-driven analytics of teaching and learning environments. The primary goal is to support teachers’ decision making, and the research question is: How can teachers generate and analyse data to help improve their practice and the learning outcomes for their students?

There is research evidence suggesting that analyses of equally weighted data from assessments of students’ academic performance, teachers’ observations and student surveys produce valid and reliable results that correlate with student performance [9]. Data from national tests or assessments of students’ higher-order conceptual understanding alone are not sufficient to obtain the most reliable results. In other words, teachers have to focus on composite sets of data. To obtain the most valid, reliable and accurate analytical results, teacher-driven analytics should include learning outcomes, data from the teachers’ own observations in the learning environment and students’ self-reported perceptions of teaching and the learning environment [10].
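As a minimal sketch of the equal-weighting idea, the three data sources could be combined into one composite indicator per student once each source is on a common scale. All names, values and the 0–1 scale below are illustrative assumptions, not the instruments cited above:

```python
# Equal-weight composite of three data sources per student.
# Assumes each source has already been normalised to a common 0-1 scale;
# names and values are illustrative, not the cited instruments.

def composite_score(assessment, observation, survey):
    """Average three equally weighted scores on a common 0-1 scale."""
    return (assessment + observation + survey) / 3

students = {
    "student_a": composite_score(0.72, 0.65, 0.80),
    "student_b": composite_score(0.55, 0.60, 0.50),
}
for name, score in students.items():
    print(f"{name}: {score:.2f}")
```

Equal weighting is the simplest choice consistent with the cited finding; a teacher team could of course adjust the weights if one source proves more informative in practice.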

Student surveys are easy to administer in digital learning environments and a relatively inexpensive way to supplement other sets of data. To survey something as broad as the teachers’ efforts and the learning environment requires breaking these issues down into various constructs that can be measured by simple questions [10]. For example, the students can be asked if they agree or disagree with statements like:

  • “My teacher has several good ways to explain each topic that we cover in this class”

  • “When I turn in my work, my teacher gives me useful feedback that helps me improve”.

Responding to such questions, students provide information on specific aspects of teaching and the learning environment, so that teachers can improve their use of time, the application of information and communications technology (ICT) into the learning environments, the provision of feedback and their relationships with their students if needed.
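The survey items above can be rolled up into the constructs they measure by averaging responses per construct. The item keys, construct names and Likert range below are illustrative assumptions for the sketch:

```python
# Aggregate Likert-style responses (1 = disagree ... 5 = agree) into
# per-construct averages. Item keys and construct names are illustrative.
from statistics import mean

responses = [  # one dict per student
    {"explains_well": 4, "useful_feedback": 5},
    {"explains_well": 3, "useful_feedback": 4},
    {"explains_well": 5, "useful_feedback": 3},
]

# Each construct is measured by one or more simple questions.
constructs = {
    "clarity": ["explains_well"],
    "feedback": ["useful_feedback"],
}

for construct, items in constructs.items():
    scores = [r[item] for r in responses for item in items]
    print(f"{construct}: {mean(scores):.2f}")
```

In practice, a construct would be measured by several items rather than one, so that a single ambiguous question does not dominate the construct average.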

There is research evidence suggesting that the students’ responses demonstrate relative consistency, capture aspects of teaching and the learning environment that relate to desired learning outcomes and point to strengths and areas for improvement that the teachers value [9].

Teachers have to analyse data representing something indisputably important, including environmental facilitators and impediments [11]. Since summative assessments disengage some students, it is important to assess students’ learning behaviour for formative purposes [12]. In particular, it is important to assess students’ understanding of goal systems and outcome expectations [13].

Besides data on students’ learning behaviour, teachers often focus on motivation that, among other things, depends on students’ expectation and appreciation, i.e. their level of expectations before new learning activities and their level of appreciation during and after these activities [14]. Teachers can acquire knowledge in this field by analysing students’ perceived self-efficacy. Perceived self-efficacy is a theoretical construct that relates to students’ beliefs in their capabilities to solve given tasks and accomplish new tasks [15]. Items measuring these beliefs are closely related to students’ perceptions of their capabilities to successfully undertake the actions required to complete different tasks [16]. An example question is [17]:

  • “How confident do you feel about having to do the following mathematics task – calculating the gas mileage of a car?”.

The extent to which students believe in their own ability to handle tasks effectively influences their behaviour in educational settings. Their confidence in being able to solve such tasks correlates strongly with their test achievements in the PISA studies [18]. Compared with other factors, it is the one factor that correlates best with students’ achievement [19]. Consequently, perceived self-efficacy is a reliable non-cognitive indicator of students’ performance [13, 20].

Students who do not believe in their ability to accomplish new tasks might not exert the effort needed to complete their tasks successfully. Often, students with low perceived self-efficacy do not appreciate self-directed learning [18]. For example, on average 30% of 16-year-old students feel helpless doing problem solving in mathematics, and some of them have ‘mathematics anxiety’, which is associated with a score lower by the equivalent of almost one year of school work [18].

In sum, teachers should give high priority to analysing factors that influence students’ perceived self-efficacy [21]. This paper presents results of research concerning the implementation of teacher-driven learning analytics in educational settings (here in Danish schools). In particular, it focuses on students’ self-efficacy as a parameter to be analysed together with their academic performance.

2 Research Design

To explore teacher-driven learning analytics, we applied a mixed-methods research design in which data were generated in several ways. Initially, we generated data about the implementation of teacher-driven analytics in public schools. More than 32,000 teachers were involved in the largest research and development programme in Denmark to date. More than 500 focus group interviews were conducted to generate data on the pros and cons of teacher-driven analytics at public schools. In addition, digital questionnaires for teachers and students were used to generate information about the benefits of the teacher-driven analytics.

Afterwards, we surveyed digitally innovative teachers about the application of ICT in 1:1 learning environments at upper secondary schools in the Danish Central Jutland Region. In 2015, 127 teachers responded (response rate 95%). In 2016, 64 teachers responded (response rate 81%). In 2017, 14 teachers from case schools participated (response rate 93%).

In addition, we conducted semi-structured focus group interviews at the participating schools with representatives of these teachers. Major themes were the teachers’ application of locally generated data and national indicators regarding student learning outcomes, dropout rates and wellbeing.

Furthermore, we did a survey among students in youth education regarding learning experiences. In 2015, a total of 446 students from 25 classes at eight of the participating upper secondary schools answered questions about their teachers’ efforts applying ICT into the learning environments (response rate 76%). In 2016, the respondents were 221 students from 12 classes at nine upper secondary schools (response rate 81%). In 2017, the respondents were 131 students from six classes at five upper secondary schools (response rate 86%).

3 Teacher-Driven Learning Analytics

The comprehensive Danish research and development project represents proof of concept regarding learning analytics serving teachers. Usually, teachers meet every two or three weeks to generate and analyse data on persistent impediments in learning environments. On these occasions, they also draw on relevant theories and research results [22]. In line with general recommendations, they concentrate on specific situations that they can improve rather than on more general learning conditions that are out of reach [23]. At first, they identify what they can and want to improve. Then, they systematically make sense of the educational data and take action based on the results of their analyses to prevent persistent learning difficulties and raise students’ motivation. In other words, they make changes and gauge the effects on the learning environment and student performance. For example, a group of teachers would:

  • Identify some educational factors that persistently create and sustain problematic situations

  • Generate data on these factors

  • Analyse and comprehend these data

  • Identify some causes of the challenges

  • Decide which factor(s) to change

  • Plan how to implement this change

  • Take action.

The regular teacher-driven analytics corresponds to systematic formative evaluation, which generally increases students’ learning outcomes [24]. Our research provided evidence suggesting positive consequences of the data-driven approach: the groups of teachers act on their analyses, implementing important and necessary changes in the learning environments [22].

Since it takes time to systematically analyse facilitators and impediments in the learning environments, the available time sets a limit to the amount of data that teachers can generate and analyse. When schools, however, allocate additional time to teachers to build teams and analyse data closely related to students’ learning, the results can be very successful.

When the teachers systematically increase their understanding of factors that create and sustain challenges in the learning environments, they develop their own analytical competencies and professional identity [22]. They also develop common pedagogical terminology and practice [25]. In particular, such practice fosters ongoing community dialogue regarding educational challenges and a shared understanding regarding student progression [26]. In addition, it fosters reflections concerning these challenges that guide improved teaching practices [27].

This, in turn, enhances teachers’ professional wellbeing and strengthens their cooperation [22]. After 3½ years of teacher-driven learning analytics, the teachers’ collaboration increased to 573 on a scale with a set average value of 500, and their wellbeing increased to 525 on the same scale [28].

In Denmark, teacher development has traditionally been conceived of and aimed at building the capacity of the individual teacher to enable him or her to perform better in the learning environment. There is, however, research evidence suggesting that such individual development is less effective than comparable group-based approaches [29]. Teachers are more likely to use what they have learned when they engage in “job-embedded learning featuring teacher collaboration and use of coaches” [30]. Likewise, this collaborative approach improves students’ learning [31]. Consequently, the better-performing countries in the world do not aim to have a few expert teachers at each school, but promote team-based solutions enabling teachers to claim ownership in shaping educational practice and to sustain improvement in the classrooms [29].

The Danish research and development project provides research evidence suggesting that teacher-driven learning analytics is feasible when organised as teamwork. Better than the individual teacher, teams of teachers master a data-driven approach, share knowledge about how students are learning and discuss other implications of data to support student learning [2]. This is in line with the theory of learning organisations, according to which teams of professionals build shared visions and engage in proactive team learning [32].

In the research literature, such teams are identified as affinity groups [33], study groups [34], professional networks [35] and professional learning communities [34]. Building affinity groups of teachers that meet regularly for the purpose of joint educational development enhances the collective capacity of teachers to create and pursue improved learning conditions for their students, and it is considered one of the most successful methods of professional development [36].

4 Learning Analytics in 1:1 Environments

There is research evidence suggesting that teacher-provided formative feedback influences students’ learning outcomes [37]. The provision of formative feedback helps teachers to acquire an overview of the progression of learning and to enhance students’ learning outcomes and perceived self-efficacy. Danish teachers in youth education often use digital technology to assess students’ understanding and performance (Table 1). The respondents entered a number between 1 (representing ‘Not at all’) and 9 (representing ‘To a great extent’). On this scale, responses above 5 represent some or a higher degree of positive experience.

Table 1. Some results of teacher survey (2016).

The confidence interval is rather narrow: with 95% probability, the true value lies within ±0.2 of the reported average. For example, when an average response is 6.8, the true value is between 6.6 and 7.0.
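The ±0.2 half-width follows from the standard confidence-interval formula for a sample mean, mean ± 1.96·s/√n. A quick check with assumed inputs (the sample size and standard deviation below are illustrative, not the survey’s actual figures):

```python
# 95% confidence interval for a sample mean: mean +/- 1.96 * sd / sqrt(n).
# The inputs below are illustrative, not the actual survey figures.
import math

def ci95(mean, sd, n):
    """Return the (low, high) bounds of a 95% confidence interval."""
    half_width = 1.96 * sd / math.sqrt(n)
    return mean - half_width, mean + half_width

low, high = ci95(mean=6.8, sd=2.0, n=384)
print(f"95% CI: [{low:.1f}, {high:.1f}]")
```

With a standard deviation of about 2 (as reported for the student responses), a half-width of 0.2 corresponds to a sample of roughly (1.96·2/0.2)² ≈ 384 respondents.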

Denmark is the first country in the world to implement 1:1 classrooms at all public schools [38] and digital competencies of students are among the highest in the world [39]. In general, students are self-reliant in the digital learning environments with a 1:1 ratio between students and digital units connected to the Internet. Consequently, teachers can regularly use ICT to provide information and feedback. Besides oral feedback, written digital feedback contributes to students’ perceived self-efficacy as well as their learning outcome (Table 2).

Table 2. Some results of student survey (2017)

The standard deviation of the responses is around 2. If an average response is 7.0, this means that approximately one sixth of the students responded below 5. To develop and sustain an inclusive digital learning environment, teachers can thus identify and prevent impediments in the environment that disengage this particular group of students.
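The “one sixth” figure follows from the normal distribution: responses below 5 lie more than one standard deviation below a mean of 7.0 with a standard deviation of 2, and about 16% of a normal distribution falls in that tail. A sketch of the arithmetic, assuming approximately normally distributed responses:

```python
# Fraction of a normal distribution lying below one standard deviation
# under the mean, matching the "approximately one sixth" figure above.
# Assumes responses are approximately normally distributed.
import math

def normal_cdf(x, mu, sigma):
    """Cumulative probability P(X < x) for a normal distribution."""
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# Average response 7.0, standard deviation about 2: responses below 5
# are one standard deviation below the mean.
fraction_below_5 = normal_cdf(5.0, mu=7.0, sigma=2.0)
print(f"{fraction_below_5:.3f}")  # about 0.159, roughly one sixth
```

Since the responses are bounded at 1 and 9, the normal model is only an approximation, but it explains the order of magnitude of the disengaged group.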

Teachers use online dialogues on a regular basis to positively influence the motivation of disengaged students. We did a survey in both 2015 and 2016 on the impact of this online dialogue between teachers and students on students’ perceived self-efficacy (Table 3).

Table 3. Some results of teacher surveys

The response distribution is stable over a short time. The questions with the highest average responses in 2015 also have the highest averages the year after (Tables 3 and 4). In general, the responses were slightly higher in 2016, representing more positive learning experiences. This can be due to, among other things, the teachers’ competence development, the use of ICT and the students’ project work.

Table 4. Some results of student surveys

Whether and to what extent students expect to be able to cope with their educational tasks partially depends on the requirements for their presentations and assignments. Therefore, teachers have to describe requirements and pose questions that are understandable to the students, so that they become aware of what is expected in terms of assignments and presentations. Students’ beliefs about their capabilities to solve given tasks highly depend on teacher clarity on these occasions [25]. Teachers often use ICT to provide information about students’ assignments and the requirements in terms of digital products and/or oral presentations. In 1:1 environments, teachers have a wide repertoire of ways to do so, which students generally appreciate (Table 4).

As already mentioned, students’ learning outcomes depend on how they assess their ability to meet the expectations set out on these occasions. Students who do not believe they can and will do well are less likely to be motivated for self-directed learning in terms of effort, persistence and behaviour than students who expect to succeed [40]. Learning activities that require reading and processing of texts generally weaken these beliefs, and some students therefore appreciate the use of multimodal learning materials, including digital video, that the teachers provide or that the students find on the Internet (Table 5).

Table 5. Some results of student survey (2017)

Our surveys indicate that the teachers most often value the same activities as the students. This includes regular use of video-based or multimodal materials. According to teachers, video-based materials and student products strengthen students’ outcomes from school work and their perceived self-efficacy (Table 6).

Table 6. Some results of teacher survey (2017)

As already mentioned, students’ ability to understand and process oral and written content affects their development of other important competencies. With regard to regular use of text-based learning materials, there is quite a substantial difference in teachers’ and students’ responses. In the teachers’ view, it does not really strengthen the students’ belief in their own capacity. Many teachers consider literacy problems a learning barrier which inhibits students’ perceived self-efficacy.

5 Discussion

When teachers provide formative feedback, it can enhance students’ work in progress and their belief in their own abilities. Therefore, this kind of interaction influences not only their learning outcomes but also their perceived self-efficacy.

In general, it is a win-win situation when teachers evaluate and provide feedback to student drafts using ICT [25]. By assessing and commenting on student efforts, teachers gain an overview of the students’ progress of learning and perceived self-efficacy. In particular, they can obtain and analyse real-time data on students’ perceived self-efficacy and learning experiences, including their fulfilment of learning objectives and learning difficulties.

When teachers initiate new learning tasks, students consider from the outset whether they will be able to solve them successfully. If students believe that they cannot, learning difficulties can be expected, because self-efficacy correlates with learning outcomes. Teachers, however, experience students’ perceived self-efficacy only indirectly. Whenever they initiate new tasks, they thus have to find ways of obtaining information about this self-efficacy. For example, they can ask students to self-report their beliefs in their capabilities to accomplish the new task or series of tasks (Fig. 1).

Fig. 1. Real-time learning analytics regarding student self-efficacy and performance

When teachers want to guide and support students, they can use digital technology to retrieve information about current perceived self-efficacy. By means of simple digital questionnaires, teachers can easily survey students’ learning expectations and perceptions of their capabilities to successfully undertake the actions required to complete specific learning tasks. During learning activities, teachers can, among other things, use the results of these analyses to tailor their provision of feedback to the individual needs of the students.
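A self-efficacy snapshot of this kind can be very simple: students rate their confidence for an upcoming task on the 1–9 scale used in the surveys above, and the teacher flags those below a chosen cut-off for extra guidance. The names and the threshold in this sketch are assumptions for illustration, not part of the study’s instruments:

```python
# Minimal sketch of a digital self-efficacy snapshot: students rate their
# confidence for an upcoming task (1-9, as in the surveys above), and the
# teacher flags those below a threshold for extra guidance and feedback.
# Student names and the threshold value are illustrative assumptions.

THRESHOLD = 5  # responses below this suggest low perceived self-efficacy

def flag_low_self_efficacy(snapshot, threshold=THRESHOLD):
    """Return, alphabetically, the students whose rating is below threshold."""
    return sorted(name for name, rating in snapshot.items() if rating < threshold)

snapshot = {"Anna": 8, "Bo": 3, "Clara": 6, "David": 4}
print(flag_low_self_efficacy(snapshot))  # -> ['Bo', 'David']
```

Repeating such a snapshot at the start of and during a learning activity would yield the real-time data discussed above, at negligible cost to lesson time.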

Future studies can contribute to the development of digital standardised self-report questions regarding perceived self-efficacy that teachers can easily apply into 1:1 learning environments. For example, teachers can obtain and analyse real-time data including snapshots of perceived self-efficacy. These snapshots can be obtained at the beginning and during the students’ learning activities. As already mentioned, they often represent the most reliable non-cognitive indicator of students’ performance.

6 Conclusion

This paper provides evidence suggesting how teachers can generate and analyse data to help improve their practice and the learning outcomes for their students. The paper examines an approach adopted by some of the world’s most improved school systems implementing team based learning analytics. In particular, it builds on research on the implementation of teacher-driven learning analytics in Danish primary and secondary education.

When affinity groups of teachers systematically increase their understanding of factors that create and sustain challenges in the learning environments, they at the same time develop their own analytical competencies and professional identity. This practice fosters an ongoing community dialogue regarding educational challenges in the 1:1 classroom and reflects these challenges to guide teaching practices. In addition, collaborative teacher-driven learning analytics foster the development of a shared understanding regarding student progression. This, in turn, enhances the teachers’ professional well-being and cooperation.

The paper examines analyses of a theoretical construct, perceived self-efficacy, which refers to students’ perceptions of their capabilities to successfully undertake the actions required to complete specific learning tasks. This construct is considered the most reliable non-cognitive indicator of students’ performance, i.e. the one that correlates best with students’ learning outcomes. When students are asked to complete tasks in the 1:1 classrooms, their perceived self-efficacy correlates with their expectations of being able to do so. The greater the self-efficacy, the greater the confidence that a task will go well and the less the fear that it will fail. In youth education, teachers often utilise online dialogue to strengthen students’ perception of their ability to perform new learning tasks. When answering students’ questions, guiding them and providing feedback to them, teachers also foster this perceived self-efficacy.

By identifying and reducing low self-efficacy, teachers can increase students’ expectations regarding their learning outcomes. Using digital tools, teachers can obtain self-reported data regarding perceived self-efficacy. In particular, they can generate data on the students’ beliefs whether they can solve their imminent tasks. Based on results of analyses of these data, teachers can then develop and sustain inclusive 1:1 learning environments.

Future studies can contribute to the development of digital standardised self-report questions regarding perceived self-efficacy, which teachers can easily apply into 1:1 learning environments.