
1 Introduction

The society in which we live is continuously changing and we should be prepared for the opportunities that it brings. In the last 30 years, individuals have been involved in the digital revolution [1]. The emergence of new devices, new technologies, new services, new businesses and even new ways of interacting should be accompanied by changes in learning paradigms [2,3,4]. It is necessary to teach students to understand this digital society and help them to be active and efficient in this context. That is, students should acquire new competences related to their current landscape, in which they should properly use technology, make decisions, work in teams, solve problems, etc.

In order to facilitate this, learning processes should be focused on learners, taking into account that they are digital natives, used to technology and new media contents [5, 6], and that they learn not only in institutional contexts [7,8,9]. Given this context, it is necessary to find new educational approaches that increase students’ motivation and engagement and help them develop useful competences for the digital society.

One of these approaches is Challenge Based Learning (CBL). It encourages students to leverage the technology they use in their daily lives to solve real-world problems [10]. CBL is collaborative and involves not only students and teachers, but also other experts in specific fields. It works by posing a big idea to students; they discuss it and define essential questions about it, and from these questions a challenge is proposed. Students then address the challenge, looking for a collaborative solution that involves their peers, teachers, experts, etc. Finally, the solution is assessed [11].

The easiest way to assess a CBL project would be to evaluate only the final result. However, in this way it is not possible to assess what each student involved in the project has done. Other relevant aspects to analyze are the partial results and the interaction among students and the other stakeholders of the project [12].

The analysis could be done by applying new educational disciplines, such as educational data mining, academic analytics or learning analytics, which offer different but convergent perspectives, methodologies, techniques and tools aiming to facilitate this transformation process [13]. But what is the aim of these disciplines? Educational data mining comprises a set of techniques oriented to extracting knowledge from educational data through statistical, machine-learning and data-mining algorithms, in order to analyze and solve educational research issues [14]. Academic analytics takes a different approach, focusing on the analysis of institutional data about students; therefore, it has a stronger focus on institutional policy decision making [15]. Finally, the main goal of learning analytics is “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs” [16].

There is a wide choice of learning analytics (LA) tools with different aims that can be applied in different contexts [17]. However, the tools employed in CBL approaches can be very heterogeneous. CBL may use a collaborative environment to facilitate the interaction between team members and also a great diversity of tools to solve the challenges [11]. Moreover, depending on the type of challenge, the interaction can be carried out in different ways. For instance, the tools employed in a robotics challenge are not the same as those employed in a biology challenge. Given this context, the present paper aims to explore learning analytics tools and methodologies that can be applied in CBL. The paper presents several possible scenarios and develops an example that uses a collaborative methodology to address the challenge and an ad-hoc LA tool to assess what each team member has done.

The rest of the paper is structured as follows: Sect. 2 describes the advantages and problems of CBL experiments and the tools employed; Sect. 3 presents what kinds of tools can be applied depending on the experiment; Sect. 4 presents an example of the application of a LA tool; and finally, Sect. 5 poses some conclusions.

2 Challenge Based Learning Experiments

CBL has been applied successfully in different experiments [10, 11, 18,19,20,21,22,23], benefiting students in several ways [24]:

  • Students achieve a deeper understanding of different topics, learn how to diagnose and define problems before proposing solutions, and how to develop their creativity.

  • Students are involved in the definition of the addressed problem and in the solution process.

  • Students become aware of the problem, develop a research process and define and implement models to address it. In order to do so, they work collaboratively in teams with peers from different disciplines.

  • Students get closer to the reality of their community and contact and establish relationships with experts who may contribute to their professional development.

  • Students strengthen the connection between what they learn in school and what they perceive of the world around them.

  • Students develop high-level communication skills through the use of social tools and media production techniques, with which they can create and share the solutions they have developed.

However, previous experiments have also shown several limitations in the application of CBL:

  • Global projects are often far removed from the specific contents of academic subjects [20].

  • Traditional assessment systems can be a problem for students, because they may be more focused on assessments than on learning [25].

  • Most CBL experiments cannot be easily associated with a specific subject in academic contexts; they tend to be applied to subjects specifically designed for CBL or to master’s projects [22].

  • Students’ perception of this approach is not clear, because not all the experiments include indicators to evaluate it [23].

  • The participation of people with different roles may cause difficulties for students, who have to adapt their way of working to this situation [20].

  • The results of the global projects are typically obtained when the academic year has finished [18].

  • There is a wide choice of tools to use in CBL experiences, so evaluation is not easy [24].

This paper explores this last limitation. The CBL methodology is supported by technology, which helps students develop their projects, contact experts, publish their results and maintain their level of engagement throughout the project [10]. From the review of the previously mentioned CBL experiments, it is possible to see that different tools can be employed. These can be classified as:

  • Tools for information access. One of the requirements in most of the reviewed experiments is that students should be able to access information on the Internet both in the classroom and at home [10, 11].

  • Tools for editing and publishing contents. In several of the analyzed experiments students have to generate outcomes such as video or audio, HTML resources [10, 11], PowerPoint documents [21], etc. This means that students need tools that facilitate the production of these kinds of outcomes.

  • Tools for publishing evidence of what was done. This is a possible way to facilitate the formative assessment of students’ outcomes [10, 24], which is necessary to know what each group has been doing, so tools that facilitate tracking the results achieved are needed. There are several ways to do this: an ePortfolio may be used so students can publish their partial outcomes [24]; meetings can be held with each group at several moments of the development of the challenge [10]; the work can be described in a wiki [18]; etc.

  • Tools to facilitate the collaboration and communication of the stakeholders involved in the challenges. A shared working space is helpful for a successful challenge. The workspace should be available to students 24/7, include the needed resources, provide access to the activities and a calendar, and serve as a communication channel with the teacher and between team members. There is a wide variety of Web 2.0 resources available for project management and collaboration [11, 22, 24].

  • Dedicated tools for specific fields. Some challenges are applied in very specific fields and require students and teachers to use ad-hoc tools, for instance an earthquake simulator or a robot. This means that they should have access to tools that help them carry out such specific work.

  • Learning environments or ecosystems. It is possible to group the above-mentioned tools in a single platform, which facilitates students’ access to the tools and teachers’ assessment of learning evidence [18, 26,27,28].

3 Possible Applications of Learning Analytics Tools to CBL Experiences

Given the previous classification of tools applied in CBL experiments, and taking into account the information that can be stored and how it can be accessed, different possible LA tools could be considered.

Regarding the tools for accessing information, there are many options. Students can use web browsers or other specific tools to access information; they can also read tweets, blogs, forums, research papers, etc. There are tools that record these actions; however, students may access information not only to learn but also with other aims, which makes the application of LA tools in this case difficult. Tools such as Google Analytics can track students’ interactions with a specific website [29], which may not be enough in a CBL experiment where students use many different websites. It is also possible to analyze students’ navigation through the browser [30, 31], but it is not easy to know who is navigating and whether the contents are being accessed for learning purposes. Social networks can also be analyzed [32, 33], but again an authentication process is required, as well as knowing which social networks are used and with which aim. Taking this into account, it is clear that tracking students’ information access can be very valuable, but in order to know how these activities impact learning processes it is necessary to channel them through a single platform (this is described below).
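
If navigation data are channelled through a single access point (for instance a proxy or a browser-history export), a first aggregation is straightforward. The following sketch assumes a hypothetical CSV export with the columns student, url and timestamp; it is only an illustration of the kind of tracking discussed above, not a tool used in the reviewed experiments.

```python
# Minimal sketch: aggregate information-access events per student and per site.
# The CSV layout (student, url, timestamp) is a hypothetical export format.
import csv
from collections import Counter, defaultdict
from urllib.parse import urlparse

visits_per_student = Counter()
sites_per_student = defaultdict(Counter)

with open("access_log.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        visits_per_student[row["student"]] += 1
        sites_per_student[row["student"]][urlparse(row["url"]).netloc] += 1

for student, visits in visits_per_student.most_common():
    top_site, _ = sites_per_student[student].most_common(1)[0]
    print(f"{student}: {visits} visits, most visited site: {top_site}")
```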

It is also possible to analyze what students do when editing contents if they use a centralized online editing tool, a content repository [34] or a version control system [35], because these systems provide monitoring capabilities. Depending on the tool applied, it is possible to see what each student has been doing during a specific challenge (version control system, repository) or only the final outcome.
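
As an illustration of these monitoring capabilities, the sketch below counts the contributions of each team member when the challenge artefacts are kept under version control; Git is used here only as an example of such a system, not as a tool prescribed by the reviewed experiments.

```python
# Minimal sketch: count commits per author in a team's Git repository,
# assuming the challenge outcomes are kept under version control with Git.
import subprocess
from collections import Counter

def commits_per_author(repo_path: str) -> Counter:
    """Return how many commits each author has made in the given repository."""
    log = subprocess.run(
        ["git", "-C", repo_path, "log", "--pretty=format:%an"],
        capture_output=True, text=True, check=True,
    )
    return Counter(line for line in log.stdout.splitlines() if line)

if __name__ == "__main__":
    for author, commits in commits_per_author(".").most_common():
        print(f"{author}: {commits} commits")
```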

Regarding the tools for publishing outcomes, it is possible to apply different LA techniques, but the problem is again the variety of tools that students can choose for this purpose. If there is no centralized publication/learning tool, students may use a forum, a blog, a wiki, a social network, a web page, etc. In these cases, the best option is to define a specific tool in which to publish the outcomes and to apply monitoring tools to it. In this way it is possible to analyze students’ interactions (when they uploaded their work, its size, the number of files, the number of attempts) or even the uploaded contents themselves (by applying, for instance, text mining techniques [36]).
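
As a simple illustration of such text mining, the sketch below extracts the most characteristic terms of each published outcome using a TF-IDF weighting. It assumes that the uploaded contents have been exported as plain-text files to a folder named outcomes and that scikit-learn is available; both are illustrative assumptions, not part of the reviewed experiments.

```python
# Minimal sketch: surface the most characteristic terms of each published
# outcome with TF-IDF. The "outcomes" folder of plain-text exports and the
# use of scikit-learn are illustrative assumptions.
from pathlib import Path
from sklearn.feature_extraction.text import TfidfVectorizer

docs = {p.stem: p.read_text(encoding="utf-8") for p in Path("outcomes").glob("*.txt")}

vectorizer = TfidfVectorizer(max_features=1000, stop_words="english")
tfidf = vectorizer.fit_transform(docs.values())
terms = vectorizer.get_feature_names_out()

# Print the five highest-weighted terms of each team's outcome.
for i, name in enumerate(docs):
    weights = tfidf[i].toarray().ravel()
    top = weights.argsort()[::-1][:5]
    print(name, [terms[j] for j in top])
```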

The evaluation of students’ interaction is easier in collaboration tools. However, the problem is again the same as in the previous cases: there are many tools for collaborating with peers. Most collaboration platforms provide tracking systems and dashboards. For instance, Moodle provides forums, chats and messages to facilitate student interaction, and the Moodle analytics components facilitate tracking these activities [13]. However, sometimes it is necessary to explore specific issues, which requires the definition of ad-hoc LA tools [37, 38]. It would be desirable for all students to use the same tools to collaborate in CBL projects.
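
For instance, when collaboration happens in Moodle forums, a basic level of tracking can be obtained from the platform logs. The sketch below assumes that the standard Moodle log report has been exported to CSV; the column names used (“User full name”, “Component”, “Event name”) follow recent Moodle versions and may differ in other installations.

```python
# Minimal sketch: count forum activity per student from an exported Moodle
# log report. Column names ("User full name", "Component", "Event name")
# are assumptions based on recent Moodle versions and may vary.
import csv
from collections import Counter

posts = Counter()
with open("moodle_log.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if row.get("Component") == "Forum" and "post" in row.get("Event name", "").lower():
            posts[row["User full name"]] += 1

for student, count in posts.most_common():
    print(f"{student}: {count} forum post events")
```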

The analysis of learning evidence when tools for a specific context are applied could also require the development of specific LA tools. For instance, if students use an ad-hoc simulator or a game [19], this tool would need to include a monitoring system. However, it should be noted that not all the activity that students carry out in a challenge can be easily monitored, especially when ad-hoc tools are involved. For instance, if students use Arduino kits during a challenge, it might be possible to record the building process with video cameras and take this into account when evaluating the final result, but the application of an LA tool in this case would be difficult.

Last but not least, it is necessary to take into account learning platforms that group a set of tools that can be applied during challenges. In this case, if well-known platforms are used, it is easy to find different learning analytics tools with different goals. For instance, if Moodle is used in a CBL experiment, the analytics component can provide general information, the GISMO component can show the frequency of access to certain contents or activities, the Moodle engagement module can detect possible dropout, and Gephi or the VeLA tool can be used to explore how students interact [13, 38,39,40]. When using a centralized environment, it is possible to know who uses a tool and how, but students cannot choose the tools they want to use in the project.
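
As an example of the kind of interaction analysis that tools such as Gephi support, the sketch below builds a who-replies-to-whom graph from forum data and exports it in GEXF format. The CSV layout (author, parent_author) is a hypothetical export, not a standard Moodle report, and the networkx library is assumed to be available.

```python
# Minimal sketch: build a who-replies-to-whom graph from forum posts and
# export it for inspection in Gephi. The CSV layout (author, parent_author)
# is a hypothetical export; networkx is assumed to be available.
import csv
import networkx as nx

G = nx.DiGraph()
with open("forum_replies.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if not row["parent_author"]:
            continue  # skip thread-opening posts, which reply to nobody
        a, b = row["author"], row["parent_author"]
        if G.has_edge(a, b):
            G[a][b]["weight"] += 1
        else:
            G.add_edge(a, b, weight=1)

nx.write_gexf(G, "interactions.gexf")  # this file can be opened in Gephi
```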

The following section describes a CBL experiment that uses, among other tools, Moodle as its learning platform.

4 Application of a LA Tool in a CBL Experiment

This section describes how a LA tool is applied in a CBL experiment carried out in a Spanish university. The first subsection describes how CBL is adapted and implemented; after this, the tools employed are described; and finally, the results obtained are presented.

4.1 How CBL Is Applied

This experiment applies a model based on the integration of CBL and Challenge Based Instruction (CBI) [18]. The model is implemented in 5 main stages:

  • First stage consists of: (1a) the presentation of the model to explain where it is going to be applied and show results of previous experiments; (1b) the definition of the teams that will address a challenge; (1c) the description of general ideas and essential questions and the definition of the challenge; and (1d) the access to solutions defined in previous experiments.

  • Second stage consists of: (2a) the development of activities to deal with the project (activities related to teamwork competence) such as: map of responsibilities, scheduling, working rules, etc.; and (2b) the access to examples of this kind of activities carried out in previous experiments.

  • Third stage consists of: (3a) the execution of the work: doing research, working with external agents and handling technology (wikis, on-line storage, eLearning systems and editing and publishing videos); and (3b) the access to examples of the execution in previous experiments.

  • Fourth stage consists of: (4a) the completion of the service or product, usually published in a wiki, blog, social network or web page; (4b) the organization of the documentation used; and (4c) the production of videos.

  • Fifth stage consists of: (5a) the classification of existing repository resources and (5b) the aggregation of new ones to the repository so they can be used in future experiments.

In parallel with the model stages, formative and summative assessments are carried out.

Given this model, an experiment was carried out with 169 of the 183 students enrolled in “Computer Science and Programming” of the Engineering of Energy Degree of the Technical University of Madrid. 28 teams were formed, with an average of six members per team. Each team chose a challenge in one of four areas: academic life, learning, professional opportunities and knowledge about the degree. Each challenge aimed to improve the subject or the university context where it was developed [18].

The course duration is 60 h; 10 of them (distributed in 5 sessions) were devoted to the application of the CBL&I model. During these sessions, the stages described above were developed, including a formative assessment of the partial results carried out in stages 3, 4 and 5.

A week after the last session, the teaching staff carried out a summative assessment that took into account the individual involvement of each team member, the results obtained and how they were developed. This assessment was carried out by applying the CTMTC teamwork methodology [41], supported by a LA system that allows individual tracking of team members’ work [42, 43].

4.2 Tools Employed for the Implementation

During the CBL experiment, three tools were used to address the challenge and define the solution, and one was used to assess how each member developed the teamwork competence while solving the challenge.

Regarding the tools used to address the problem, the main platform is the Moodle LMS. This LMS was chosen because it is very popular; because it includes many learning applications that can be used for collaboration and the publication of contents; and because it is the platform used by the university where the experiment was carried out. The following Moodle tools were used:

  • The authentication system. Each student must have an associated user account in Moodle and must be authenticated before using the collaboration or publication tools. In this way, all students’ activity is recorded and stored by Moodle so that it can be analyzed later.

  • Moodle Forums and Chats. These tools enable synchronous and asynchronous communication between the members of each team. Also in this case, all interaction is stored so that it can be analyzed later. Figure 1 shows the forum of one of the groups with all its threads (personal information has been anonymized).

    Fig. 1. Forum threads for group GIE2-10

  • Moodle Wikis. This tool was used to publish the partial results of the activities carried out, especially those related to the acquisition of the teamwork competence. An example of a wiki is shown in Fig. 2. It presents the structure of the different pages that a team defined in Moodle to demonstrate how they developed the necessary phases to address a project as a team.

    Fig. 2. Moodle Wiki for group GIE1-15

For this CBL&I model a key issue is that students can access the results of previous experiments and can also classify and store their own outcomes. In order to do so, a repository is used: the Collaborative Academic Resource Finder (Buscador de Recursos Académicos Colaborativos in Spanish, BRACO) [44]. It consists of a Knowledge Management System (to which faculty and students can add content), an adaptive search engine (used by students and teachers to locate and identify resources) and a set of specific subsystems designed to support various academic activities. With this repository, each user can have her own distribution of contents and can choose the results shown. In addition, users can generate a portfolio with a selection of resources obtained during the search. Faculty can also organize the search outcomes as a list on a personalized webpage that students can see [34]. Figure 3 shows a search made in the BRACO repository (http://www.e-braco.net/), which allows looking for contents by several criteria such as source, subject, area, theme, author, etc.

Fig. 3. BRACO educational resources repository

In order to carry out the summative assessment, it is necessary to take into account not only the final solution of the challenge but also the interaction between the stakeholders that takes place during the development of that solution. The final results can be easily reviewed because they are available in a final deliverable, on the Internet, in the repository and/or in other online applications. However, the evaluation of the interactions is harder. Taking into account only the interactions related to teamwork development, the analysis of the posts, threads and logs for all the students in a group can last between 40 min and 1 h [37], without including the assessment itself. If the time needed for the evaluation is also considered, the estimated time per group is around 3 h and 45 min [45]. For a subject with 8 students and 2 groups this is not critical; however, if the project, as in this case, has 28 groups, the work would last around 107 h. In order to solve this, an ad-hoc Learning Analytics tool was developed and applied during the assessment. The tool shows the number of messages per forum, group or thread, as well as the participation of each student, taking into account the number of short and long messages.
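
A minimal sketch of the kind of aggregation such a tool performs is shown below. It is not the implementation used in the experiment: the CSV layout (group, author, message) and the 50-word threshold separating short from long messages are assumptions made only for illustration.

```python
# Minimal sketch of the aggregation performed by an ad-hoc LA tool for
# forum-based teamwork assessment. The CSV layout (group, author, message)
# and the 50-word short/long threshold are illustrative assumptions.
import csv
from collections import defaultdict

SHORT_MESSAGE_WORDS = 50  # assumed boundary between short and long messages

def summarize(path: str):
    """Count total, long and short messages per group and per student."""
    per_group = defaultdict(lambda: {"messages": 0, "long": 0, "short": 0})
    per_student = defaultdict(lambda: {"messages": 0, "long": 0, "short": 0})
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            kind = "long" if len(row["message"].split()) >= SHORT_MESSAGE_WORDS else "short"
            for stats in (per_group[row["group"]], per_student[row["author"]]):
                stats["messages"] += 1
                stats[kind] += 1
    return per_group, per_student

if __name__ == "__main__":
    groups, students = summarize("forum_posts.csv")
    for group, stats in sorted(groups.items()):
        print(group, stats["messages"], stats["long"], stats["short"])
```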

Finally, it is necessary to consider the tools used to produce the final result. Students can freely choose the tools they use for editing videos and audio and for publishing their results, but the results should be accessible to all the involved stakeholders. A sample of these results is available at the following link: http://energytub.wixsite.com/energytub.

4.3 Results Obtained from the Application of the LA Tool

This section presents some of the results obtained from the application of the CBL&I model, focusing especially on the results shown by the LA tool.

Twenty-eight groups were involved in the project; 24 of them were able to implement a real solution to the challenge, while 4 failed because they did not carry out the tasks properly and because of a mismanagement of teamwork.

Regarding the interaction between team members, there were a total of 4684 messages for the 169 students, that is, an average of 27.71 messages per user. Previous research [42, 46] has shown that a higher number of interactions is related to better performance. Table 1 presents a summary of the interactions of each group and Fig. 4 shows a screenshot of the LA tool with the specific information for group GIE1-15.

Table 1. Distribution of messages by group
Fig. 4. Results of the LA tool for GIE1-15

The groups are organized into the classes GIE1 and GIE2 and students could choose one of the available groups, which is why some group numbers were not used. The first column shows the group name, the second the number of messages, the third the number of long messages and the fourth the number of short messages. These last two columns give the teacher some insight into the quality of the interactions: if most of the messages in a group are short, the interaction is more assertive and there is no real discussion. In the table, it is possible to see that the groups that failed to define a solution for the challenge were those with a lower number of interactions: GIE1-03, GIE1-12, GIE1-14 and GIE2-01. However, it is not possible to make general assumptions in this sense, because the interactions are not the only issue evaluated during the challenge. Nevertheless, with this kind of tool it is possible to gauge the level of participation and engagement of each member of the teams.

5 Conclusions

CBL is an approach to teaching and learning that allows students to use the technology of their daily lives to solve real problems. This type of initiative benefits students in different ways, bringing them closer to the real world and helping them acquire competences such as teamwork and communication skills.

The assessment of CBL should take into account both the final results and the partial outcomes generated by the team members. Evaluating only the final results would mean ignoring other important issues, such as the interaction between team members or with other stakeholders involved in the development of a solution to the challenge. The evaluation of this kind of interaction is usually difficult because it involves analyzing a great amount of information, which requires a lot of time. In order to facilitate this analysis, Learning Analytics tools can be applied. However, the question is whether this kind of tool can be applied in CBL approaches, because CBL does not follow the typical structure of online learning courses and does not use the usual institutional tools.

In this paper, an analysis of the tools used in CBL is carried out. Taking these tools into account, it is possible to assert that LA can be applied in CBL, although the way to do this and the performance of the LA techniques will depend on the tools chosen to develop the challenge. The analysis also shows that it would be desirable to use a learning platform that groups the tools used by the students to develop the challenge. In this way, learning evidence can be easily recorded in a common place with a defined data structure, which facilitates further analysis.

As future work, it would be desirable to replicate the experiment in other contexts, with other learning platforms and learning analytics tools, in order to support the conclusions obtained. In addition, it would be interesting to apply learning analytics tools during other stages of the development of the challenge solution, for instance during the production of the final results.