
Factors that affect the success of learning analytics dashboards

  • Development Article
  • Published in Educational Technology Research and Development

A Correction to this article was published on 13 August 2019


Abstract

A learning analytics dashboard enables teachers and students to monitor and reflect on their online teaching and learning patterns. This study reviewed prior research on learning analytics dashboards to demonstrate the need for an instrument that measures dashboard success. An early version of the instrument, based on the framework of Kirkpatrick’s four levels of evaluation, was revised through expert reviews and exploratory factor analysis. The instrument contains five criteria: visual attraction, usability, level of understanding, perceived usefulness, and behavioral changes. The validity of the instrument was subsequently tested with confirmatory factor analysis. A total of 271 responses from students who used a learning analytics dashboard for one semester were collected and analyzed using structural equation modeling. In the resulting model, which showed fair fit, the visual attraction and usability of the dashboard significantly affected the level of understanding; the level of understanding affected perceived usefulness, which in turn significantly affected potential behavioral changes. The findings have implications for designers who want to develop successful learning analytics dashboards, and further research on the cross-validity of the evaluation instrument is suggested to broaden its use.
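The structural model described in the abstract (visual attraction and usability predicting the level of understanding, understanding predicting perceived usefulness, and perceived usefulness predicting behavioral changes) can be written as a small path model. The sketch below is illustrative only and is not the authors’ analysis code; it uses the open-source semopy package on synthetic composite scores, and all variable names are chosen here purely for illustration.

```python
# Illustrative sketch of the path structure described in the abstract (not the
# authors' code). Synthetic composite scores stand in for the survey constructs.
import numpy as np
import pandas as pd
import semopy

rng = np.random.default_rng(0)
n = 271  # sample size reported in the study

# Synthetic per-construct composite scores (illustration only).
visual = rng.normal(size=n)
usability = 0.5 * visual + rng.normal(scale=0.8, size=n)
understanding = 0.4 * visual + 0.3 * usability + rng.normal(scale=0.7, size=n)
usefulness = 0.6 * understanding + rng.normal(scale=0.7, size=n)
behavior = 0.5 * usefulness + rng.normal(scale=0.7, size=n)
data = pd.DataFrame({"visual": visual, "usability": usability,
                     "understanding": understanding,
                     "usefulness": usefulness, "behavior": behavior})

# Structural paths: visual attraction and usability -> understanding ->
# perceived usefulness -> behavioral changes.
desc = """
understanding ~ visual + usability
usefulness ~ understanding
behavior ~ usefulness
"""

model = semopy.Model(desc)
model.fit(data)
print(model.inspect())           # path coefficients and p values
print(semopy.calc_stats(model))  # fit indices, including RMSEA
```

The fit indices reported by `calc_stats` (including RMSEA) are the kind of values against which the Browne and Cudeck guidelines quoted in the Notes would be applied.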


Change history

  • 13 August 2019

    The Funding information provided in this article as published stands in need of correction. The correct information is: “This study was supported by research fund from Honam University, 2017”. Also note the current correct affiliation for author Yeonjeong Park: Department of Early Childhood Education, College of Humanities and Social Sciences, Honam University, Gwangju, South Korea

Notes

  1. Browne and Cudeck (1993) suggested guidelines for interpreting RMSEA: values in the range of .00 to .05 indicate close fit, those in the range between .05 and .08 indicate fair fit, and those between .08 and .10 indicate mediocre fit.
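As a small illustration of these cut-offs (not part of the original article), a helper that maps an RMSEA value onto Browne and Cudeck’s categories might look like this:

```python
def interpret_rmsea(rmsea: float) -> str:
    """Classify an RMSEA value using Browne and Cudeck's (1993) guidelines."""
    if rmsea <= 0.05:
        return "close fit"
    elif rmsea <= 0.08:
        return "fair fit"
    elif rmsea <= 0.10:
        return "mediocre fit"
    return "poor fit"  # values above .10 fall outside the quoted ranges

print(interpret_rmsea(0.07))  # -> "fair fit", the category reported for the study's model
```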

References

  • Ali, L., Hatala, M., Gašević, D., & Jovanović, J. (2012). A qualitative evaluation of evolution of a learning analytics tool. Computers & Education, 58(1), 470–489.

  • Arnold, K. E., & Pistilli, M. D. (2012). Course signals at Purdue: Using learning analytics to increase student success. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge.

  • Bangor, A., Kortum, P. T., & Miller, J. T. (2008). An empirical evaluation of the system usability scale. International Journal of Human–Computer Interaction, 24(6), 574–594.

  • Bodily, R., & Verbert, K. (2017). Review of research on student-facing learning analytics dashboards and educational recommender systems. IEEE Transactions on Learning Technologies, 10(4), 405–418.

  • Brill, J., & Park, Y. (2011). Evaluating online tutorials for university faculty, staff, and students: The contribution of just-in-time online resources to learning and performance. International Journal on E-Learning, 10(1), 5–26.

  • Browne, M., & Cudeck, R. (1993). Alternative ways of assessing model fit. In K. A. Bollen & J. S. Long (Eds.), Testing structural equation models (pp. 136–162). Newbury Park: Sage.

  • Butler, D. L., & Winne, P. H. (1995). Feedback and self-regulated learning: A theoretical synthesis. Review of Educational Research, 65(3), 245–281.

  • Creswell, J. W. (2003). Research design. Thousand Oaks: Sage.

  • Daniel, B. (2016). Big data and learning analytics in higher education. New York: Springer.

  • Dawson, S., Bakharia, A., & Heathcote, E. (2010). SNAPP: Realising the affordances of real-time SNA within networked learning environments. In Proceedings of the 7th International Conference on Networked Learning (pp. 125–133).

  • Dick, W., Carey, L., & Carey, J. O. (2005). The systematic design of instruction (6th ed.). Boston: Allyn and Bacon.

  • Eckerson, W. W. (2010). Performance dashboards: Measuring, monitoring, and managing your business (2nd ed.). New York: Wiley.

  • Endsley, M. R. (2012). Designing for situation awareness: An approach to user-centered design. Boca Raton: CRC Press.

  • Essa, A., & Ayad, H. (2012). Student success system: Risk analytics and data visualization using ensembles of predictive models. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge.

  • Few, S. (2006). Information dashboard design. Newton: O’Reilly.

  • Few, S. (2009). Now you see it: Simple visualization techniques for quantitative analysis. Burlingame: Analytics Press.

  • Few, S. (2012). Show me the numbers: Designing tables and graphs to enlighten. Burlingame: Analytics Press.

  • Few, S. (2013). Information dashboard design: Displaying data for at-a-glance monitoring (2nd ed.). Burlingame: Analytics Press.

  • Govaerts, S., Verbert, K., Duval, E., & Pardo, A. (2012). The student activity meter for awareness and self-reflection. In CHI ’12 Extended Abstracts on Human Factors in Computing Systems.

  • Gustafson, K. L., & Branch, R. M. (2002). Survey of instructional development models (4th ed.). New York: ERIC Clearinghouse on Information and Technology.

  • Hair, J. F., Black, W. C., Babin, B. J., & Anderson, R. E. (2010). Multivariate data analysis: A global perspective (7th ed.). Upper Saddle River: Pearson Education.

  • Horton, W. (2001). Evaluating e-learning. Alexandria: American Society for Training & Development (ASTD).

  • Jo, I., & Kim, J. (2013a). Investigation of statistically significant period for achievement prediction model in e-learning. Journal of Educational Technology, 29(2), 285–306.

  • Jo, I., & Kim, Y. (2013b). Impact of learner's time management strategies on achievement in an e-learning environment: A learning analytics approach. Journal of Educational Information and Media, 19(1), 83–107.

  • Jo, I. H., Kim, D., & Yoon, M. (2014). Analyzing the log patterns of adult learners in LMS using learning analytics. In Proceedings of the Fourth International Conference on Learning Analytics and Knowledge (pp. 183–187). ACM.

  • Kahneman, D. (2011). Thinking, fast and slow. London: Macmillan.

  • Kirkpatrick, D. L., & Kirkpatrick, J. D. (2006). Evaluating training programs: The four levels (3rd ed.). San Francisco: Berrett-Koehler.

  • Krumm, A. E., Waddington, R. J., Teasley, S. D., & Lonn, S. (2014). A learning management system-based early warning system for academic advising in undergraduate engineering. In Learning analytics (pp. 103–119). New York: Springer.

  • Lambropoulos, N., Faulkner, X., & Culwin, F. (2012). Supporting social awareness in collaborative e-learning. British Journal of Educational Technology, 43(2), 295–306.

  • Ledden, L., Kalafatis, S. P., & Samouel, P. (2007). The relationship between personal values and perceived value of education. Journal of Business Research, 60(9), 965–974.

  • Leony, D., Pardo, A., de la Fuente Valentín, L., de Castro, D. S., & Kloos, C. D. (2012). GLASS: A learning analytics visualization tool. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (pp. 162–163). ACM.

  • Mavroudi, A., Giannakos, M., & Krogstie, J. (2018). Supporting adaptive learning pathways through the use of learning analytics: Developments, challenges and future opportunities. Interactive Learning Environments, 26(2), 206–220.

  • Mazza, R., & Milani, C. (2004). GISMO: A graphical interactive student monitoring tool for course management systems. Paper presented at the Technology Enhanced Learning Conference, Milan.

  • Netemeyer, R. G., Bearden, W. O., & Sharma, S. (2003). Scaling procedures: Issues and applications. London: Sage.

  • Papamitsiou, Z., & Economides, A. A. (2016). Learning analytics for smart learning environments: A meta-analysis of empirical research results from 2009 to 2015. In Learning, design, and technology. https://doi.org/10.1007/978-3-319-17727-4_15-1.

  • Park, Y., & Jo, I. (2015). Development of learning analytics dashboard to support students’ learning performance. Journal of Universal Computer Science, 21(1), 110–133.

  • Pedhazur, E. J., & Schmelkin, L. P. (2013). Measurement, design, and analysis: An integrated approach. London: Psychology Press.

  • Podgorelec, V., & Kuhar, S. (2011). Taking advantage of education data: Advanced data analysis and reporting in virtual learning environments. Electronics and Electrical Engineering, 114(8), 111–116.

  • Reeves, T. C., Benson, L., Elliott, D., Grant, M., Holschuh, D., Kim, B., … Loh, S. (2002). Usability and instructional design heuristics for e-learning evaluation.

  • Romero, C., Espejo, P. G., Zafra, A., Romero, J. R., & Ventura, S. (2013). Web usage mining for predicting final marks of students that use Moodle courses. Computer Applications in Engineering Education, 21(1), 135–146.

  • Santos, J. L., Govaerts, S., Verbert, K., & Duval, E. (2012). Goal-oriented visualizations of activity tracking: A case study with engineering students. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge.

  • Santos, J. L., Verbert, K., Govaerts, S., & Duval, E. (2013). Addressing learner issues with StepUp!: An evaluation. In Proceedings of the Third International Conference on Learning Analytics and Knowledge.

  • Scheuer, O., & Zinn, C. (2007). How did the e-learning session go? The Student Inspector. Frontiers in Artificial Intelligence and Applications, 158, 487.

  • Schumacher, C., & Ifenthaler, D. (2018). Features students really expect from learning analytics. Computers in Human Behavior, 78, 397–407.

  • Sedrakyan, G., Malmberg, J., Verbert, K., Järvelä, S., & Kirschner, P. A. (2018). Linking learning behavior analytics and learning science concepts: Designing a learning analytics dashboard for feedback to support learning regulation. Computers in Human Behavior. https://doi.org/10.1016/j.chb.2018.05.004.

  • Thaler, R. H., & Sunstein, C. R. (2008). Nudge: Improving decisions about health, wealth, and happiness. New Haven: Yale University Press.

  • Upton, K., & Kay, J. (2009). Narcissus: Group and individual models to support small group work. In F. Ricci (Ed.), User modeling, adaptation, and personalization (pp. 54–65). New York: Springer.

  • Verbert, K., Duval, E., Klerkx, J., Govaerts, S., & Santos, J. L. (2013). Learning analytics dashboard applications. American Behavioral Scientist, 57(10), 1500–1509.

  • Wong, G. K. (2016). The behavioral intentions of Hong Kong primary teachers in adopting educational technology. Educational Technology Research and Development, 64(2), 313–338.

  • Yoo, Y., Lee, H., Jo, I., & Park, Y. (2014). Educational dashboards for smart learning: Review of case studies. Paper presented at the International Conference on Smart Learning Environments 2014, Hong Kong.

  • Yu, T., & Jo, I. H. (2014). Educational technology approach toward learning analytics: Relationship between student online behavior and learning performance in higher education. In Proceedings of the Fourth International Conference on Learning Analytics and Knowledge (pp. 269–270). ACM.

  • Zuboff, S. (1988). In the age of the smart machine: The future of work and power. New York: Basic Books.


Funding

This work was supported by the Ministry of Education of the Republic of Korea and the National Research Foundation of Korea (NRF-2015S1A5B6036244).

Author information


Corresponding author

Correspondence to Il-Hyun Jo.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendices

Appendix 1. The request for expert review

Attachment 1: A draft instrument for evaluating a learning analytics dashboard

For the following criteria, indexes, and items, please indicate the level of importance of each item in consideration of external validity. Also, evaluate _____ based on your opinion and experience after investigating the tool’s usability and effectiveness. Please select the number that best represents your view.

Each item was to be rated on two scales: “Level of importance” and “Evaluation of OOO”. For items 21–30, “s” indicates an item for students and “t” indicates an item for teachers.

| Criteria | Index | Item |
|---|---|---|
| 1. Reaction | Goal-orientation | 1. The dashboard identifies goals that present the specific information |
| | | 2. The dashboard helps users monitor goal-related activities |
| | Information usefulness | 3. The dashboard displays the information that users want to know |
| | | 4. The dashboard includes essential information only |
| | Visual effectiveness | 5. The dashboard consists of visual elements |
| | | 6. The dashboard fits on a single computer screen |
| | | 7. The dashboard presents visual information that users can scan at a glance |
| | | 8. Visual elements in the dashboard are arranged in a way for rapid perception |
| | Appropriateness of visual representation | 9. The dashboard includes proper graphic representations |
| | | 10. Graphs in the dashboard appropriately represent the scales and units |
| | | 11. The dashboard delivers information in a concise, direct, and clear manner |
| | | 12. The dashboard uses appropriate pre-attentive attributes such as form, color, spatial position, and motion |
| | | 13. The dashboard displays information correctly on both desktop computers and mobile devices |
| | User friendliness | 14. The dashboard is easy to access |
| | | 15. The dashboard is customized to users’ contexts |
| | | 16. The dashboard has intuitive interfaces and menus that are easy to use |
| | | 17. The dashboard allows users to explore further information that is embedded or hidden on a single page |
| 2. Learning | Understanding | 18. A user understands what the visual information in the dashboard implies |
| | | 19. A user understands what the statistical information in the dashboard implies |
| | | 20. A user is able to compare students’ status or positions in relation to the overall activity pattern |
| | Reflection | 21s. A user monitors his/her own learning process consistently based on the information in the dashboard |
| | | 21t. A user monitors students’ learning processes consistently based on the information in the dashboard |
| | | 22s. A user projects the information in the dashboard that is related to his/her learning activities |
| | | 22t. A user projects the information in the dashboard that is related to his/her teaching activities |
| 3. Behavior | Motivation increase | 23s. A user is motivated to engage in learning as he/she reviews the dashboard |
| | | 23t. A user is motivated to engage in studying his/her teaching approach as he/she reviews the dashboard |
| | | 24s. A user makes plans for his/her own learning based on the information in the dashboard |
| | | 24t. A user makes plans for his/her teaching and student management based on the information in the dashboard |
| | Behavioral change | 25s. A user manages his/her learning activities based on the dashboard |
| | | 25t. A user manages his/her teaching activities based on the dashboard |
| | | 26s. A user makes changes in learning patterns as he/she monitors the information in the dashboard |
| | | 26t. A user makes changes in teaching interventions as he/she monitors the information in the dashboard |
| 4. Result | Performance improvement | 27s. The dashboard helps users achieve their learning goals |
| | | 27t. The dashboard helps users achieve their instructional goals |
| | | 28s. The dashboard enhances users’ academic achievement |
| | | 28t. The dashboard enhances users’ teaching performance |
| | Competency development | 29. The dashboard enhances users’ self-management skills |
| | | 30s. The dashboard enhances users’ social values and networking competency |
| | | 30t. The dashboard enhances users’ teaching skills and their skills for facilitating students’ learning |

Attachment 2: Questions for expert review

  1. Do you think the evaluation instrument, with its 30 items, is well developed to measure the usability and effectiveness of educational dashboards? If not, please identify the problematic items and suggest how to revise them.

  2. Do you think more items should be included in this evaluation instrument? If so, please identify them and give your reasons.

  3. Do you think any items should be removed from this evaluation instrument? If so, please identify them and give your reasons.

  4. Please provide your overall thoughts about this evaluation instrument. Your suggestions will be very helpful for improving its quality.

Thank you so much for your time and valuable comments!

Appendix 2. Questions for evaluating the learning analytics dashboard and factor loadings

| Item | Visual attraction | Usability | Understanding | Perceived usefulness | Behavioral changes |
|---|---|---|---|---|---|
| Q10. The additional line graphs in the dashboard were proper for scanning the information at a glance | .913 | | | | |
| Q8. The scatter graph in the dashboard was proper for scanning the information at a glance | .869 | | | | |
| Q9. The histograms in the dashboard were proper for scanning the information at a glance | .833 | | | | |
| Q12. The dashboard delivers information in a concise manner | .612 | | | | |
| Q11. The graphs in the dashboard appropriately represent the scales and units | .611 | | | | |
| Q5. The dashboard delivered visual elements effectively | .501 | | | | |
| Q6. The dashboard fits on my computer screen | .462 | | | | |
| Q15. Interfaces in the dashboard are intuitive | | .775 | | | |
| Q14. The functions of the dashboard were easily detected | | .679 | | | |
| Q16. The dashboard allowed me to explore more information that was embedded or hidden on a single page (e.g., help, tips) | | .578 | | | |
| Q7. The dashboard fits on my mobile device | | .469 | | | |
| Q19. I understood what the statistical information in the dashboard implies | | | .908 | | |
| Q18. I understood what the visual information in the dashboard implies | | | .871 | | |
| Q17. I understood my status immediately through the dashboard | | | .546 | | |
| Q20. I was easily able to compare my status or position in relation to the overall activity pattern of the class | | | .441 | | |
| Q3. The information in the dashboard was what I wanted to know | | | | .802 | |
| Q4. The dashboard included essential information only | | | | .586 | |
| Q2. The dashboard helped me monitor goal-related activities | | | | .571 | |
| Q1. The dashboard identified goals that present the specific information | | | | .473 | |
| Q29. The dashboard helped me change my time-management strategies not only for this class but also for other classes and my daily life | | | | | −.936 |
| Q26. I changed my learning patterns or habits through the dashboard | | | | | −.895 |
| Q30. The dashboard improved my general learning capacity | | | | | −.892 |
| Q28. The dashboard helped me achieve my academic goal | | | | | −.879 |
| Q25. I logged in to the virtual classroom more frequently by checking the dashboard | | | | | −.860 |
| Q27. The dashboard helped me to achieve my learning goal | | | | | −.849 |
| Q24. I made plans for my own learning based on the information in the dashboard | | | | | −.741 |
| Q23. I was motivated to engage in learning as I reviewed the dashboard | | | | | −.697 |
| Q22. I was able to plan my own learning based on the information in the dashboard | | | | | −.642 |
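The pattern matrix above reports rotated loadings from the exploratory factor analysis; the uniformly negative loadings on the behavioral-changes factor reflect only the arbitrary orientation of that factor, not negative relationships. A minimal sketch of how a comparable five-factor solution could be extracted with the open-source factor_analyzer package (an assumed workflow, not necessarily the software used in the study) is shown below; the placeholder data are random and merely stand in for the students’ Q1–Q30 responses.

```python
# Illustrative sketch (assumed workflow, not the authors' code): extracting a
# five-factor solution with an oblique rotation from 30 Likert-scale items.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer

# Placeholder responses; in the study these would be the 271 students' answers to Q1-Q30.
rng = np.random.default_rng(0)
responses = pd.DataFrame(rng.integers(1, 6, size=(271, 30)),
                         columns=[f"Q{i}" for i in range(1, 31)])

fa = FactorAnalyzer(n_factors=5, rotation="oblimin", method="minres")
fa.fit(responses)

loadings = pd.DataFrame(fa.loadings_, index=responses.columns,
                        columns=["F1", "F2", "F3", "F4", "F5"])
print(loadings.round(3))  # compare with the pattern matrix above
```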

Appendix 3. Factor loading and variance extracted in the confirmatory factor analysis

| Latent variable / item | Factor loading | Standard error | t value | Standardized factor loading | Variance | AVE |
|---|---|---|---|---|---|---|
| Visual attraction | | | | | | .630 |
| Q10 | 1 | | | .876 | .767 | |
| Q8 | .984 | .057 | 17.128*** | .807 | .652 | |
| Q11 | .909 | .053 | 17.248*** | .81 | .656 | |
| Q9 | 1.014 | .049 | 2.568*** | .889 | .79 | |
| Q12 | .924 | .06 | 15.295*** | .755 | .57 | |
| Q5 | .955 | .065 | 14.783*** | .739 | .547 | |
| Q6 | .826 | .067 | 12.303*** | .653 | .427 | |
| Usability | | | | | | .491 |
| Q15 | 1 | | | .833 | .694 | |
| Q7 | .62 | .068 | 9.059*** | .569 | .324 | |
| Q14 | 1.04 | .086 | 12.118*** | .744 | .554 | |
| Q16 | .703 | .07 | 10.103*** | .628 | .394 | |
| Understanding level | | | | | | .592 |
| Q18 | 1 | | | .907 | .874 | |
| Q19 | .983 | .041 | 24.006*** | .717 | .822 | |
| Q17 | .824 | .055 | 14.91*** | .683 | .514 | |
| Q20 | .833 | .06 | 13.774*** | .752 | .467 | |
| Perceived usefulness | | | | | | .625 |
| Q3 | 1 | | | .721 | .52 | |
| Q2 | 1.103 | .114 | 9.673*** | .69 | .476 | |
| Q4 | .895 | .092 | 9.77*** | .699 | .488 | |
| Q1 | .866 | .097 | 8.886*** | .624 | .39 | |
| Behavioral changes | | | | | | .716 |
| Q28 | 1 | | | .886 | .784 | |
| Q29 | 1.019 | .047 | 21.81*** | .89 | .792 | |
| Q26 | 1.09 | .05 | 21.91*** | .892 | .795 | |
| Q27 | .997 | .047 | 21.419*** | .883 | .779 | |
| Q30 | 1.011 | .048 | 21.007*** | .876 | .767 | |
| Q25 | 1.066 | .057 | 18.559*** | .824 | .68 | |
| Q24 | .91 | .052 | 17.33*** | .795 | .632 | |
| Q22 | .915 | .057 | 15.993*** | .76 | .578 | |
| Q23 | .93 | .053 | 17.415*** | .797 | .635 | |

***p < .001
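The AVE column can be reproduced directly from the standardized loadings, since average variance extracted is the mean of the squared standardized loadings of a construct’s indicators. A short illustrative check (not from the article), using the behavioral-changes loadings reported above:

```python
# Average variance extracted (AVE) = mean of squared standardized loadings.
# Loadings below are the behavioral-changes values from the table above.
loadings = [0.886, 0.890, 0.892, 0.883, 0.876, 0.824, 0.795, 0.760, 0.797]

ave = sum(l ** 2 for l in loadings) / len(loadings)
print(f"AVE = {ave:.3f}")  # -> 0.716, matching the value reported in the table
```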


Cite this article

Park, Y., Jo, IH. Factors that affect the success of learning analytics dashboards. Education Tech Research Dev 67, 1547–1571 (2019). https://doi.org/10.1007/s11423-019-09693-0

