Factors that affect the success of learning analytics dashboards

  • Yeonjeong Park
  • Il-Hyun Jo
Development Article


A learning analytics dashboard enables teachers and students to monitor and reflect on their online teaching and learning patterns. This study reviewed prior research on learning analytics dashboards to establish the need for an instrument that measures dashboard success. An early version of the instrument, based on Kirkpatrick's four levels of evaluation, was revised through expert reviews and exploratory factor analysis. The instrument contains five criteria: visual attraction, usability, level of understanding, perceived usefulness, and behavioral changes. The validity of the instrument was subsequently tested with factor analysis. Responses from 271 students who used a learning analytics dashboard for one semester were collected and analyzed using structural equation modeling. In the model, which showed fair fit, the visual attraction and usability of the dashboard significantly affected the level of understanding; the level of understanding affected perceived usefulness, which in turn significantly affected potential behavioral changes. The findings have implications for designers who want to develop successful learning analytics dashboards, and further research is suggested on the cross-validity of the evaluation instrument to broaden its usage.
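As a rough illustration of the path structure the abstract describes (visual attraction and usability → level of understanding → perceived usefulness → behavioral change), the chain can be sketched as a series of regressions on simulated data. This is a didactic sketch only: the variable names, coefficients, and data below are assumptions invented for illustration, and the study itself used full structural equation modeling with latent constructs, not this simplified recursive regression.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 271  # sample size reported in the abstract

# Simulated standardized scores; names mirror the five criteria in the
# abstract, but the data and path coefficients are purely illustrative.
visual = rng.normal(size=n)        # visual attraction
usability = rng.normal(size=n)     # usability
understanding = 0.4 * visual + 0.3 * usability + 0.5 * rng.normal(size=n)
usefulness = 0.6 * understanding + 0.5 * rng.normal(size=n)
behavior = 0.5 * usefulness + 0.5 * rng.normal(size=n)

def path_coefs(y, *xs):
    """Ordinary least squares estimates of the direct paths into y."""
    X = np.column_stack(xs)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Estimated paths recover the simulated structure:
b_understanding = path_coefs(understanding, visual, usability)
b_usefulness = path_coefs(usefulness, understanding)
b_behavior = path_coefs(behavior, usefulness)
```

A full SEM analysis would additionally estimate measurement models for each latent construct and report fit indices; this fragment only conveys the hypothesized direction of effects.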


Keywords: Learning analytics · Dashboard · Factor analysis · Structural equation modeling



This work was supported by the Ministry of Education of the Republic of Korea and the National Research Foundation of Korea (NRF-2015S1A5B6036244).

Compliance with ethical standards

Conflict of interest

The authors declare that they have no conflict of interest.


  1. Ali, L., Hatala, M., Gašević, D., & Jovanović, J. (2012). A qualitative evaluation of evolution of a learning analytics tool. Computers & Education, 58(1), 470–489.
  2. Arnold, K. E., & Pistilli, M. D. (2012). Course signals at Purdue: Using learning analytics to increase student success. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge.
  3. Bangor, A., Kortum, P. T., & Miller, J. T. (2008). An empirical evaluation of the system usability scale. International Journal of Human–Computer Interaction, 24(6), 574–594.
  4. Bodily, R., & Verbert, K. (2017). Review of research on student-facing learning analytics dashboards and educational recommender systems. IEEE Transactions on Learning Technologies, 10(4), 405–418.
  5. Brill, J., & Park, Y. (2011). Evaluating online tutorials for university faculty, staff, and students: The contribution of just-in-time online resources to learning and performance. International Journal on E-learning, 10(1), 5–26.
  6. Browne, M., & Cudeck, R. (1993). Alternative ways of assessing model fit. In K. A. Bollen & L. S. Long (Eds.), Testing structural equation models (pp. 136–162). Newbury Park: Sage.
  7. Butler, D. L., & Winne, P. H. (1995). Feedback and self-regulated learning: A theoretical synthesis. Review of Educational Research, 65(3), 245–281.
  8. Creswell, J. W. (2003). Research design. Thousand Oaks: Sage.
  9. Daniel, B. (2016). Big data and learning analytics in higher education. New York: Springer.
  10. Dawson, S., Bakharia, A., & Heathcote, E. (2010). SNAPP: Realising the affordances of real-time SNA within networked learning environments. In Proceedings of the 7th International Conference on Networked Learning (pp. 125–133).
  11. Dick, W., Carey, L., & Carey, J. O. (2005). The systematic design of instruction (6th ed.). Boston: Allyn and Bacon.
  12. Eckerson, W. W. (2010). Performance dashboards: Measuring, monitoring, and managing your business (2nd ed.). New York: Wiley.
  13. Endsley, M. R. (2012). Designing for situation awareness: An approach to user-centered design. Boca Raton: CRC Press.
  14. Essa, A., & Ayad, H. (2012). Student success system: Risk analytics and data visualization using ensembles of predictive models. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge.
  15. Few, S. (2006). Information dashboard design. Newton: O’Reilly.
  16. Few, S. (2009). Now you see it: Simple visualization techniques for quantitative analysis. Burlingame: Analytics Press.
  17. Few, S. (2012). Show me the numbers: Designing tables and graphs to enlighten. Burlingame: Analytics Press.
  18. Few, S. (2013). Information dashboard design: Displaying data for at-a-glance monitoring (2nd ed.). Burlingame: Analytics Press.
  19. Govaerts, S., Verbert, K., Duval, E., & Pardo, A. (2012). The student activity meter for awareness and self-reflection. In CHI’12 Extended Abstracts on Human Factors in Computing Systems.
  20. Gustafson, K. L., & Branch, R. M. (2002). Survey of instructional development models (4th ed.). New York: ERIC Clearinghouse on Information and Technology.
  21. Hair, J. F., Black, W. C., Babin, B. J., & Anderson, R. E. (2010). Multivariate data analysis: A global perspective (7th ed.). Upper Saddle River: Pearson Education.
  22. Horton, W. (2001). Evaluating e-learning. Alexandria: ASTD (American Society for Training & Development).
  23. Jo, I., & Kim, J. (2013a). Investigation of statistically significant period for achievement prediction model in e-learning. Journal of Educational Technology, 29(2), 285–306.
  24. Jo, I., & Kim, Y. (2013b). Impact of learner’s time management strategies on achievement in an e-learning environment: A learning analytics approach. Journal of Educational Information and Media, 19(1), 83–107.
  25. Jo, I. H., Kim, D., & Yoon, M. (2014). Analyzing the log patterns of adult learners in LMS using learning analytics. In Proceedings of the Fourth International Conference on Learning Analytics and Knowledge (pp. 183–187). ACM.
  26. Kahneman, D. (2011). Thinking, fast and slow. London: Macmillan.
  27. Kirkpatrick, D. L., & Kirkpatrick, J. D. (2006). Evaluating training programs: The four levels (3rd ed.). San Francisco: Berrett-Koehler.
  28. Krumm, A. E., Waddington, R. J., Teasley, S. D., & Lonn, S. (2014). A learning management system-based early warning system for academic advising in undergraduate engineering. In Learning analytics (pp. 103–119). New York: Springer.
  29. Lambropoulos, N., Faulkner, X., & Culwin, F. (2012). Supporting social awareness in collaborative e-learning. British Journal of Educational Technology, 43(2), 295–306.
  30. Ledden, L., Kalafatis, S. P., & Samouel, P. (2007). The relationship between personal values and perceived value of education. Journal of Business Research, 60(9), 965–974.
  31. Leony, D., Pardo, A., de la Fuente Valentín, L., de Castro, D. S., & Kloos, C. D. (2012). GLASS: A learning analytics visualization tool. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (pp. 162–163). ACM.
  32. Mavroudi, A., Giannakos, M., & Krogstie, J. (2018). Supporting adaptive learning pathways through the use of learning analytics: Developments, challenges and future opportunities. Interactive Learning Environments, 26(2), 206–220.
  33. Mazza, R., & Milani, C. (2004). GISMO: A graphical interactive student monitoring tool for course management systems. Paper presented at the Technology Enhanced Learning Conference, Milan.
  34. Netemeyer, R. G., Bearden, W. O., & Sharma, S. (2003). Scaling procedures: Issues and applications. London: Sage.
  35. Papamitsiou, Z., & Economides, A. A. (2016). Learning analytics for smart learning environments: A meta-analysis of empirical research results from 2009 to 2015. Learning, Design, and Technology.
  36. Park, Y., & Jo, I. (2015). Development of learning analytics dashboard to support students’ learning performance. Journal of Universal Computer Science, 21(1), 110–133.
  37. Pedhazur, E. J., & Schmelkin, L. P. (2013). Measurement, design, and analysis: An integrated approach. London: Psychology Press.
  38. Podgorelec, V., & Kuhar, S. (2011). Taking advantage of education data: Advanced data analysis and reporting in virtual learning environments. Electronics and Electrical Engineering, 114(8), 111–116.
  39. Reeves, T. C., Benson, L., Elliott, D., Grant, M., Holschuh, D., Kim, B., … Loh, S. (2002). Usability and instructional design heuristics for e-learning evaluation.
  40. Romero, C., Espejo, P. G., Zafra, A., Romero, J. R., & Ventura, S. (2013). Web usage mining for predicting final marks of students that use Moodle courses. Computer Applications in Engineering Education, 21(1), 135–146.
  41. Santos, J. L., Govaerts, S., Verbert, K., & Duval, E. (2012). Goal-oriented visualizations of activity tracking: A case study with engineering students. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge.
  42. Santos, J. L., Verbert, K., Govaerts, S., & Duval, E. (2013). Addressing learner issues with StepUp!: An evaluation. In Proceedings of the Third International Conference on Learning Analytics and Knowledge.
  43. Scheuer, O., & Zinn, C. (2007). How did the e-learning session go? The Student Inspector. Frontiers in Artificial Intelligence and Applications, 158, 487.
  44. Schumacher, C., & Ifenthaler, D. (2018). Features students really expect from learning analytics. Computers in Human Behavior, 78, 397–407.
  45. Sedrakyan, G., Malmberg, J., Verbert, K., Järvelä, S., & Kirschner, P. A. (2018). Linking learning behavior analytics and learning science concepts: Designing a learning analytics dashboard for feedback to support learning regulation. Computers in Human Behavior.
  46. Thaler, R. H., & Sunstein, C. R. (2008). Nudge: Improving decisions about health, wealth, and happiness. New York: Springer.
  47. Upton, K., & Kay, J. (2009). Narcissus: Group and individual models to support small group work. In F. Ricci (Ed.), User modeling, adaptation, and personalization (pp. 54–65). New York: Springer.
  48. Verbert, K., Duval, E., Klerkx, J., Govaerts, S., & Santos, J. L. (2013). Learning analytics dashboard applications. American Behavioral Scientist, 57(10), 1500–1509.
  49. Wong, G. K. (2016). The behavioral intentions of Hong Kong primary teachers in adopting educational technology. Educational Technology Research and Development, 64(2), 313–338.
  50. Yoo, Y., Lee, H., Jo, I., & Park, Y. (2014). Educational dashboards for smart learning: Review of case studies. Paper presented at the International Conference on Smart Learning Environments 2014, Hong Kong.
  51. Yu, T., & Jo, I. H. (2014). Educational technology approach toward learning analytics: Relationship between student online behavior and learning performance in higher education. In Proceedings of the Fourth International Conference on Learning Analytics and Knowledge (pp. 269–270). ACM.
  52. Zuboff, S. (1988). In the age of the smart machine: The future of work and power. New York: Basic Books.

Copyright information

© Association for Educational Communications and Technology 2019

Authors and Affiliations

  1. Center for Teaching and Learning, Honam University, Kwangju, South Korea
  2. Department of Educational Technology, College of Education, Ewha Womans University, Seoul, South Korea
