An Architectural Perspective of Learning Analytics

  • Arvind W. Kiwelekar (corresponding author)
  • Manjushree D. Laddha
  • Laxman D. Netak
  • Sanil Gandhi
Chapter
Part of the Intelligent Systems Reference Library book series (ISRL, volume 158)

Abstract

Tools for learning analytics are becoming essential features of Learning Management Systems (LMS) and other course delivery platforms. These tools collect data from online learning platforms, analyze the collected data, and present the extracted information in a visually appealing manner. Representing the design-level concerns of such tools is one of the significant challenges faced by software developers. One way of overcoming this challenge is to adopt architectural perspectives, a mechanism used by software architects to capture high-level design concerns. In this chapter, we present an architectural perspective of learning analytics tools and components. The primary objective of the chapter is to describe the functional elements and non-functional properties supported by such tools. Further, the chapter describes various techniques for realizing these functional elements and non-functional properties. Such an architectural perspective is useful in two ways. First, the design knowledge it captures helps communicate the design and implementation of a learning-analytics-based system. Second, architectural perspectives can be used to evaluate how well a tool's design achieves its stated goals.
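
To make the collect-analyze-present structure mentioned in the abstract concrete, the following is a minimal Python sketch of such a pipeline. The component names (EventCollector, EngagementAnalyzer, DashboardPresenter) and the sample event fields are hypothetical illustrations of the three functional roles, not the chapter's actual design.

```python
from collections import defaultdict
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class LearningEvent:
    """A single interaction logged by an online learning platform (hypothetical schema)."""
    student_id: str
    activity: str          # e.g. "quiz_attempt", "video_view"
    duration_minutes: float


class EventCollector:
    """Functional element 1: gather raw events from the learning platform."""
    def __init__(self) -> None:
        self.events: List[LearningEvent] = []

    def collect(self, event: LearningEvent) -> None:
        self.events.append(event)


class EngagementAnalyzer:
    """Functional element 2: turn raw events into per-student engagement metrics."""
    def analyze(self, events: List[LearningEvent]) -> Dict[str, float]:
        time_spent: Dict[str, float] = defaultdict(float)
        for event in events:
            time_spent[event.student_id] += event.duration_minutes
        return dict(time_spent)


class DashboardPresenter:
    """Functional element 3: present the extracted information (here, a plain-text report)."""
    def present(self, metrics: Dict[str, float]) -> None:
        for student, minutes in sorted(metrics.items()):
            print(f"{student}: {minutes:.1f} minutes of activity")


if __name__ == "__main__":
    collector = EventCollector()
    collector.collect(LearningEvent("s001", "video_view", 12.5))
    collector.collect(LearningEvent("s001", "quiz_attempt", 8.0))
    collector.collect(LearningEvent("s002", "video_view", 5.0))

    metrics = EngagementAnalyzer().analyze(collector.events)
    DashboardPresenter().present(metrics)
```

In a real system each role would typically be a separately deployable component (for example, a collection service, an analytics service, and a dashboard), which is the level at which the architectural perspective captures functional elements and non-functional properties such as privacy and scalability.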

Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Arvind W. Kiwelekar¹ (corresponding author)
  • Manjushree D. Laddha¹
  • Laxman D. Netak¹
  • Sanil Gandhi¹

  1. Department of Computer Engineering, Dr. Babasaheb Ambedkar Technological University, Lonere, Raigad, India
