The Importance of E-Portfolios for Effective Student-Facing Learning Analytics

  • Cath Ellis
Chapter

Abstract

The field of Academic Analytics offers considerable potential to Higher Education institutions (HEIs), the academic staff who work for them and, most importantly, the students they teach. This approach to data-led decision-making is starting to have an influence and impact on what is arguably the core business of Higher Education: student learning. As well as being nascent, Learning Analytics is, potentially at least, a very broad area of inquiry and development; the field therefore necessarily has significant gaps. It is also just one of a large number of changes and developments that are affecting the way Higher Education operates. These changes include the introduction of standards-based assessment and outcomes-based education, and the identification and warranting of core competencies and capabilities of university graduates. It is also happening at a time when the affordances of a wide variety of eLearning tools are introducing new possibilities and opportunities to the pedagogy of Higher Education in ways that are demonstrably challenging traditional approaches to teaching and learning, something Sharpe and Oliver famously refer to as the 'trojan mouse' (Sharpe and Oliver 2007, p. 49). This chapter considers the role that one such eLearning tool—the e-portfolio—can play in the implementation of a student-facing Learning Analytics strategy in this ambitious new approach to conceptualising, facilitating, structuring, supporting and assuring student learning achievement.

Keywords

Learning Analytics · Assessment Analytics · E-portfolios · Assessment and feedback · Self-regulated learning

References

  1. Avis, J. (2000). Policing the subject: Learning outcomes, managerialism and research in PCET. British Journal of Educational Studies, 48(1), 38–57. doi: 10.1111/1467-8527.00132
  2. Aviv, R., Erlich, Z., Ravid, G., & Geva, A. (2003). Network analysis of knowledge construction in asynchronous learning networks. Journal of Asynchronous Learning Networks, 7(3), 1–23.
  3. Bach, C. (2010). Learning analytics: Targeting instruction, curricula and student support. Office of the Provost: Drexel University.
  4. Biggs, J. (1996). Enhancing teaching through constructive alignment. Higher Education, 32(3), 347–364.
  5. Biggs, J. (1999). What the student does: Teaching for enhanced learning. Higher Education Research & Development, 18(1), 57–75.
  6. Boud, D., & Molloy, E. (2013). Feedback in higher and professional education: Understanding it and doing it well. London and New York: Routledge.
  7. Buckley, S., Coleman, J., Davison, I., Khan, K. S., Zamora, J., Malick, S., … & Sayers, J. (2009). The educational effects of portfolios on undergraduate student learning: A Best Evidence Medical Education (BEME) systematic review. BEME guide no. 11. Medical Teacher, 31(4), 282–298. doi: 10.1080/01421590902889897
  8. Campbell, J., & Oblinger, D. (2007). Academic analytics. EDUCAUSE Centre for Applied Research. Retrieved from http://connect.educause.edu/library/abstract/AcademicAnalytics/45275
  9. Clegg, S. (2011). Cultural capital and agency: Connecting critique and curriculum in higher education. British Journal of Sociology of Education, 32(1), 93–108. doi: 10.1080/01425692.2011.527723
  10. Dawson, S., & McWilliam, E. (2008). Investigating the application of IT generated data as an indicator of learning and teaching performance. Australian Learning and Teaching Council. Retrieved from http://olt.ubc.ca/learning_tools/research_1/research/
  11. De Laat, M., Lally, V., Lipponen, L., & Simons, R. J. (2006). Analysing student engagement with learning and tutoring activities in networked learning communities: A multi-method approach. International Journal of Web Based Communities, 2(4), 394–412.
  12. Drachsler, H., Bogers, T., Vuorikari, R., Verbert, K., Duval, E., Manouselis, N., … others. (2010). Issues and considerations regarding sharable data sets for recommender systems in technology enhanced learning. Procedia Computer Science, 1(2), 2849–2858.
  13. Ellis, C. (2013). Broadening the scope and increasing the usefulness of learning analytics: The case for assessment analytics. British Journal of Educational Technology, 44(4), 662–664. doi: 10.1111/bjet.12028
  14. Ferguson, R. (2012). The state of learning analytics in 2012: A review and future challenges (technical report). UK: Knowledge Media Institute, The Open University.
  15. Turnitin. (n.d.). Guides.turnitin.com. Retrieved April 22, 2016, from https://guides.turnitin.com/
  16. Hughes, J. (2008). Letting in the Trojan mouse: Using an e-portfolio system to re-think pedagogy. Retrieved from https://wlv.openrepository.com/wlv/handle/2436/47434
  17. Johnson, L., Smith, R., Willis, H., Levine, A., & Haywood, K. (2011). The 2011 Horizon report. Austin, Texas: The New Media Consortium.
  18. LAK. (n.d.). First international conference on learning analytics and knowledge 2011 (conference). Retrieved from https://tekri.athabascau.ca/analytics/call-papers
  19. Martinez-Maldonado, R., Pardo, A., Mirriahi, N., Yacef, K., Kay, J., & Clayphan, A. (2015). The LATUX workflow: Designing and deploying awareness tools in technology-enabled learning settings. In Proceedings of the Fifth International Conference on Learning Analytics and Knowledge (pp. 1–10). ACM. Retrieved from http://dl.acm.org/citation.cfm?id=2723583
  20. Martinez-Maldonado, R., Pardo, A., Mirriahi, N., Yacef, K., Kay, J., & Clayphan, A. (2016). LATUX: An iterative workflow for designing, validating and deploying learning analytics visualisations. Journal of Learning Analytics, 2(3), 9–39. doi: 10.18608/jla.2015.23.3
  21. Manyika, J., Chui, M., Brown, B., Bughin, J., Dobbs, R., Roxburgh, C., & Byers, A. H. (n.d.). Big data: The next frontier for innovation, competition, and productivity. McKinsey Global Institute. Retrieved April 22, 2012, from http://www.mckinsey.com/Insights/MGI/Research/Technology_and_Innovation/Big_data_The_next_frontier_for_innovation
  22. Maton, K. (2009). Cumulative and segmented learning: Exploring the role of curriculum structures in knowledge-building. British Journal of Sociology of Education, 30(1), 43–57. doi: 10.1080/01425690802514342
  23. Moon, J. A. (2007). Learning journals (2nd ed.). Taylor & Francis.
  24. Sadler, D. R. (2007). Perils in the meticulous specification of goals and assessment criteria. Assessment in Education, 14(3), 387–392. doi: 10.1080/09695940701592097
  25. Sadler, D. R. (2009a). Grade integrity and the representation of academic achievement. Studies in Higher Education, 34(7), 807–826. doi: 10.1080/03075070802706553
  26. Sadler, D. R. (2009b). Indeterminacy in the use of preset criteria for assessment and grading. Assessment & Evaluation in Higher Education, 34(2), 159–179. doi: 10.1080/02602930801956059
  27. Sadler, D. R. (2010a). Beyond feedback: Developing student capability in complex appraisal. Assessment & Evaluation in Higher Education, 35(5), 535–550. doi: 10.1080/02602930903541015
  28. Sadler, D. R. (2010b). Fidelity as a precondition for integrity in grading academic achievement. Assessment & Evaluation in Higher Education, 35(6), 727–743. doi: 10.1080/02602930902977756
  29. Sadler, D. R. (2015). Backwards assessment explanations: Implications for teaching and assessment practice. Springer International Publishing. Retrieved from http://link.springer.com/chapter/10.1007/978-3-319-10274-0_2
  30. Sharpe, R., & Oliver, M. (2007). Designing courses for e-learning. In Rethinking pedagogy for a digital age: Designing and delivering e-learning (pp. 41–51).
  31. Siemens, G., Gasevic, D., Haythornthwaite, C., Dawson, S., Buckingham Shum, S., Ferguson, R., … Baker, R. S. J. d. (2011). Open learning analytics: An integrated and modularized platform. Society for Learning Analytics Research. Retrieved from http://solaresearch.org/OpenLearningAnalytics.pdf
  32. Taras, M. (2001). The use of tutor feedback and student self-assessment in summative assessment tasks: Towards transparency for students and for tutors. Assessment & Evaluation in Higher Education, 26(6), 605–614.
  33. Van Tartwijk, J., & Driessen, E. W. (2009). Portfolios for assessment and learning: AMEE guide no. 45. Medical Teacher, 31(9), 790–801. doi: 10.1080/01421590903139201
  34. Verbert, K., Drachsler, H., Manouselis, N., Wolpers, M., Vuorikari, R., & Duval, E. (2011). Dataset-driven research for improving recommender systems for learning. In Proceedings of the First International Conference on Learning Analytics and Knowledge (LAK 2011). Retrieved from http://dl.acm.org/ft_gateway.cfm?id=2090122&type=pdf

Copyright information

© Springer Nature Singapore Pte Ltd. 2017

Authors and Affiliations

  1. University of New South Wales, Sydney, Australia