Understanding the Role of Time on Task in Formative Assessment: The Case of Mathematics Learning

  • Conference paper

In: Computer Assisted Assessment. Research into E-Assessment (TEA 2015)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 571)

Abstract

Mastery data derived from formative assessments constitute a rich data set for developing student performance prediction models. Previous research by the authors in a dispositional learning analytics context [13] found that formative assessment mastery data dominate use-intensity data, such as time on task or number of clicks, as predictors. The practical implications of these findings are far-reaching, contradicting the current practice of building (learning analytics based) student performance prediction models around intensity data as central predictor variables. In this empirical follow-up study, using data from 2011 students, we search for an explanation of why time on task data are dominated by mastery data. We do so by investigating more general models that allow for nonlinear, even non-monotonic, relationships between time on task and performance measures. Clustering students into subsamples with different time on task characteristics suggests that heterogeneity of the sample is an important cause of the nonlinear relationships with performance measures. Time on task data appear to be more sensitive to the effects of heterogeneity than mastery data, providing a further argument to prioritize formative assessment mastery data as predictor variables in the design of prediction models directed at the generation of learning feedback.
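The heterogeneity mechanism described in the abstract (within each subsample more time on task accompanies better performance, while the pooled relationship across subsamples reverses) can be illustrated with a small simulation. This is a minimal sketch on synthetic data with hypothetical subsample labels and parameters; it is not the authors' model or data.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500  # students per subsample (hypothetical)

# Hypothetical "proficient" subsample: little time on task, high scores.
t_prof = rng.normal(10, 2, n)                      # hours on task
s_prof = 70 + 1.5 * t_prof + rng.normal(0, 5, n)   # performance score

# Hypothetical "struggling" subsample: much more time on task, lower scores.
t_strug = rng.normal(25, 2, n)
s_strug = 30 + 1.5 * t_strug + rng.normal(0, 5, n)

def corr(x, y):
    """Pearson correlation coefficient."""
    return float(np.corrcoef(x, y)[0, 1])

r_proficient = corr(t_prof, s_prof)
r_struggling = corr(t_strug, s_strug)
r_pooled = corr(np.concatenate([t_prof, t_strug]),
                np.concatenate([s_prof, s_strug]))

# Within each subsample, more time on task goes with higher performance,
# yet pooling the heterogeneous sample reverses the sign of the relation,
# making the aggregate time-performance relationship non-monotonic.
print(f"proficient: r = {r_proficient:+.2f}")
print(f"struggling: r = {r_struggling:+.2f}")
print(f"pooled:     r = {r_pooled:+.2f}")
```

A linear prediction model fitted to the pooled sample would thus misread the role of time on task entirely, which is consistent with the paper's argument for clustering students before interpreting intensity data.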

References

  1. Tempelaar, D.T., Rienties, B., Giesbers, B.: Computer assisted, formative assessment and dispositional learning analytics in learning mathematics and statistics. In: Kalz, M., Ras, E. (eds.) CAA 2014. CCIS, vol. 439, pp. 67–78. Springer, Heidelberg (2014). doi:10.1007/978-3-319-08657-6_7

  2. Tempelaar, D.T., Rienties, B., Giesbers, B.: In search for the most informative data for feedback generation: learning analytics in a data-rich context. Comput. Hum. Behav. (Spec. Issue Learn. Analytics) (2015). doi:10.1016/j.chb.2014.05.038

  3. Tempelaar, D.T., Cuypers, H., Van de Vrie, E.M., Heck, A., Van der Kooij, H.: Formative assessment and learning analytics. In: Proceedings LAK 2013: 3rd International Conference on Learning Analytics and Knowledge, pp. 205–209. ACM Press, New York (2013). doi:10.1145/2460296.2460337

  4. Calvert, C.E.: Developing a model and applications for probabilities of student success: a case study of predictive analytics. Open Learn. J. Open Distance e-Learn. 29(2), 160–173 (2014). doi:10.1080/02680513.2014.931805

  5. Kuzilek, J., Hlosta, M., Herrmannova, D., Zdrahal, Z., Wolff, A.: OU Analyse: Analysing at-risk students at The Open University, Learning Analytics Review, Paper LAK15-1, ISSN 2057-7494, March 2015. http://www.laceproject.eu/learning-analytics-review/analysing-at-risk-students-at-open-university/

  6. Buckingham Shum, S., Deakin Crick, R.: Learning dispositions and transferable competencies: pedagogy, modelling and learning analytics. In: Proceedings LAK 2012: 2nd International Conference on Learning Analytics and Knowledge, pp. 92–101. ACM Press, New York (2012)

  7. Pachler, N., Mellar, H., Daly, C., Mor, Y., Wiliam, D.: Scoping a vision for formative e-assessment: a project report for JISC, version 2, April (2009). http://www.wlecentre.ac.uk/cms/files/projectreports/scoping_a_vision_for_formative_e-assessment_version_2.0.pdf

  8. Black, P., Wiliam, D.: Developing the theory of formative assessment. Assess. Eval. Accountability 21(1), 5–31 (2009)

  9. Tempelaar, D.T., Kuperus, B., Cuypers, H., Van der Kooij, H., Van de Vrie, E., Heck, A.: The role of digital, formative testing in e-Learning for mathematics: a case study in the Netherlands. In: “Mathematical e-learning” [online dossier]. Universities and Knowledge Society Journal (RUSC), vol. 9, no. 1, UoC (2012). doi:10.7238/rusc.v9i1.1272

  10. Verbert, K., Manouselis, N., Drachsler, H., Duval, E.: Dataset-driven research to support learning and knowledge analytics. Educ. Technol. Soc. 15(3), 133–148 (2012)

  11. Rienties, B., Toetenel, L., Bryan, A.: “Scaling up” learning design: impact of learning design activities on LMS behavior and performance. In: Proceedings LAK 2015: 5th International Conference on Learning Analytics and Knowledge, pp. 315–319. ACM Press, New York (2015). doi:10.1145/2723576.2723600

  12. Ferguson, R., Clow, D.: Examining engagement: analysing learner subpopulations in massive open online courses (MOOCs). In: Proceedings LAK 2015: 5th International Conference on Learning Analytics and Knowledge, pp. 51–58. ACM Press, New York (2015). doi:10.1145/2723576.2723606

  13. Tempelaar, D.T., Rienties, B., Giesbers, B.: Stability and sensitivity of Learning Analytics based prediction models. In: Helfert, M., Restivo, M.T., Zvacek, S., Uhomoibhi, J. (eds.) Proceedings CSEDU 2015, 7th International Conference on Computer Supported Education, vol. 1, pp. 156–166. SCITEPRESS, Lisbon (2015)

  14. Narciss, S.: Feedback strategies for interactive learning tasks. In: Spector, J.M., Merrill, M.D., van Merrienboer, J.J.G., Driscoll, M.P. (eds.) Handbook of Research on Educational Communications and Technology, 3rd edn, pp. 125–144. Lawrence Erlbaum Associates, Mahwah (2008)

  15. Narciss, S., Huth, K.: Fostering achievement and motivation with bug-related tutoring feedback in a computer-based training on written subtraction. Learn. Instr. 16, 310–322 (2006)

  16. Martin, A.J.: Examining a multidimensional model of student motivation and engagement using a construct validation approach. Br. J. Educ. Psychol. 77, 413–440 (2007)

  17. Tempelaar, D.T., Niculescu, A., Rienties, B., Giesbers, B., Gijselaers, W.H.: How achievement emotions impact students’ decisions for online learning, and what precedes those emotions. Internet High. Educ. 15, 161–169 (2012). doi:10.1016/j.iheduc.2011.10.003

  18. Pekrun, R., Goetz, T., Frenzel, A.C., Barchfeld, P., Perry, R.P.: Measuring emotions in students’ learning and performance: the achievement emotions questionnaire (AEQ). Contemp. Educ. Psychol. 36, 36–48 (2011)

  19. Rienties, B., Rivers, B.A.: Measuring and understanding learner emotions: evidence and prospects. Learning Analytics Review, no. 1, December 2014, ISSN 2057-7494. http://www.laceproject.eu/learning-analytics-review/measuring-and-understanding-learner-emotions/

  20. Perry, R.P., Hladkyj, S., Pekrun, R.H., Clifton, R.A., Chipperfield, J.G.: Perceived academic control and failure in college students: a three-year study of scholastic attainment. Res. High. Educ. 46, 535–569 (2005)

  21. Babaali, P., Gonzalez, L.: A quantitative analysis of the relationship between an online homework system and student achievement in pre-calculus. Int. J. Math. Educ. Sci. Technol. (2015). doi:10.1080/0020739X.2014.997318

Acknowledgements

The project reported here has been supported and co-financed by SURF-foundation as part of the Learning Analytics Stimulus and the Testing and Test-Driven Learning programs.

Author information

Correspondence to Dirk T. Tempelaar.

Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Tempelaar, D.T., Rienties, B., Giesbers, B. (2015). Understanding the Role of Time on Task in Formative Assessment: The Case of Mathematics Learning. In: Ras, E., Joosten-ten Brinke, D. (eds) Computer Assisted Assessment. Research into E-Assessment. TEA 2015. Communications in Computer and Information Science, vol 571. Springer, Cham. https://doi.org/10.1007/978-3-319-27704-2_12

  • DOI: https://doi.org/10.1007/978-3-319-27704-2_12

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-27703-5

  • Online ISBN: 978-3-319-27704-2

  • eBook Packages: Computer Science, Computer Science (R0)
