
Lessons Learned from a Calculus E-Learning System for First-Year University Students with Diverse Mathematics Backgrounds

A chapter in Distance Learning, E-Learning and Blended Learning in Mathematics Education, part of the book series ICME-13 Monographs.
Abstract

First-year science majors at The University of Hong Kong have different levels of proficiency in mathematics, with a significant proportion lacking the necessary calculus background for a compulsory freshman science foundation course. A supplementary calculus e-learning platform was implemented so that students lacking the prerequisite could gain the necessary knowledge and skills at their own pace. This chapter presents quantitative and qualitative analyses of the learning analytics, including the behavior as well as the achievements of the users. Pretest and posttest results are used to assess the effectiveness of the platform. Questionnaires completed by the users are utilized to explore aspects for improvement. We hope this study can stimulate discussions on the assessment of e-learning, as well as shed light on the factors contributing to the efficiency and effectiveness of similar platforms.


Notes

  1.

    The t-test allowing for unequal variances (Welch's t-test) is used because an F-test comparing the variances of the pretest scores of the two groups suggests that the variances cannot be treated as equal at the 1% significance level.

  2.

    Due to limitations in the user log, we define “watching a video” as clicking the video link; the assumption is that a user watches the video once per click. Unfortunately, this cannot capture the case where a user stays on the same page and watches a video repeatedly.

  3.

    For example, if a user had clicked on the link of one particular video n times throughout the semester, his/her qvideo would be n while cvideo would only be 1.

  4.

    Average frequency refers to the number of times a student watched the same video or submitted the same quiz. For instance, if a student has watched 10 different videos (cvideo = 10) with a total view count of 30 (qvideo = 30), then his/her average frequency of watching videos is defined as qvideo/cvideo = 3.

  5.

    Regarding column (1) of Table 5.4, the same conclusion can still be reached if the independent variable is students’ self-reported calculus background (rBackground) instead of their pretest score.

  6.

    Originally, we also wanted to analyze whether and how students’ prior knowledge affects the substitute/complement relationship between the videos and the quizzes, so the interaction effect between cvideo and pretest was initially included in the full model. At the 5% significance level, the interaction effect was not statistically significant; in other words, the relationship between quizzes and videos is not affected by a student’s calculus background. In columns (8) and (9), therefore, only the main effects of cvideo and pretest are included.

  7.

    For instance, students may have managed to find practice questions from other sources.

  8.

    Defined as assigning a rating of 3, 4, or 5 to the statement “the quizzes aligned closely with the content of the videos” on a Likert scale of 1–5, where 1 means “strongly disagree” and 5 means “strongly agree”.

  9.

    Only 16% and 14% of the students, respectively, disagree or strongly disagree with the statement “Compared with the midterm/final examination, the quizzes were too easy”.

  10.

    For instance, in the course dimension part of the survey, an optional question asked “What gave you a better learning experience than the e-learning platform in Calculus learning, if any?”.

  11.

    In the quiz part of the survey, a question asked whether there were “any other reasons that stopped/prevented you (the students) from working on the quizzes”. A number of respondents claimed that the quizzes were time-consuming or that they did not have enough time for revision.

  12.

    The average is defined as the total number of videos/quizzes accessed each day divided by the total number of students with a given mathematical background; 71 students reported having no prior calculus background, and 154 reported having one.

  13.

    This is observed in students’ responses to the question “I would suggest the following changes for improvement:”. In addition, 88.5% of the respondents agreed with the proposal that “the instructors should suggest a timeline that we finish certain modules of the e-learning platform”.
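The statistical procedure in Note 1 (an F-ratio test on the pretest variances, followed by a t-test that does not assume equal variances, i.e. Welch's test) can be sketched as follows. The scores below are made up for illustration; the chapter's actual data are not reproduced here.

```python
from math import sqrt
from statistics import mean, variance

def f_ratio(a, b):
    """Ratio of the sample variances (larger over smaller), as used in an F-test."""
    va, vb = variance(a), variance(b)
    return max(va, vb) / min(va, vb)

def welch_t(a, b):
    """Welch's t-statistic: a two-sample t-test allowing unequal variances."""
    na, nb = len(a), len(b)
    se = sqrt(variance(a) / na + variance(b) / nb)
    return (mean(a) - mean(b)) / se

# Hypothetical pretest scores for the two groups (illustration only).
group_1 = [1, 2, 3, 4, 5]
group_2 = [2, 4, 6, 8, 10, 12]

print(round(f_ratio(group_1, group_2), 2))  # 5.6
print(round(welch_t(group_1, group_2), 2))  # -2.38
```

If the F-ratio is large enough to reject equal variances at the 1% level, the Welch statistic (with Welch–Satterthwaite degrees of freedom, omitted here for brevity) is the appropriate comparison, as the authors note.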
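Notes 2–4 define the usage measures: qvideo counts every click on a video link, cvideo counts distinct videos clicked, and their ratio is the average viewing frequency. A minimal sketch of how these could be derived from a click log; the log format and field names are assumptions, not the platform's actual schema.

```python
from collections import defaultdict

# Hypothetical click log: one (student_id, video_id) pair per link click.
clicks = [
    ("s01", "limits"), ("s01", "limits"), ("s01", "chain_rule"),
    ("s01", "limits"), ("s02", "chain_rule"),
]

def usage_stats(log):
    """Per student: qvideo (total clicks), cvideo (distinct videos clicked),
    and average frequency qvideo / cvideo, as defined in Notes 2-4."""
    per_student = defaultdict(list)
    for student, video in log:
        per_student[student].append(video)
    stats = {}
    for student, videos in per_student.items():
        q, c = len(videos), len(set(videos))
        stats[student] = {"qvideo": q, "cvideo": c, "avg_freq": q / c}
    return stats

print(usage_stats(clicks)["s01"])  # {'qvideo': 4, 'cvideo': 2, 'avg_freq': 2.0}
```

Consistent with Note 2, repeated plays without re-clicking the link leave no trace in such a log, so qvideo is a lower bound on actual views.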


Acknowledgements

We would like to acknowledge here the efforts of other members of the SCNC1111 teaching team and the e-learning platform development team who are not on the author list.

Author information

Correspondence to Rachel Ka Wai Lui.


Appendix

See Table 5.7.

Table 5.7 Questionnaire items


Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this chapter


Cite this chapter

Liang, L., Yeung, K., Lui, R.K.W., Cheung, W.M.Y., Lam, K.F. (2018). Lessons Learned from a Calculus E-Learning System for First-Year University Students with Diverse Mathematics Backgrounds. In: Silverman, J., Hoyos, V. (eds) Distance Learning, E-Learning and Blended Learning in Mathematics Education. ICME-13 Monographs. Springer, Cham. https://doi.org/10.1007/978-3-319-90790-1_5

Download citation

  • DOI: https://doi.org/10.1007/978-3-319-90790-1_5

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-90789-5

  • Online ISBN: 978-3-319-90790-1

  • eBook Packages: Education, Education (R0)
