
Education and Information Technologies, Volume 24, Issue 2, pp 1147–1171

Engineering assessment strata: A layered approach to evaluation spanning Bloom’s taxonomy of learning

  • Ronald F. DeMara
  • Tian Tian
  • Wendy Howard

Abstract

Fostering metacognition can be challenging in large-enrollment settings, particularly within STEM fields that concentrate on problem-solving skills and their underlying theories. Herein, the research problem of realizing more frequent, insightful, and explicitly rewarded metacognition activities at significant scale is investigated via a strategy utilizing a hierarchy of assessments. Referred to as the STEM-Optimal Digitized Assessment Strategy (SODAS), this targeted approach engages frequent assessment, instructor feedback, and learner self-reflection across the hierarchy of learning mechanisms comprising Bloom’s Taxonomy of Learning Domains. SODAS spans this hierarchy via a progression of (i) unregulated online assessment, (ii) proctored Computer-Based Assessment (CBA), (iii) problem-based learning activities assessed in the laboratory setting, and (iv) personalized Socratic discussions of scanned scrap sheets that accompanied each learner’s machine-graded formative assessments. Results of a case study integrating SODAS within a high-enrollment Mechanical Engineering Heat Transfer course at a large state university are presented for an enrollment of 118 students. Six question types were delivered via lockdown proctored testing with auto-grading within the Canvas Learning Management System (LMS), along with bi-weekly laboratory activities to address the higher layers of Bloom’s Taxonomy. Sample assessment formats and schedules of instructor responsibilities were validated through student use across four tiers of assessment levels (facts, concepts, procedures, and metacognition), two testing delivery mechanisms (electronic textbook exercises and proctored CBA), and three remediation mechanisms (self-paced, score clarification, and experiment clarification). The results showed that learning achievement can increase by up to 16.9% compared to conventional assessment strategies, while utilizing comparable instructor resources and workloads.
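For illustration only, the layering described in the abstract can be summarized as a mapping from Bloom-aligned assessment tiers to delivery and remediation mechanisms. The following minimal Python sketch is a hypothetical rendering of that structure; the tier-to-mechanism pairings are inferred from the abstract's lists, not the authors' published mapping, and all names below are invented for this sketch.

# Illustrative sketch only: the SODAS layering described in the abstract,
# expressed as a mapping from Bloom-aligned assessment tiers to delivery
# and remediation mechanisms. The pairings are assumptions inferred from
# the abstract, not the authors' implementation.
from dataclasses import dataclass

@dataclass
class AssessmentTier:
    level: str        # Bloom-aligned tier named in the abstract
    delivery: str     # delivery mechanism (assumed pairing)
    remediation: str  # remediation mechanism (assumed pairing)

SODAS_TIERS = [
    AssessmentTier("facts", "unregulated online assessment (e-textbook exercises)", "self-paced"),
    AssessmentTier("concepts", "proctored Computer-Based Assessment (CBA)", "score clarification"),
    AssessmentTier("procedures", "problem-based laboratory activities", "experiment clarification"),
    AssessmentTier("metacognition", "Socratic discussion of scanned scrap sheets", "instructor dialogue (assumed)"),
]

# Print the progression from lower to higher layers of Bloom's Taxonomy.
for tier in SODAS_TIERS:
    print(f"{tier.level:>13}: {tier.delivery} -> {tier.remediation}")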

Keywords

STEM education · Degree productivity and quality · Computer-based assessment · Rapid remediation · Asynchronous testing · Lockdown proctored assessment

Acknowledgements

The authors acknowledge the facilities, equipment, and support of the UCF College of Engineering and Computer Science, and the State University System of Florida’s Information Technology Program Performance Initiative.


Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  1. Department of Electrical and Computer Engineering, University of Central Florida, Orlando, USA
  2. Department of Mechanical and Aerospace Engineering, University of Central Florida, Orlando, USA
  3. Pegasus Innovation Lab, University of Central Florida, Orlando, USA