Audience Response Systems Reimagined

Conference paper

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 11841)

Abstract

Audience response systems (ARS) allow lecturers to run quizzes in large classes by delegating the time-consuming tasks of collecting and aggregating students’ answers to technology. ARSs provide immediate feedback to lecturers and students alike. The first commercial ARSs emerged in the 1990s in the form of clickers, i.e., transmitters equipped with a number of buttons. This form factor restricts the possible questions: most often, only multiple-choice and numerical answers are supported.

Starting in the early 2010s, the ubiquity of smartphones, laptops, and tablet computers paved the way for web-based ARSs. Although these run on technology that offers richer means of input and a graphical display, they still have much in common with their precursors: while more question types besides multiple choice are supported, the capabilities of web-based technology remain largely unexploited. Furthermore, they do not adapt to a student’s needs and knowledge, and they often restrict quizzes to two phases: answering a question and viewing the results.

This article first examines the current state of web-based ARSs: question types found in current ARSs are identified, and their support in a variety of ARSs is examined. Afterwards, three axes along which ARSs should advance in the future are introduced: means of input, adaptation to students, and support for multiple phases. Each axis is illustrated with concrete examples of quizzes, and a small data-model sketch for multi-phase quizzes is given below.
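To make the third axis more tangible, the following TypeScript sketch models a quiz whose lifecycle is not limited to the usual two phases. It is a hypothetical illustration only: all type and field names are assumptions introduced here and do not reflect an API or data model described in the paper.

```typescript
// Hypothetical sketch (not the paper's API): a minimal data model for a
// multi-phase quiz, illustrating the axis "support for multiple phases".

// Question types commonly found in web-based ARSs.
type QuestionType = "multiple-choice" | "numerical" | "free-text" | "drawing";

// A single phase of a quiz; clickers and most web-based ARSs only offer
// "answer" followed by "results".
interface Phase {
  kind: "answer" | "peer-review" | "revise" | "results";
  durationSeconds: number;
}

interface Question {
  prompt: string;
  type: QuestionType;
  options?: string[]; // only meaningful for multiple-choice questions
}

interface Quiz {
  question: Question;
  phases: Phase[]; // executed in order by the ARS
}

// Example: a quiz that inserts a peer-review phase between answering and
// viewing the results.
const drawingQuiz: Quiz = {
  question: { prompt: "Sketch the Lewis structure of CO2.", type: "drawing" },
  phases: [
    { kind: "answer", durationSeconds: 180 },
    { kind: "peer-review", durationSeconds: 120 },
    { kind: "results", durationSeconds: 60 },
  ],
};
```

A richer phase vocabulary like this is what distinguishes the envisioned systems from the two-phase quizzes the abstract criticizes; the concrete phase names used here are only placeholders.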



Author information

Correspondence to Sebastian Mader.

Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Mader, S., Bry, F. (2019). Audience Response Systems Reimagined. In: Herzog, M., Kubincová, Z., Han, P., Temperini, M. (eds.) Advances in Web-Based Learning – ICWL 2019. Lecture Notes in Computer Science, vol. 11841. Springer, Cham. https://doi.org/10.1007/978-3-030-35758-0_19

  • DOI: https://doi.org/10.1007/978-3-030-35758-0_19

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-35757-3

  • Online ISBN: 978-3-030-35758-0

  • eBook Packages: Computer Science, Computer Science (R0)
