
Gauging the Effectiveness of Educational Technology Integration in Education: What the Best-Quality Meta-Analyses Tell Us

Living reference work entry in: Learning, Design, and Technology

Abstract

This chapter examines the quantitative literature on technology integration in education from the perspective of meta-analyses of primary studies conducted between 1982 and 2015. Fifty-two meta-analyses were originally identified and evaluated for methodological quality using the Meta-Analysis Methodological Quality Review Guide (MMQRG; see the Appendix), and the best 20 were selected for review here. Some describe the effects of technology integration within specific content areas, while others are more general. Technology integration is one of the most fluid areas of educational research, reflecting the rapid evolution of computer-based tools and applications, and simply navigating the vast primary empirical literature challenges anyone interested in evaluating the educational effectiveness of technology. Systematic reviews in the field are numerous and diverse in methodological quality, introducing potential bias into the interpretation of findings (Bernard, Borokhovski, Schmid, & Tamim, 2014) and thus calling their applied value into question. In addition to overall statistical analyses of the full collection, the findings of the six best and most recent meta-analyses (published after 2010) are summarized in greater detail. The discussion focuses on interpreting the current findings, considers future alternatives to primary research in this area, and examines how meta-analysts might address them.


References

  • (References marked with an * are meta-analyses included in this review.)

  • *Bayraktar, S. (2000). A meta-analysis on the effectiveness of computer-assisted instruction in science education (Unpublished doctoral dissertation). Ohio University, Athens, OH.

  • Bernard, R. M. (2014). Things I have learned about meta-analysis since 1990: Reducing bias in search of ‘The Big Picture’. Canadian Journal of Learning and Technology, 40(3). Available from http://www.cjlt.ca/index.php/cjlt/issue/current

  • Bernard, R. M., Abrami, P. C., Borokhovski, E., Wade, A., Tamim, R., Surkes, M., … Bethel, E. C. (2009). A meta-analysis of three interaction treatments in distance education. Review of Educational Research, 79(3), 1243–1289. https://doi.org/10.3102/0034654309333844

  • Bernard, R. M., Borokhovski, E., Schmid, R. F., & Tamim, R. M. (2014). An exploration of bias in meta-analysis: The case of technology integration research in higher education. Journal of Computing in Higher Education, 26(3), 183–209. https://doi.org/10.1007/s12528-014-9084-z

  • Bethel, E. C., & Bernard, R. M. (2010). Developments and trends in synthesizing diverse forms of evidence: Beyond comparisons between distance education and classroom instruction. Distance Education, 31(3), 231–256. https://doi.org/10.1080/01587919.2010.513950

  • Borenstein, M., Hedges, L., Higgins, J., & Rothstein, H. (2009). Introduction to meta-analysis. Chichester, UK: Wiley.

  • Bushman, B. J., & Wang, M. C. (2009). Vote counting methods in meta-analysis. In H. M. Cooper, L. V. Hedges, & J. C. Valentine (Eds.), Handbook of research synthesis (2nd ed., pp. 207–220). New York, NY: Russell Sage Foundation.

  • *Cheung, A. C., & Slavin, R. E. (2012). How features of educational technology applications affect student reading outcomes: A meta-analysis. Educational Research Review, 7(3), 198–215. https://doi.org/10.1016/j.edurev.2012.05.002

  • *Cheung, A. C. K., & Slavin, R. E. (2013). The effectiveness of educational technology applications for enhancing mathematics achievement in K-12 classrooms: A meta-analysis. Educational Research Review, 9, 88–113. https://doi.org/10.1016/j.edurev.2013.01.001

  • Clark, R. E. (1983). Reconsidering research on learning from media. Review of Educational Research, 53(4), 445–459. https://doi.org/10.3102/00346543053004445

  • Clark, R. E. (1994). Media will never influence learning. Educational Technology Research and Development, 42(2), 21–29. https://doi.org/10.1007/BF02299088

  • Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Erlbaum.

  • Cooper, H. M. (2017). Research synthesis and meta-analysis: A step-by-step approach (5th ed.). Thousand Oaks, CA: Sage.

  • *D’Angelo, C., Rutstein, D., Harris, C., Bernard, R., Borokhovski, E., & Haertel, G. (2014). Simulations for STEM learning: Systematic review and meta-analysis. Retrieved from SRI International website: https://www.sri.com/sites/default/files/publications/simulations-for-stem-learning-full-report.pdf

  • Friedman, L. (2001). Why vote-count reviews don’t count. Biological Psychiatry, 49(2), 161–162. https://doi.org/10.1016/S0006-3223(00)01075-1

  • *Goldberg, A., Russell, M., & Cook, A. (2003). The effect of computers on student writing: A meta-analysis of studies from 1992 to 2002. Journal of Technology, Learning, and Assessment, 2(1). Retrieved from http://ejournals.bc.edu/ojs/index.php/jtla/article/view/1661/1503

  • *Grgurović, M., Chapelle, C. A., & Shelley, M. C. (2013). A meta-analysis of effectiveness studies on computer technology-supported language learning. ReCALL, 25(2), 165–198. https://doi.org/10.1017/S0958344013000013

  • Hattie, J. (2008). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. New York, NY: Routledge.

  • Hedges, L. V., & Olkin, I. (1980). Vote counting methods in research synthesis. Psychological Bulletin, 88(2), 359–369. https://doi.org/10.1037/0033-2909.88.2.359

  • Hedges, L. V., & Olkin, I. (1985). Statistical methods for meta-analysis. Orlando, FL: Academic Press.

  • *Hsu, Y. C. (2003). The effectiveness of computer-assisted instruction in statistics education: A meta-analysis (Unpublished doctoral dissertation). University of Arizona, Tucson, AZ.

  • Jackson, G. B. (1980). Methods for integrative reviews. Review of Educational Research, 50(3), 438–460. https://doi.org/10.3102/00346543050003438

  • Kozma, R. B. (1994). Will media influence learning? Reframing the debate. Educational Technology Research and Development, 42(2), 7–19. https://doi.org/10.1007/BF02299087

  • *Kuchler, J. M. (1998). The effectiveness of using computers to teach secondary school (grades 6–12) mathematics: A meta-analysis (Unpublished doctoral dissertation). University of Massachusetts, Lowell, MA.

  • *Lejeune, J. V. (2002). A meta-analysis of outcomes from the use of computer-simulated experiments in science education (Unpublished doctoral dissertation). Texas A&M University, College Station, TX.

  • *Lin, H. (2015). A meta-synthesis of empirical research on the effectiveness of computer-mediated communication (CMC) in SLA. Language Learning & Technology, 19(2), 85–117. Retrieved from http://llt.msu.edu/issues/june2015/lin.pdf

  • Lipsey, M. W., & Wilson, D. B. (2001). Practical meta-analysis. Thousand Oaks, CA: Sage.

  • *Michko, G. M. (2007). A meta-analysis of the effects of teaching and learning with technology on student outcomes in undergraduate engineering education (Unpublished doctoral dissertation). University of Houston, Houston, TX.

  • *Onuoha, C. O. (2007). Meta-analysis of the effectiveness of computer-based laboratory versus traditional hands-on laboratory in college and pre-college science instructions (Unpublished doctoral dissertation). Capella University, Minneapolis, MN.

  • Polanin, J. R., Maynard, B. R., & Dell, N. A. (2017). Overviews in educational research: A systematic review and analysis. Review of Educational Research, 87(1), 172–203. https://doi.org/10.3102/0034654316631117

  • Rothstein, H. R., Sutton, A. J., & Borenstein, M. (Eds.). (2005). Publication bias in meta-analysis: Prevention, assessment and adjustments. Chichester, UK: Wiley.

  • *Schenker, J. D. (2007). The effectiveness of technology use in statistics instruction in higher education: A meta-analysis using hierarchical linear modeling (Unpublished doctoral dissertation). Kent State University, Kent, OH.

  • *Schmid, R. F., Bernard, R. M., Borokhovski, E., Tamim, R. M., Abrami, P. C., Surkes, M. A., … Woods, J. (2014). The effects of technology use in postsecondary education: A meta-analysis of classroom applications. Computers & Education, 72, 271–291. https://doi.org/10.1016/j.compedu.2013.11.002

  • Sitzmann, T. (2011). A meta-analytic examination of the instructional effectiveness of computer-based simulation games. Personnel Psychology, 64(2), 489–528. https://doi.org/10.1111/j.1744-6570.2011.01190.x

  • *Sosa, G. W., Berger, D. E., Saw, A. T., & Mary, J. C. (2011). Effectiveness of computer-assisted instruction in statistics: A meta-analysis. Review of Educational Research, 81(1), 97–128. https://doi.org/10.3102/0034654310378174

  • *Takacs, Z. K., Swart, E. K., & Bus, A. G. (2015). Benefits and pitfalls of multimedia and interactive features in technology-enhanced storybooks: A meta-analysis. Review of Educational Research, 85(4), 698–739. https://doi.org/10.3102/0034654314566989

  • Tamim, R. M., Bernard, R. M., Borokhovski, E., Abrami, P. C., & Schmid, R. F. (2011). What forty years of research says about the impact of technology on learning: A second-order meta-analysis and validation study. Review of Educational Research, 81(1), 4–28. https://doi.org/10.3102/0034654310393361

  • Tamim, R. M., Borokhovski, E., Bernard, R. M., Schmid, R. F., & Abrami, P. C. (2015, April). A methodological quality tool for meta-analysis: The case of the educational technology literature. Paper presented to the Systematic Review and Meta-Analysis SIG at the 2015 meeting of the American Educational Research Association (AERA), Chicago, IL.

  • *Timmerman, C. E., & Kruepke, K. A. (2006). Computer-assisted instruction, media richness, and college student performance. Communication Education, 55(1), 73–104. https://doi.org/10.1080/03634520500489666

  • *Torgerson, C. J., & Elbourne, D. (2002). A systematic review and meta-analysis of the effectiveness of information and communication technology (ICT) on the teaching of spelling. Journal of Research in Reading, 25(2), 129–143. https://doi.org/10.1111/1467-9817.00164

  • *Yaakub, M. N. (1998). Meta-analysis of the effectiveness of computer-assisted instruction in technical education and training (Unpublished doctoral dissertation). Virginia Polytechnic Institute and State University, Blacksburg, VA.

  • Yettick, H. (2016). Five simple steps for reading policy research. Boulder, CO: National Education Policy Center, University of Colorado. Retrieved from http://nepc.colorado.edu/publication/research-reading


Author information

Correspondence to Robert M. Bernard.

Appendix: The Meta-Analysis Methodological Quality Review Guide (MMQRG)

The 22 items below were used to evaluate the methodological quality of the candidate meta-analyses. Items 16, 19, and 20 concern the core statistical machinery of a meta-analysis; an illustrative sketch of those computations follows the list.

  1. Research question: Is the research objective and/or question clearly stated?

  2. Contextual positioning of the research problem: Is the rationale for the meta-analysis adequate, conceptually relevant, and supported by empirical evidence?

  3. Time frame: Is the time frame defined and adequately justified in the context of the research question and prior reviews?

  4. Experimental group: Is the experimental group clearly defined and described in detail (possibly with examples)?

  5. Control group: Is the control group clearly defined and described in detail (possibly with examples)?

  6. Outcomes: Are the measures of the identified outcome(s) appropriate, relevant to the research question, and sufficiently described?

  7. Inclusion criteria: Are the inclusion criteria clearly stated and described in detail (possibly supported by examples from the reviewed literature)?

  8. Targeted literature: Does the targeted literature search cover all types of published and unpublished literature exhaustively?

  9. Resources used: Are the resources used to identify relevant literature representative of the field and exhaustive (i.e., do they include multiple electronic databases, hand searches, branching, etc.)?

  10. Search strategy: Is the list of search terms provided and appropriate for each individual source (e.g., modifying keywords for specific databases)?

  11. Article review: Is the article review process implemented by two or more researchers with a reasonable level of inter-rater reliability?

  12. Effect size extraction: Is the effect size extraction process implemented by two or more researchers with a reasonable level of inter-rater reliability?

  13. Study feature coding: Is the study feature coding process implemented by two or more researchers with a reasonable level of inter-rater reliability?

  14. Validity of included studies: Are all aspects of validity explicitly defined, discussed, and consistently addressed across studies?

  15. Independence of data: Is the issue of dependency among included studies addressed, with the method(s) for assuring data independence appropriate and adequately described?

  16. Effect size metrics and extraction procedures: Are the effect size metrics and extraction procedures used appropriate and fully described, including any necessary transformations?

  17. Publication bias: Are procedures for addressing publication bias adequately substantiated and reported?

  18. Treatment of outliers: Are criteria and procedures for identifying and treating outliers adequately substantiated and reported?

  19. Overall analyses: Is the overall analysis performed according to standard procedures (e.g., correct model use, homogeneity assessed, standard errors reported, confidence intervals reported)?

  20. Moderator variable analyses: Are moderator variable analyses performed according to the proper analytical model, and is appropriate information reported (e.g., Q-between and other test statistics)?

  21. Reporting results: Are the appropriate statistics supplied for all analyses and explained in enough detail that the reader will understand the findings?

  22. Appropriate interpretation: Are the findings summarized and interpreted appropriately in relation to the research question?
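
To make the statistical checkpoints concrete, the minimal sketch below illustrates the kinds of computations items 16, 19, and 20 refer to: Hedges' g with its small-sample correction, an inverse-variance-weighted overall mean with a homogeneity test (Q-total), and a Q-between moderator test. The formulas follow Borenstein, Hedges, Higgins, and Rothstein (2009); the function names and all study data are invented for illustration, and the code is not the procedure used in the chapter or in any of the reviewed meta-analyses.

```python
# Illustrative meta-analytic computations (hypothetical data throughout).
import math
from scipy.stats import chi2  # chi-square survival function for p-values

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference with small-sample correction (item 16)."""
    df = n1 + n2 - 2
    sd_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / df)
    d = (m1 - m2) / sd_pooled
    j = 1 - 3 / (4 * df - 1)                       # Hedges' correction factor
    v = j**2 * ((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return j * d, v                                # (g, variance of g)

# Hypothetical summary statistics: (moderator level, M1, SD1, n1, M2, SD2, n2).
raw = [
    ("math",    82.0, 10.0, 30, 78.0, 10.5, 30),
    ("math",    75.5, 12.0, 25, 74.0, 11.0, 26),
    ("science", 68.0,  9.0, 40, 63.5,  9.5, 38),
    ("science", 55.0, 14.0, 22, 52.0, 13.0, 22),
]
studies = [(lvl, *hedges_g(m1, s1, n1, m2, s2, n2))
           for lvl, m1, s1, n1, m2, s2, n2 in raw]  # (level, g, variance)

def fixed_effect(entries):
    """Inverse-variance weighted mean, its SE, and homogeneity Q (item 19)."""
    w = [1.0 / v for _, _, v in entries]
    g = [gi for _, gi, _ in entries]
    mean = sum(wi * gi for wi, gi in zip(w, g)) / sum(w)
    se = math.sqrt(1.0 / sum(w))
    q = sum(wi * (gi - mean) ** 2 for wi, gi in zip(w, g))
    return mean, se, q

# Overall analysis: weighted mean effect, 95% CI, homogeneity test.
mean, se, q_total = fixed_effect(studies)
df_total = len(studies) - 1
print(f"g+ = {mean:.3f}, 95% CI [{mean - 1.96 * se:.3f}, {mean + 1.96 * se:.3f}]")
print(f"Q-total = {q_total:.2f}, df = {df_total}, p = {chi2.sf(q_total, df_total):.3f}")

# Moderator analysis (item 20): Q-between = Q-total minus pooled within-group Q.
levels = sorted({lvl for lvl, _, _ in studies})
q_within = sum(fixed_effect([s for s in studies if s[0] == lvl])[2] for lvl in levels)
q_between = q_total - q_within
df_between = len(levels) - 1
print(f"Q-between = {q_between:.2f}, df = {df_between}, p = {chi2.sf(q_between, df_between):.3f}")
```

A random-effects analysis would additionally estimate the between-study variance (tau-squared) and fold it into the weights; the fixed-effect version is shown here only because it keeps the Q partitioning transparent.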


Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this entry


Cite this entry

Bernard, R.M., Borokhovski, E., Schmid, R.F., Tamim, R.M. (2018). Gauging the Effectiveness of Educational Technology Integration in Education: What the Best-Quality Meta-Analyses Tell Us. In: Spector, M., Lockee, B., Childress, M. (eds) Learning, Design, and Technology. Springer, Cham. https://doi.org/10.1007/978-3-319-17727-4_109-2

  • DOI: https://doi.org/10.1007/978-3-319-17727-4_109-2

  • Published: 14 June 2018

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-17727-4

  • Online ISBN: 978-3-319-17727-4

  • eBook Packages: Springer Reference Education, Reference Module Humanities and Social Sciences, Reference Module Education


Chapter history

  1. Latest

    Gauging the Effectiveness of Educational Technology Integration in Education: What the Best-Quality Meta-Analyses Tell Us
    Published:
    14 June 2018

    DOI: https://doi.org/10.1007/978-3-319-17727-4_109-2

  2. Original

    Gauging the Effectiveness of Educational Technology Integration in Education: What the Best-Quality Meta-Analyses Tell Us
    Published:
    19 April 2018

    DOI: https://doi.org/10.1007/978-3-319-17727-4_109-1