Meta-analytic Decisions and Reliability: A Serendipitous Case of Three Independent Telecommuting Meta-analyses
Despite the potential for researcher decisions to undermine the reliability of meta-analysis, few methodological studies have examined this possibility. The present study compared three independent, concurrent telecommuting meta-analyses to determine how researcher decisions affected the process and findings of these studies.
A case study methodology was used, in which three recent telecommuting meta-analyses were re-examined and compared using the process model developed by Wanous et al. (J Appl Psychol 74:259–264, 1989).
Results demonstrated important ways in which researcher decisions converged and diverged at stages of the meta-analytic process. The influence of researcher divergence on meta-analytic findings was neither evident in all cases, nor straightforward. Most notably, the overall effects of telecommuting across a range of employee outcomes were generally consistent across the meta-analyses, despite substantial differences in meta-analytic samples.
Results suggest that the effect of researcher decisions on meta-analytic findings may be largely indirect, such as when early decisions guide the specific moderation tests that can be undertaken at later stages. However, directly comparable “main effect” findings appeared to be more robust to divergence in researcher decisions. These results provide tentative positive evidence regarding the reliability of meta-analytic methods and suggest targeted areas for future methodological studies.
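To make the "main effect" comparison concrete, the sketch below pools a sample-size-weighted mean correlation in the bare-bones Hunter and Schmidt (2004) style for three partially overlapping meta-analytic samples. All study values are hypothetical, invented solely to illustrate how overall estimates can remain similar even when the underlying samples differ; they are not data from the three telecommuting meta-analyses.

```python
# Illustrative sketch (hypothetical numbers): sample-size-weighted mean
# correlations, in the bare-bones Hunter & Schmidt style, for three
# meta-analytic samples that only partially overlap.

def weighted_mean_r(effects):
    """Sample-size-weighted mean correlation over (r, N) pairs."""
    total_n = sum(n for _, n in effects)
    return sum(r * n for r, n in effects) / total_n

# Hypothetical (r, N) pairs standing in for the primary studies each
# meta-analysis happened to include.
meta_a = [(0.22, 120), (0.15, 300), (0.30, 80)]
meta_b = [(0.22, 120), (0.18, 250)]
meta_c = [(0.15, 300), (0.30, 80), (0.20, 150)]

for label, sample in [("A", meta_a), ("B", meta_b), ("C", meta_c)]:
    print(f"Meta-analysis {label}: mean r = {weighted_mean_r(sample):.3f}")
```

Even with different inclusion decisions, the pooled estimates here land close together, which mirrors the pattern of convergent overall effects described above.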
This study presents unique insight into a methodological issue that has not received adequate research attention, yet has potential implications for the reliability and validity of meta-analysis as a method.
Keywords: Meta-analysis · Methodological replication · Reliability · Validity · Telecommuting
We would like to thank Boris Baltes and Christopher Berry for their constructive comments on a previous draft of this manuscript.