Abstract
Based on their experiences working with two national initiatives designed to reform educational practice in the U.S., the authors present seven guiding principles of evidence-based/informed educational policy and research, laying the foundation for rigorous and comprehensive judgments about which evidence and scientific research designs should be taken into account when scaling up educational reforms to serve the public good. The authors further provide case examples from the U.S. with clear potential to both utilize and generate evidence in the public interest, including educational research studies that seek to support underrepresented groups in preparing for and achieving successful transitions to postsecondary education and careers, in STEM and other fields. The authors conclude that educational researchers have a critical role to play in providing decision-makers with the tools to judge evidence in the service of the public good.
Mistaking no answers in practice for no answers in principle is a great source of moral confusion – Sam Harris
Notes
- 1.
American Recovery and Reinvestment Act (Pub. L. 111–5); Gates Foundation: http://www.gatesfoundation.org/united-states/Pages/measures-of-effective-teaching-fact-sheet.aspx
- 2.
Results of the 2009 NAEP for U.S. high school seniors found no significant changes in the gap between white and black students’ reading scores from 1992 to 2009, and no significant change between white and black or Hispanic students’ mathematics scores from 2005 to 2009 (NCES 2011).
- 3.
KIPP (http://www.kipp.org/) is “based around high expectations for student achievement; commitment to a college preparatory education by students, parents, and faculty; devotion of time to both educational and extracurricular activities; increased leadership power of school principals; and a focus on results through regular student assessments” (U.S. Department of Education, Institute of Education Sciences, What Works Clearinghouse 2010). Urban Prep is a Chicago-based initiative operating in the only all-male public schools in the state of Illinois to “provide a comprehensive, high-quality college preparatory education that results in graduates succeeding in college” (see http://www.urbanprep.org/about/history/index.asp).
- 4.
See, Dynarski and Scott-Clayton (2007) and Hoxby (2007). Other examples of online resources on the college selection and application processes in the U.S. include the National Center for Education Statistics College Navigator (http://nces.ed.gov/collegenavigator) and the American Council on Education, Lumina Foundation for Education, and Ad Council’s KnowHow2GO (http://www.knowhow2go.org/).
- 5.
See the Success for All Foundation’s ‘Our Story’, retrieved February 22, 2011 from http://www.successforall.org/About/story.html
- 6.
The What Works Clearinghouse is an initiative of the U.S. Department of Education’s Institute of Education Sciences which ‘develops and implements standards for reviewing and synthesizing education research’ (http://ies.ed.gov/ncee/wwc/aboutus). The Campbell Collaboration is an ‘international research network that produces systematic reviews of the effects of social interventions’ (http://www.campbellcollaboration.org/aboutus/index.php). The Society for Research on Educational Effectiveness seeks to advance and disseminate research on the causal effects of education interventions, programs, and policy (http://www.sree.org/pages/mission.php).
- 7.
Anderson’s original Adaptive Control of Thought (ACT) theory of human cognition was first described in Anderson (1976), elaborated in 1983, and refined in 1993 into the ACT-R (Adaptive Control of Thought-Rational) theory for understanding and simulating cognition, which is the foundation of the Cognitive Tutor software.
- 8.
For additional information see Ritter et al. (2007a, b). For a review of this study, see the WWC July 2009 Intervention Report on the Cognitive Tutor® Algebra I available online at http://ies.ed.gov/ncee/wwc/pdf/wwccogtutor072809.pdf
- 9.
For additional information on BioKIDS see the project’s web site at http://www.biokids.umich.edu/
- 10.
The Principled Assessment Designs for Inquiry (PADI) project builds on developments in measurement theory, technology, cognitive psychology, and science inquiry, implementing the evidence-centered assessment design (ECD) framework (see http://padi.sri.com). For additional information on the BioKIDS/PADI collaboration and details of the assessment system, see Songer et al. (2009), and Gotwals and Songer (2006).
- 11.
For additional information on the TPRI see Foorman et al. (1998) and Foorman et al. (2007); and the web site at http://www.childrenslearninginstitute.org/ourprograms/program-overview/TPRI/. For information on FAIR see Foorman and Petscher (2010) and Foorman et al. (2009); and the web site at http://www.fcrr.org/fair/index.shtm
- 12.
For a complete listing of current research projects being conducted by research faculty at the Florida Center for Reading Research, see http://www.fcrr.org/centerResearch/centerResearch.shtm
- 13.
For a detailed description of the Schools and Staffing Survey, including copies of instrumentation administered in 1987–1988, 1990–1991, 1993–1994, 1999–2000, 2003–2004, and 2007–2008, see the National Center for Education Statistics online at http://nces.ed.gov/surveys/sass/index.asp
- 14.
For information about the SimCalc intervention and the scaling-up SimCalc study, see the Kaput Center for Research and Innovation in STEM Education (http://www.kaputcenter.umassd.edu/projects/simcalc), the SRI International Scaling Up SimCalc project website (at http://math.sri.com/index.html), and Roschelle et al. (2010b).
- 15.
Specifically, the propensity score subclassification method and estimator introduced by O’Muircheartaigh and Hedges reduced “bias in the estimate of a population average treatment effect” and identified “the portion of a population for which an experiment can generalize with fewer costs in terms [of] bias, variance, and extrapolation” (Tipton 2011: 4).
- 16.
For additional information on the TEACH (Training Early Achievers for Careers in Health) Research program see http://chess.uchicago.edu/TEACH
- 17.
For additional information on the College Ambition Program and the NSF-supported Transforming Interests in STEM Careers (TISC) study evaluating its impacts see the program website at http://collegeambition.org/
References
American Educational Research Association (AERA). (2016). Retrieved from homepage: https://www.aera.org
American Recovery and Reinvestment Act of 2009 (ARRA). (2009, February 17). Pub. L. No. 111–5, 123 Stat. 115, 516.
BioKIDS: Kids’ Inquiry of Diverse Species. (2005). Retrieved from http://www.biokids.umich.edu
Bohrnstedt, G. W., & Stecher, B. M. (2002). What we have learned about class size reduction in California. Sacramento: California Department of Education.
Bohrnstedt, G., Stecher, B., & Wiley, E. (2000). The California class size reduction evaluation: Lessons learned. In How small classes help teachers do their best (pp. 201–226). Philadelphia: Temple University Center for Research in Human Development and Education.
Borman, G. D., Hewes, G. M., Overman, L. T., & Brown, S. (2003). Comprehensive school reform and achievement: A meta-analysis. Review of Educational Research, 73(2), 125–230. doi:10.3102/00346543073002125.
Borman, G., Slavin, R. E., Cheung, A., Chamberlain, A., Madden, N. A., & Chambers, B. (2007). Final reading outcomes of the national randomized field trial of success for all. American Educational Research Journal, 44(3), 701–731. doi:10.3102/0002831207306743.
Bryk, A. S. (2015). Accelerating how we learn to improve. Educational Researcher, 44(9), 467–478.
Campbell Collaboration: Vision, Mission, and Key Principles. (2016). Retrieved from https://www.campbellcollaboration.org/vision-mission-and-principle/explore/our-key-principles
Campuzano, L., Dynarski, M., Agodini, R., & Rall, K. (2009). Effectiveness of reading and mathematics software products: Findings from two student cohorts. NCEE 2009–4041. National Center for Education Evaluation and Regional Assistance.
Children’s Learning Institute (CLI) at The University of Texas-Houston Health Science Center and the Texas Institute for Measurement, Evaluation, and Statistics (TIMES) Technical Report TPRI (2010–2014 Edition). Retrieved from: http://tpri.org/resources/documents/20102014TechnicalReport.pdf
Children’s Learning Institute: TPRI Early Reading Assessment. (2015). Retrieved from https://www.childrenslearninginstitute.org/resources/tpri-early-reading-assessment
Chmielewski, A. K. (2014). An international comparison of achievement inequality in within-and between-school tracking systems. American Journal of Education, 120(3), 293–324.
College Ambition Program. (2016). Retrieved from homepage: http://collegeambition.org
Duncan, G. J., & Murnane, R. J. (Eds.). (2011). Whither opportunity?: Rising inequality, schools, and children’s life chances. New York: Russell Sage Foundation.
Dynarski, S., & Scott-Clayton, J. E. (2007). The feasibility of streamlining aid for college using the tax system. In National Tax Association papers and proceedings (vol. 99, pp. 250–262).
Every Student Succeeds Act of 2015, S. 1177, 114th Cong. (2015). Washington, DC: US Department of Education. Public Law 114–95.
Florida Center for Reading Research. (2008). Retrieved from homepage: http://www.fcrr.org/centerResearch/centerResearch.shtm
Foorman, B. R., & Petscher, Y. (2010). Development of spelling and differential relations to text reading in grades 3–12. Assessment for Effective Intervention, 36(1), 7–20.
Foorman, B. R., Fletcher, J. M., Frances, D. J., Carlson, C. D., Chen, D., & Mouzaki, A. (1998). Technical report: Texas primary reading inventory (1998th ed.). Houston: Center for Academic and Reading Skills and the University of Houston.
Foorman, B., Santi, K., & Berger, L. (2007). Scaling assessment-driven instruction using the internet and handheld computers. In B. Schneider & S. K. McDonald (Eds.), Scale-up in education (pp. 68–90). Plymouth: Rowman & Littlefield Publishers.
Foorman, B., Torgesen, J., Crawford, E., & Petscher, Y. (2009). Assessments to guide reading instruction in K-12: Decisions supported by the new Florida system. Perspectives on Language and Literacy, 35(5), 13–19.
Gates Foundation. (2016). Retrieved from homepage: http://www.gatesfoundation.org
Gotwals, A. W., & Songer, N. B. (2006). Measuring students’ scientific content and inquiry reasoning. In Proceedings of the 7th international conference on learning sciences (pp. 196–202). International Society of the Learning Sciences.
Harris, S. (2016). Retrieved from homepage: https://www.samharris.org
Hedges, L. V. (1981). Distribution theory for Glass's estimator of effect size and related estimators. Journal of Educational and Behavioral Statistics, 6(2), 107–128.
Hedges, L. V. (2013). Recommendations for practice: Justifying claims of generalizability. Educational Psychology Review, 25(3), 331–337.
Holland, P. W. (1986). Statistics and causal inference. Journal of the American Statistical Association, 81(396), 945–960.
Hoxby, C. M. (Ed.). (2007). College choices: The economics of where to go, when to go, and how to pay for it. Chicago: University of Chicago Press.
Imbens, G. W., & Rubin, D. B. (2010). Rubin causal model. In S. N. Durlauf & L. E. Blume (Eds.), Microeconometrics (pp. 229–241). New York: Macmillan.
Ingersoll, R. M. (1998). The problem of out-of-field teaching. The Phi Delta Kappan, 79(10), 773–776.
Ingersoll, R. M. (1999). The problem of underqualified teachers in American secondary schools. Educational Researcher, 28(2), 26–37.
Ingersoll, R. M. (2004). Why do high-poverty schools have difficulty staffing their classrooms with qualified teachers? Center for American Progress, Institute for America’s Future.
Ingersoll, R. M., Han, M., & Bobbitt, S. (1995). Teacher supply, teacher qualifications, and teacher turnover: 1990–1991 (pp. 95–744). Washington, DC: U.S. Department of Education, Office of Educational Research and Improvement, National Center for Education Statistics, NCES.
Kaput Center for Research and Innovation in STEM Education. (2016). University of Massachusetts, Dartmouth. Retrieved from: http://www.kaputcenter.umassd.edu/projects/simcalc
KIPP: About KIPP. (2016). Retrieved from homepage: http://www.kipp.org
KnowHow2Go. (2013). Retrieved from homepage: http://www.knowhow2go.org
Milesi, C., Brown, K., Hawkley, L., Dropkin, E., & Schneider, B. (2014). Charting the impact of federal spending for education research: A bibliometric approach. Educational Researcher, 43(7), 361–370. doi:10.3102/0013189X14554002.
National Center for Education Statistics. (2011a). College Navigator. Retrieved from http://nces.ed.gov/collegenavigator
National Center for Education Statistics. (2011b). The nation’s report card: Reading 2011 (NCES 2012–457). Washington, DC: Institute of Education Sciences, U.S. Department of Education.
National Center for Education Statistics. (2016). Schools and Staffing Survey (SASS). Washington, DC: Institute of Education Sciences, U.S. Department of Education. Retrieved from: http://nces.ed.gov/surveys/sass/index.asp.
National Research Council. (2002). Scientific research in education. Committee on Scientific Principles for Education Research, R. J. Shavelson & L. Towne (Eds.). Center for Education, Division of Behavioral and Social Sciences and Education. Washington, DC: National Academy Press.
National Science Foundation. (2010a). Preparing the next generation of stem innovators: Identifying and developing our nation’s human capital. National Science Foundation. Retrieved from: https://www.nsf.gov/nsb/publications/2010/nsb1033.pdf
National Science Foundation. (2010b). Research and Evaluation on Education in Science and Engineering (REESE). Program Solicitation. Retrieved from: http://www.nsf.gov/pubs/2010/nsf10586/nsf10586.pdf
No Child Left Behind Act of 2001, H.R. 1, 107th Cong. (2002). Washington, DC: US Department of Education. Public Law 107–110.
O’Muircheartaigh, C., & Hedges, L. V. (2014). Generalizing from unrepresentative experiments: A stratified propensity score approach. Journal of the Royal Statistical Society, Series C, 63(2), 195–210. doi:10.1111/rssc.12037.
Organization for Economic Co-operation and Development/European Union (OECD/EU). (2015). Indicators of immigrant integration 2015: Settling in. Paris: OECD Publishing. doi:10.1787/9789264234024-en.
Pellegrino, J. W., Chudowsky, N., & Glaser, R. (Eds.). (2001). Knowing what students know: The science and design of educational assessment. Washington, DC: National Academies Press.
Pellegrino, J. W., Wilson, M. R., Koenig, J. A., & Beatty, A. S. (Eds.). (2014). Developing assessments for the next generation science standards. Washington, DC: National Academies Press.
Principled Assessment Designs for Inquiry. (2003). Retrieved from homepage: http://padi.sri.com
Quint, J., Zhu, P., Balu, R., Rappaport, S., & DeLaurentis, M. (2015). Scaling up the success for all model of school reform: Final report from the investing in innovation (i3) evaluation. New York: MDRC.
Reardon, S. F. (2011). The widening academic achievement gap between the rich and the poor: New evidence and possible explanations. In R. Murnane, & G. Duncan (Eds.), Whither opportunity? Rising inequality and the uncertain life chances of low-income children. New York: Russell Sage Foundation.
Ritter, S., Anderson, J. R., Koedinger, K. R., & Corbett, A. (2007a). Cognitive Tutor: Applied research in mathematics education. Psychonomic Bulletin & Review, 14(2), 249–255.
Ritter, S., Kulikowich, J., Lei, P. W., McGuire, C. L., & Morgan, P. (2007b). What evidence matters? A randomized field trial of Cognitive Tutor Algebra I. Frontiers in Artificial Intelligence and Applications, 162, 13.
Roschelle, J., Tatar, D., Shechtman, N., Hegedus, S., Hopkins, B., Knudsen, J., & Stroter, A. (2007). Can a technology-enhanced curriculum improve student learning of important mathematics? Results from 7th grade, year 1.
Roschelle, J., Shechtman, N., Tatar, D., Hegedus, S., Hopkins, B., Empson, S., Knudsen, J., & Gallagher, L. P. (2010a). Integration of technology, curriculum, and professional development for advancing middle school mathematics: Three large-scale studies. American Educational Research Journal, 47(4), 833–878.
Roschelle, J., Tatar, D., Hedges, L., & Shechtman, N. (2010b). Two perspectives on the generalizability of lessons from scaling up SimCalc. Society for Research on Educational Effectiveness.
Rowan, B., Correnti, R., Miller, R., & Camburn, E. (2009). School improvement by design: Lessons from a study of comprehensive school reform programs. Consortium for Policy Research in Education, 1–62. doi:10.12698/cpre.2009.sii.
Rubin, D. B. (2005). Bayesian inference for causal effects. In C. R. Rao & D. K. Dey (Eds.), Handbook of statistics, volume 25: Bayesian thinking: Modeling and computation (pp. 1–16). Amsterdam: Elsevier.
Schanzenbach, D. W. (2006). What have researchers learned from Project STAR? Brookings Papers on Education Policy, 9, 205–228.
Schneider, B. (2015). 2014 AERA Presidential Address, The College Ambition Program: A realistic transition strategy for traditionally disadvantaged students. Educational Researcher, 44(7), 394–403.
Schneider, B., & McDonald, S. K. (2007). Scale-up in practice: An introduction. In B. Schneider & S. K. McDonald (Eds.), Scale-up in education: Vol. 2: Issues in practice (pp. 1–12). Lanham: Rowman & Littlefield.
Schneider, B., & Stevenson, D. (1999). The ambitious generation: America’s teenagers, motivated but directionless. New Haven: Yale University Press.
SimCalc, the mathematics of change. (2011). Retrieved from: http://math.sri.com/index.html
Society for Research on Educational Effectiveness (SREE): Mission. (2010). Retrieved from https://www.sree.org/pages/mission.php
Songer, N. B., Myers, P., & Gotwals, A. W. (2007). DeepThink: Fostering and measuring learning progressions focused on deep thinking about biodiversity. Poster presented at the Principal Investigators Meeting of the National Science Foundation, Washington, DC.
Songer, N. B., Kelcey, B., & Gotwals, A. W. (2009). How and when does complex reasoning occur? Empirically driven development of a learning progression focused on complex reasoning about biodiversity. Journal of Research in Science Teaching, 46(6), 610–631.
Stevenson, D. L. (2000). The fit and misfit of sociological research and educational policy. In M. T. Hallinan (Ed.), Handbook of the sociology of education (pp. 547–563). New York: Springer US.
Success for All Foundation: Our Story. (2005). Retrieved from http://www.successforall.org/who-we-are
The Center for Health and the Social Sciences. (2016). High school students: Training early achievers for careers in health (TEACH). The University of Chicago. Retrieved from: http://chess.uchicago.edu/TEACH
Tipton, E. (2011). Improving the external validity of randomized experiments using propensity score subclassification. Working Paper.
Tipton, E. (2014). How generalizable is your experiment? An index for comparing experimental samples and populations. Journal of Educational and Behavioral Statistics, 39(6), 478–501.
Tipton, E., Hedges, L. V., Borman, G., Vaden-Kiernan, M., Caverly, S., & Sullivan, K. (2014). Sample selection in randomized experiments: A new method using propensity score stratified sampling. Journal of Research on Educational Effectiveness, 7(1), 114–135. doi:10.1080/19345747.2013.831154.
U.K. House of Commons. (2006). Science and Technology Committee: Scientific advice, risk and evidence based policy making (Vol. 1). London: House of Commons.
U.S. Department of Labor. (2011). A profile of the working poor, 2009. U.S. Department of Labor, U.S. Bureau of Labor Statistics. March 2011. Retrieved from: http://www.bls.gov/opub/reports/working-poor/archive/workingpoor_2009.pdf
Urban Prep Academies: History. (2012). Retrieved from: http://www.urbanprep.org/about/history-creed
Walters, P. B., Lareau, A., & Ranis, S. (Eds.). (2008). Education research on trial. Taylor & Francis.
Weiss, C. H. (1982). Policy research in the context of diffuse decision making. The Journal of Higher Education, 53, 619–639.
Weiss, C. H. (1989). Congressional committees as users of analysis. Journal of Policy Analysis and Management, 8(3), 411–431.
What Works Clearinghouse. (2009). Intervention report: Cognitive tutor algebra I. Retrieved from https://www.mbaea.org/documents/filelibrary/pdf/cognitive_tutor/WWC_CogTutor_Report_July2009_B2A3C279D0481.pdf
What Works Clearinghouse. (2010). What works clearinghouse: Quick review of the report “Student Characteristics and Achievement in 22 KIPP Middle Schools.” U.S. Department of Education, Institute of Education Sciences. Retrieved from: http://ies.ed.gov/ncee/wwc/Docs/QuickReview/kipp_092110.pdf
What Works Clearinghouse. (2016). What we do. Retrieved from: http://ies.ed.gov/ncee/wwc/WhatWeDo
Word, E., Johnston, J., Bain, H., Fulton, B. D., Zaharias, J. B., Achilles, C. M., Lintz, M. N., Folger, J., & Breda, C. (1990). The State of Tennessee’s student/teacher achievement ratio (STAR) project. Tennessee Board of Education.
World Education Research Association (WERA). (2016). Retrieved from homepage: https://www.wera.org
Yamada, H., & Bryk, A. S. (2016). Assessing the first two years’ effectiveness of Statway®: A multilevel model with propensity score matching. Community College Review. doi:10.1177/0091552116643162.
Acknowledgement
This material is based upon work supported by the National Science Foundation under awards: No. DRL-131672 (CAP), No. OISE-1545684 (PIRE), and No. DRL-0815295 (ARC).
Copyright information
© 2017 Springer International Publishing AG
Cite this chapter
McDonald, S.K., Schneider, B. (2017). Guiding Principles for Evaluating Evidence in Education Research. In: Eryaman, M., Schneider, B. (eds) Evidence and Public Good in Educational Policy, Research and Practice. Educational Governance Research, vol 6. Springer, Cham. https://doi.org/10.1007/978-3-319-58850-6_10
DOI: https://doi.org/10.1007/978-3-319-58850-6_10
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-58849-0
Online ISBN: 978-3-319-58850-6
eBook Packages: Education, Education (R0)