Volume 36, Issue 4, pp. 497–516

Impact of SES on Estonian Students’ Science Achievement Across Different Cognitive Domains

  • Kristi Mere
  • Priit Reiska
  • Thomas M. Smith
School Quality and Equity in Central and Eastern Europe





Authors and Affiliations

  1. Chief Inspector of the Monitoring Department, Ministry of Education and Research, Tartu, Estonia
  2. Faculty of Educational Sciences, Tallinn Pedagogical University, Tallinn, Estonia
  3. Department of Leadership, Policy, & Organizations, #514 Peabody College, 230 Appleton Place, Vanderbilt University, Nashville, USA
