
Review of Research into Misconceptions and Misunderstandings in Physics and Mathematics

  • Teresa Neidorf
  • Alka Arora
  • Ebru Erberber
  • Yemurai Tsokodayi
  • Thanh Mai
Open Access Chapter
Part of the IEA Research for Education book series (IEAR, volume 9)

Abstract

Many diagnostic methods have been used to analyze data from large-scale assessments such as the Trends in International Mathematics and Science Study (TIMSS), and the prior research on misconceptions and misunderstandings in physics and mathematics is extensive. This literature review provides an overview of different diagnostic models that have been used to explore student attributes and misconceptions in mathematics and science and how they compare to the methodology used in this study. A comprehensive review of prior research into student misconceptions and misunderstandings in physics related to gravitational force and in mathematics related to linear equations connects the established literature to the current study.

Keywords

Diagnostic models · Errors · Gravity · International large-scale assessment · Linear equations · Mathematics · Misconceptions · Physics · Science · Student achievement · Trend analysis · Trends in International Mathematics and Science Study (TIMSS)

2.1 Introduction

When measuring student achievement, traditional methods of analysis often focus on what students know (i.e., the correct answers). For example, large-scale assessments such as IEA’s TIMSS use unidimensional models such as item response theory (IRT) to measure individual students’ latent abilities, skills, and knowledge. Recent research using multidimensional models has begun to consider both correct and incorrect patterns when measuring and reporting on specific skills/abilities and misconceptions. Prior research has highlighted the importance of identifying and understanding student misconceptions to improve learning in both physics and mathematics.

We divide the literature review into three sections. The first section reviews the variety of diagnostic models that have been used to explore student attributes and misconceptions, misunderstandings, and errors in mathematics and science. The second and third sections explore prior research into student misconceptions, misunderstandings, and errors in physics related to gravitational force, and in mathematics related to linear equations, respectively. Both sections also look at gender differences in the prevalence of misconceptions.

2.2 Diagnostic Models Overview

Traditional psychometric models used for test analysis, such as IRT models, often focus on measuring a single latent continuum representing overall ability (Bradshaw and Templin 2014). Although these models are considered an important means of assessing student knowledge, their focus on measuring one underlying student ability is limiting. De la Torre and Minchen (2014) noted that the unidimensional nature of these methods made them less effective as diagnostic models. The need for models that would provide diagnostic information spurred the development of a new class of test models known as cognitive diagnostic models (CDMs).

A CDM is a type of model that classifies different combinations of mastered student attributes into different latent classes. It then determines students’ abilities based on the various skills or attributes that students have or have not mastered (de la Torre and Minchen 2014; Henson et al. 2009). An example of a CDM is the diagnostic classification model (DCM), which uses distractor-driven tests (designed to measure both “desirable and problematic aspects of student reasoning”) or multiple-choice tests that measure multidimensional attributes (Shear and Roussos 2017). In addition to the DCM, there are many other types of CDMs, such as the rule space model (Tatsuoka 1983), the deterministic input, noisy “and” gate (DINA) model (Junker and Sijtsma 2001), the noisy input, deterministic “and” gate (NIDA) model (Maris 1999), and the reparametrized unified model (RUM) (Roussos et al. 2007). Each of these models varies in its complexity, the parameters it assigns to each item, and the assumptions it makes about how random noise enters the test-taking process (Huebner and Wang 2011). The varied and multidimensional nature of CDMs makes them better suited to performing educational diagnoses. In fact, a recent study by Yamaguchi and Okada (2018) using TIMSS 2007 mathematics data found that CDMs had a better fit than IRT models.
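To make the DINA model concrete, the following is a minimal sketch of how it maps attribute mastery to correct-response probabilities. The Q-matrix, attribute names, and slip/guess parameters here are invented for illustration, not drawn from any real assessment.

```python
import numpy as np

# Hypothetical Q-matrix: rows = items, columns = attributes (1 = item requires it).
Q = np.array([
    [1, 0],   # item 1 requires attribute A only
    [0, 1],   # item 2 requires attribute B only
    [1, 1],   # item 3 requires both attributes
])

slip = np.array([0.1, 0.1, 0.2])   # P(incorrect despite full mastery), per item
guess = np.array([0.2, 0.2, 0.1])  # P(correct without full mastery), per item

def dina_correct_prob(alpha):
    """P(correct response) for a student with attribute-mastery vector alpha."""
    # eta_j = 1 iff the student has mastered every attribute item j requires
    eta = np.all(alpha >= Q, axis=1).astype(float)
    return (1 - slip) ** eta * guess ** (1 - eta)

# A student who has mastered attribute A but not B:
print(dina_correct_prob(np.array([1, 0])))  # → [0.9 0.2 0.1]
```

The "deterministic and" in the model's name is the `eta` step: an item is answerable only if all of its required attributes are mastered; the slip and guess parameters then add the noise.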

A relatively new approach, the scaling individuals and classifying misconceptions (SICM) model, investigated by Bradshaw and Templin (2014), combines the IRT model and the DCM to provide a statistical tool to measure misconceptions. The SICM model uses data on wrong answers by modeling categorical latent variables that represent “misconceptions” instead of skills. To categorize misconceptions, the authors cited inventories such as the force concept inventory (Hestenes et al. 1992), an assessment of the Newtonian concept of force.

For large-scale assessments such as TIMSS, applying these diagnostic models can be difficult: the TIMSS assessments were not designed as cognitive diagnostic assessments that measure specific components of skills/abilities, nor were they designed using a CDM with pre-defined attributes (de la Torre and Minchen 2014; Leighton and Gierl 2007). However, some studies have shown that applying these approaches to TIMSS data can provide valuable information about test takers. Dogan and Tatsuoka (2008) used the rule space model to evaluate the performance of Turkish students on the TIMSS 1999 grade eight mathematics assessment (also known as the Third International Mathematics and Science Study-Repeat, or TIMSS-R), determining that Turkish students demonstrated weaknesses in skills such as applying rules in algebra and quantitative reading. Choi et al. (2015) also used a CDM approach to compare performance on the TIMSS mathematics assessment between grade eight samples in the United States and Korea. While these studies showed that CDMs can offer valuable information on student concept mastery in TIMSS, they also acknowledged that there are limitations in applying these models to this assessment.

In general, CDMs and SICMs use best-fit models to predict student-level proficiency and misconceptions, and they are most efficient when used on computer adaptive tests (CATs), so that “all test takers can be measured with the same degree of precision” (Hsu et al. 2013). The TIMSS assessments, which are not designed for student-level reporting and are not computer-adaptive, are thus not well suited to CDMs and SICMs. Under the TIMSS assessment design, only a portion of the items are administered to each student; the claims that can be made about student proficiency on specific skills and concepts are therefore limited.1

In contrast to research using the types of diagnostic models described above, our study used a different diagnostic approach based on item-level performance data (i.e., frequency distributions across response categories) for individual assessment items to explore the nature and extent of students’ misconceptions, errors, and misunderstandings demonstrated by their incorrect responses. Other studies conducted by countries participating in TIMSS have taken a similar approach to describing student understanding and misconceptions based on their responses to individual TIMSS and TIMSS Advanced mathematics and science items at different grade levels (Angell 2004; Juan et al. 2017; Mosimege et al. 2017; Prinsloo et al. 2017; Provasnik et al. 2019; Saputro et al. 2018; Văcăreţu, n.d.; Yung 2006). For example, Angell (2004) analyzed student performance on TIMSS Advanced 1995 physics items in Norway; a series of diagnostic reports published in South Africa used item-level data from TIMSS 2015 to describe performance of their students in mathematics for grade five (Juan et al. 2017) and grade nine (Mosimege et al. 2017), and in science for grade nine (Prinsloo et al. 2017); and Saputro et al. (2018) used performance on algebra items from TIMSS 2011 to understand the types of errors made by students in Indonesia. All of these reports presented released items from TIMSS and TIMSS Advanced and described common types of incorrect answers given by students on the assessments, finding that misconceptions were often context-dependent and could be missed in broader analyses.

Our study goes beyond looking at individual assessment items by focusing on sets of items that measure specific concepts of interest in physics and mathematics across grade levels (gravity and linear equations, in this case). Student performance on these items is used to report on patterns in misconceptions across countries, grades, and assessment cycles, and by gender. Given the TIMSS assessment design, this approach has unique value: it uses item-level data to make country-level inferences and to better understand how student misconceptions have changed over time in different cultural contexts.

2.3 Misconceptions in Physics

Physics misconceptions (including those related to gravity) held by students of varying ages have been studied extensively. Previous research has included investigations of primary, secondary, and university students (Darling 2012; Demirci 2005; Hestenes et al. 1992; Pablico 2010; Piburn et al. 1988; Stein et al. 2008), as well as pre-service teachers (Gönen 2008). The literature about misconceptions related to gravitational force demonstrates that alternate conceptions of physical observations and processes based on intuition or preconceived notions are common and pervasive.

When analyzing misconceptions in physics, many researchers have focused on “common sense beliefs,” a “system of beliefs and intuitions about physical phenomena derived from extensive personal experience” that students may develop before they even enter the classroom (Halloun and Hestenes 1985a, b). Many of these beliefs are misconceptions inconsistent with scientific explanations provided during formal instruction; moreover, they are difficult to overcome and can inhibit students from understanding and applying more advanced physics concepts if not addressed early on. Numerous studies have been conducted to further explain these misunderstandings and several diagnostic tests have been developed to measure them, the most widely used being the force concept inventory, which uses multiple-choice items to track student misconceptions relating to “common sense beliefs” (Hestenes et al. 1992). Research has shown that many physics misconceptions are best overcome by focused instruction that actively aims to address these misconceptions (Eryilmaz 2002; Hestenes et al. 1992; Thornton et al. 2009).

Misconceptions based on common-sense beliefs tend to be incompatible with many physics concepts, such as Newton’s laws. For example, several studies have documented that students believe that there is always a force in the direction of motion and that this belief sometimes prevails even after college instruction (Clement 1982; Hestenes et al. 1992; Thornton and Sokoloff 1998). Another well-documented misconception is that it is not possible to have acceleration without velocity (Kim and Pak 2002; Reif and Allen 1992). These misconceptions often stem from students’ inability to distinguish between velocity, acceleration, and force (Reif and Allen 1992; Trowbridge and McDermott 1980). In particular, many students struggle with gravitational force. The concept appears to be poorly learned at the secondary level, with related misconceptions continuing in higher levels of education (Bar et al. 2016; Kavanagh and Sneider 2007).

In addition, many students’ conceptions of gravity are closely related to their conceptions of a spherical Earth (Gönen 2008; Nussbaum 1979; Sneider and Pulos 1983). In interviews with children in grades six and 10, asking which of a set of presented objects were acted on by gravity, Palmer (2001) found that fewer than 30% of students at each grade level were able to correctly answer that all of the objects were acted on by gravity. Some students, Palmer noted, also believed that buried objects (beneath the surface of Earth) were not subject to gravity.

Many of these misconceptions have been shown to be stable in the face of conventional physics instruction, preventing students from learning new concepts. One previous study on misconceptions about force and gravity investigated high school students’ conceptions about the direction of motion and force on a ball being thrown upward and then falling back down (Pablico 2010). The majority of students in the study (grades 9–12) demonstrated the misconception that the net force on the ball was always in the direction of motion throughout the ball’s path, not understanding that it is the constant downward force due to gravity that causes the observed changes in motion. Many students thought that the force was directed upward during the ball’s upward motion and that the force was zero when the ball was at the top of its flight (when it stops momentarily and changes direction). Although students identified the force as downward when the ball was traveling down, most were not able to correctly justify this answer, with many students believing that the force must be directed down since the ball is moving downward.
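The physics at issue can be made concrete with a short numerical sketch (the mass, initial speed, and time step are assumed values, not taken from the study): for a ball thrown straight up, the net force is the constant downward gravitational force at every instant, even as the velocity changes sign.

```python
def ball_flight(v0=10.0, g=9.8, m=0.5, dt=0.5, t_end=2.0):
    """Tabulate velocity and net force for a ball thrown straight up.
    All parameter values are illustrative assumptions."""
    rows = []
    t, v = 0.0, v0
    while t <= t_end + 1e-9:
        # The net force is the constant gravitational force m*g downward,
        # regardless of whether the ball is rising, momentarily at rest
        # at the top of its flight, or falling.
        rows.append((round(t, 1), round(v, 1), round(-m * g, 2)))
        v -= g * dt   # gravity decelerates the rise, then accelerates the fall
        t += dt
    return rows

for t, v, F in ball_flight():
    print(f"t={t:.1f}s  v={v:+5.1f} m/s  net force={F:+.2f} N")
```

The table this prints is the corrective picture: the velocity column flips from positive to negative, while the force column never changes and is never zero, contradicting the misconception that the force tracks the direction of motion.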

Other research has described instances of gender gaps in students’ understanding in physics. For example, at the beginning of physics courses, females tend to start with lower levels of conceptual understanding, and conventional instructional approaches are not effective in shrinking this gender gap (Cavallo et al. 2004; Docktor and Heller 2008; Hake 2002; Hazari et al. 2007; Kost et al. 2009).

2.4 Misunderstandings in Mathematics

In mathematics, algebra is often considered a gatekeeper to higher education and related career paths (Kilpatrick and Izsák 2008). Although algebraic understanding is considered crucial for student success in more advanced mathematics courses, many scholars have documented that students struggle with algebraic concepts, especially those relating to linear equations.

Solving linear equations requires a balance of conceptual knowledge and procedural skills. Conceptual knowledge involves having an understanding of principles and relationships, while procedural skills involve the ability to carry out a sequence of operations effectively (Gilmore et al. 2017). Unlike simpler arithmetic problems, solving linear equations involves much more than merely memorizing and applying a formula to solve an equation; it also includes understanding the relationship between the quantities represented. Conceptually, students need a deep understanding of independent and dependent variables to explain what slope or intercepts mean in a given situation (Kalchman and Koedinger 2005). Yet many students have shown a tendency to rely on procedural knowledge despite lacking a conceptual understanding of the equation (Caglayan and Olive 2010).
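A hypothetical everyday example (ours, not from the literature cited above) illustrates the conceptual reading of a linear equation that a purely procedural view misses: in y = mx + b, the slope is a rate of change and the intercept a starting value.

```python
# Assumed scenario: a taxi fare of $2.50 per mile on top of a $3.00 base fare.
m, b = 2.5, 3.0  # slope (dollars per mile), intercept (base fare in dollars)

def fare(miles):
    return m * miles + b

# Procedurally, fare(4) is just arithmetic. Conceptually, the slope says
# every extra mile adds $2.50, and the intercept is the cost of a
# zero-mile ride.
print(fare(0))            # → 3.0  (intercept: the starting value)
print(fare(4))            # → 13.0
print(fare(5) - fare(4))  # → 2.5  (slope: change in y per unit change in x)
```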

Stump (2001) argued that although high school pre-calculus students have been exposed to formal instruction, their conceptual understanding of “slope” is not well developed. When testing a group of high school students in her study, Stump found that many understood slope in functional situations but were unable to recognize it as a measure of rate of change or as a measure of steepness. Other researchers noted that students they interviewed, while developing an understanding of slope, were unable to recognize the difference between additive and multiplicative relationships (Simon and Blume 1994) or to understand ratio as a measure of slope (Swafford and Langrall 2000). This inability to develop conceptual knowledge of the relationship between variables has contributed to many misunderstandings related to slope and linear equations.

Lack of conceptual knowledge about the relationship between variables in linear equations also impacts a student’s ability to understand and translate the symbolic nature of linear equations. Official standards, such as those of the National Council of Teachers of Mathematics (NCTM), recommend that students be able to “represent and analyze relationships using tables, verbal rules, equations, and graphs” (NCTM 1989). Yet many students find it very difficult to represent equations graphically. Research suggests that this is because students tend to lack a strong understanding of the relationship between algebraic equations and graphical representations (Knuth 2000).

Even when using a graphical approach would ensure a higher likelihood of success, researchers have found that students were reluctant to use graphs (Knuth 2000; Tsamir and Almog 2001; Dyke and White 2004). For example, Knuth (2000) found that even when working on problems designed to encourage the use of graphical reasoning, students demonstrated a strong reliance on other solution methods and failed to use graphical-solution methods. In another study, Huntley et al. (2007) conducted clinical interviews of third-year high school mathematics students and found that many students needed to be prompted to use graphical solutions even when graphing was the most efficient method to solve the equation.
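The contrast between symbolic and graphical solution methods can be sketched with an assumed equation (ours, not one used in the cited studies): solving 2x + 1 = −x + 7 symbolically versus reading it as the point where the lines y = 2x + 1 and y = −x + 7 intersect.

```python
def f(x): return 2 * x + 1    # left-hand side as a line
def g(x): return -x + 7       # right-hand side as a line

# Symbolic method: 2x + 1 = -x + 7  =>  3x = 6  =>  x = 2
x = (7 - 1) / (2 - (-1))
print(x, f(x))  # → 2.0 5.0  (the lines cross at the point (2, 5))

# Graphical reasoning, mimicked numerically: scan a grid of x-values
# and pick the one where the two lines are closest together.
crossing = min(range(-10, 11), key=lambda t: abs(f(t) - g(t)))
print(crossing)  # → 2
```

Both routes give the same answer; the graphical route makes visible that "solving the equation" means locating an intersection, which is the Cartesian connection Knuth (2000) found students failing to use.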

This difficulty with modeling algebraic relationships graphically makes it difficult for students to translate real-life word problems into the appropriate algebraic equations (Adu et al. 2015; Bishop et al. 2008). Without focused and deliberate instruction, it is difficult for students to overcome these algebraic misunderstandings as they progress to higher levels of mathematics. As noted in the physics section, some research in this area has found gender differences: males made fewer mistakes than females, and made different types of mistakes, when solving problems related to multi-step linear equations in algebra (Powell 2013).

This report contributes to the literature on research into students’ misconceptions and misunderstandings in physics and mathematics by studying specific types of related misconceptions, errors, and misunderstandings about gravity and linear equations across grade levels and reporting patterns in these across countries and by gender. The results reinforce the importance of identifying and understanding students’ misconceptions, errors, and misunderstandings to determine what changes may be needed in the curricula through secondary school to improve student learning and to ensure their readiness for post-secondary education and/or future careers.

Footnotes

  1. TIMSS uses a matrix-sampling design whereby a student is administered only a sample of the assessment items; most items are missing by design for each student.

References

  1. Adu, E., Assuah, C. K., & Asiedu-Addo, S. K. (2015). Students’ errors in solving linear equation word problems: Case study of a Ghanaian senior high school. African Journal of Educational Studies in Mathematics and Sciences, 11, 17–30.
  2. Angell, C. (2004). Exploring students’ intuitive ideas based on physics items in TIMSS-1995. In C. Papanastasiou (Ed.), Proceedings of the IRC-2004 TIMSS. IEA International Research Conference (Vol. 2, pp. 108–123). Nicosia, Cyprus: University of Cyprus. Retrieved from https://www.iea.nl/sites/default/files/2019-03/IRC2004_Angell.pdf.
  3. Bar, V., Brosh, Y., & Sneider, C. (2016). Weight, mass, and gravity: Threshold concepts in learning science. Science Educator, 25(1), 22–34.
  4. Bishop, A., Filloy, E., & Puig, L. (2008). Educational algebra: A theoretical and empirical approach. Boston, MA, USA: Springer.
  5. Bradshaw, L., & Templin, J. (2014). Combining item response theory and diagnostic classification models: A psychometric model for scaling. Psychometrika, 79(3), 403–425.
  6. Caglayan, G., & Olive, J. (2010). Eighth grade students’ representations of linear equations based on a cups and tiles model. Educational Studies in Mathematics, 74(2), 143–162.
  7. Cavallo, A. M., Potter, W. H., & Rozman, M. (2004). Gender differences in learning constructs, shifts in learning constructs, and their relationship to course achievement in a structured inquiry, yearlong college physics course for life science majors. School Science and Mathematics, 104, 288–300.
  8. Choi, K. M., Lee, Y. S., & Park, Y. S. (2015). What CDM can tell about what students have learned: An analysis of TIMSS eighth grade mathematics. Eurasia Journal of Mathematics, Science and Technology Education, 11(6), 1563–1577.
  9. Clement, J. (1982). Students’ preconceptions in introductory mechanics. American Journal of Physics, 50(1), 66–71.
  10. Darling, G. (2012). How does force affect motion? Science and Children, 50(2), 50–53.
  11. De la Torre, J., & Minchen, N. (2014). Cognitively diagnostic assessments and the cognitive diagnosis model framework. Psicología Educativa, 20(2), 89–97.
  12. Demirci, N. (2005). A study about students’ misconceptions in force and motion concepts by incorporating a web-assisted physics program. Turkish Online Journal of Educational Technology, 4(3), 40–48.
  13. Docktor, J., & Heller, K. (2008). Gender differences in both force concept inventory and introductory physics performance. AIP Conference Proceedings, 1064, 15–18. Retrieved from https://doi.org/10.1063/1.3021243.
  14. Dogan, E., & Tatsuoka, K. (2008). An international comparison using a diagnostic testing model: Turkish students’ profile of mathematical skills on TIMSS-R. Educational Studies in Mathematics, 68(3), 263–272.
  15. Dyke, F. V., & White, A. (2004). Examining students’ reluctance to use graphs. Mathematics Teacher, 98(2), 110–117.
  16. Eryilmaz, A. (2002). Effects of conceptual assignments and conceptual change discussions on students’ misconceptions and achievement regarding force and motion. Journal of Research in Science Teaching, 39(1), 1001–1015.
  17. Gilmore, C., Keeble, S., Richardson, S., & Cragg, L. (2017). The interaction of procedural skill, conceptual understanding and working memory in early mathematics achievement. Journal of Numerical Cognition, 3(2), 400–416.
  18. Gönen, S. (2008). A study on student teachers’ misconceptions and scientifically acceptable conceptions about mass and gravity. Journal of Science Education and Technology, 17(1), 70–81.
  19. Hake, R. R. (2002). Relationship of individual student normalized learning gains in mechanics with gender, high-school physics, and pretest scores on mathematics and spatial visualization. Physics Education Research Conference, Boise, Idaho, USA, August 2002. Retrieved from https://www.researchgate.net/publication/237457456_Relationship_of_Individual_Student_Normalized_Learning_Gains_in_Mechanics_with_Gender_High-School_Physics_and_Pretest_Scores_on_Mathematics_and_Spatial_Visualization.
  20. Halloun, I. A., & Hestenes, D. (1985a). Common sense concepts about motion. American Journal of Physics, 53(11), 1056–1065.
  21. Halloun, I. A., & Hestenes, D. (1985b). The initial knowledge state of college physics students. American Journal of Physics, 53(11), 1043–1048.
  22. Hazari, Z., Tai, R. H., & Sadler, P. M. (2007). Gender differences in introductory university physics performance: The influence of high school physics preparation and affective factors. Science Education, 91, 847–876. Retrieved from https://onlinelibrary.wiley.com/doi/abs/10.1002/sce.20223.
  23. Henson, A. H., Templin, J. L., & Willse, J. T. (2009). Defining a family of cognitive diagnosis models using log-linear models with latent variables. Psychometrika, 74(2), 191–210.
  24. Hestenes, D., Wells, M., & Swackhamer, G. (1992). Force concept inventory. The Physics Teacher, 30(3), 141–158.
  25. Hsu, C. L., Wang, W. C., & Chen, S. Y. (2013). Variable-length computerized adaptive testing based on cognitive diagnosis models. Applied Psychological Measurement, 37(7), 563–582.
  26. Huebner, A., & Wang, C. (2011). A note on comparing examinee classification methods for cognitive diagnosis models. Educational and Psychological Measurement, 71(2), 407–419.
  27. Huntley, M. A., Marcus, R., Kahan, J., & Miller, J. L. (2007). Investigating high-school students’ reasoning strategies when they solve linear equations. Journal of Mathematical Behavior, 26, 115–139.
  28. Juan, A., Hannan, S., Zulu, N., Harvey, J. C., Prinsloo, C. H., Mosimege, M., & Beku, U. (2017). TIMSS item diagnostic report: South Africa: Grade 5 Numeracy. (Commissioned by the Department of Basic Education). South Africa: Human Sciences Research Council. Retrieved from http://ecommons.hsrc.ac.za/handle/20.500.11910/11447.
  29. Junker, B. W., & Sijtsma, K. (2001). Cognitive assessment models with few assumptions, and connections with nonparametric item response theory. Applied Psychological Measurement, 25(3), 258–272.
  30. Kalchman, M., & Koedinger, K. R. (2005). Teaching and learning functions. In M. S. Donovan, & J. D. Bransford (Eds.), How students learn: History, mathematics, and science in the classroom (pp. 351–393). Washington, DC: National Academies Press.
  31. Kavanagh, C., & Sneider, C. (2007). Learning about gravity I. Free fall: A guide for teachers and curriculum developers. Astronomy Education Review, 5(21), 21–52.
  32. Kilpatrick, J., & Izsák, A. (2008). A history of algebra in the school curriculum. Algebra and Algebraic Thinking in School Mathematics, 70, 3–18.
  33. Kim, E., & Pak, S. J. (2002). Students do not overcome conceptual difficulties after solving 1000 traditional problems. American Journal of Physics, 70(7), 759–765.
  34. Knuth, E. J. (2000). Student understanding of the Cartesian connection: An exploratory study. Journal for Research in Mathematics Education, 31(4), 500–508.
  35. Kost, L. E., Pollock, S. J., & Finkelstein, N. D. (2009). Unpacking gender differences in students’ perceived experiences in introductory physics. AIP Conference Proceedings, 1179, 177–180. Retrieved from https://doi.org/10.1063/1.3266708.
  36. Leighton, J. P., & Gierl, M. J. (2007). Defining and evaluating models of cognition used in educational measurement to make inferences about examinees’ thinking processes. Educational Measurement: Issues and Practice, 26(2), 3–16.
  37. Maris, E. (1999). Estimating multiple classification latent class models. Psychometrika, 64, 187–212.
  38. Mosimege, M., Beku, U., Juan, A., Hannan, S., Prinsloo, C. H., Harvey, J. C., & Zulu, N. (2017). TIMSS item diagnostic report: South Africa: Grade 9 Mathematics. (Commissioned by the Department of Basic Education). South Africa: Human Sciences Research Council. Retrieved from http://repository.hsrc.ac.za/handle/20.500.11910/11448.
  39. NCTM. (1989). Curriculum and evaluation standards for school mathematics. Reston, VA: National Council of Teachers of Mathematics.
  40. Nussbaum, J. (1979). Children’s conceptions of the earth as a cosmic body: A cross age study. Science Education, 63(1), 83–93.
  41. Pablico, J. R. (2010). Misconceptions on force and gravity among high school students. Master’s thesis 2462, Louisiana State University. Retrieved from https://digitalcommons.lsu.edu/gradschool_theses/2462/.
  42. Palmer, D. (2001). Students’ alternative conceptions and scientifically acceptable conceptions about gravity. International Journal of Science Education, 23(7), 691–706.
  43. Piburn, M. D., Baker, D. R., & Treagust, D. F. (1988). Misconceptions about gravity held by college students. Paper presented at the annual meeting of the National Association for Research in Science Teaching, Lake of the Ozarks, MO, USA, April 10–13, 1988. Retrieved from https://eric.ed.gov/?id=ED292616.
  44. Powell, A. N. (2013). A study of middle school and college students’ misconceptions about solving multi-step linear equations. Master’s thesis, State University of New York at Fredonia, NY, USA. Retrieved from http://hdl.handle.net/1951/58371.
  45. Prinsloo, C. H., Harvey, J. C., Mosimege, M., Beku, U., Juan, A., Hannan, S., & Zulu, N. (2017). TIMSS item diagnostic report: South Africa: Grade 9 Science. (Commissioned by the Department of Basic Education). South Africa: Human Sciences Research Council. Retrieved from http://www.hsrc.ac.za/en/research-data/ktree-doc/19286.
  46. Provasnik, S., Malley, L., Neidorf, T., Arora, A., Stephens, M., Balestreri, K., Perkins, R., & Tang, J. H. (2019, in press). U.S. performance on the 2015 TIMSS Advanced mathematics and physics assessments: A closer look (NCES 2017-020). Washington, DC: US Department of Education, National Center for Education Statistics.
  47. Reif, F., & Allen, S. (1992). Cognition for interpreting scientific concepts: A study of acceleration. Cognition and Instruction, 9(1), 1–44.
  48. Roussos, L. A., Templin, J. L., & Henson, R. A. (2007). Skills diagnosis using IRT-based latent class models. Journal of Educational Measurement, 44(4), 293–311.
  49. Saputro, B. A., Suryadi, D., Rosjanuardi, R., & Kartasasmita, B. G. (2018). Analysis of students’ errors in responding to TIMSS domain algebra problem. Journal of Physics: Conference Series, 1088, 012031. Retrieved from https://iopscience.iop.org/article/10.1088/1742-6596/1088/1/012031.
  50. Shear, B. R., & Roussos, L. A. (2017). Validating a distractor-driven geometry test using a generalized diagnostic classification model. In B. Zumbo, & A. Hubley (Eds.), Understanding and investigating response processes in validation research (Vol. 69, pp. 277–304). Cham, Switzerland: Springer.
  51. Simon, M. A., & Blume, G. W. (1994). Mathematical modeling as a component of understanding ratio-as-measure: A study of prospective elementary teachers. Journal of Mathematical Behavior, 13, 183–197.
  52. Sneider, G., & Pulos, S. (1983). Children’s cosmographies: Understanding the earth’s shape and gravity. Science Education, 67(2), 205–221.
  53. Stein, M., Larrabee, T. G., & Barman, C. R. (2008). A study of common beliefs and misconceptions in physical science. Journal of Elementary Science Education, 20(2), 1–11.
  54. Stump, S. L. (2001). High school precalculus students’ understanding of slope as measure. School Science and Mathematics, 101(2), 81–89.
  55. Swafford, J. O., & Langrall, C. W. (2000). Grade 6 students’ preinstructional use of equations to describe and represent problem situations. Journal for Research in Mathematics Education, 31(1), 89–112.
  56. Tatsuoka, K. K. (1983). Rule space: An approach for dealing with misconceptions based on item response theory. Journal of Educational Measurement, 20(4), 345–354.
  57. Thornton, R. K., Kuhl, D., Cummings, K., & Marx, J. (2009). Comparing the force and motion conceptual evaluation and the force concept inventory. Physical Review Special Topics: Physics Education Research, 5(1), 1–8.
  58. Thornton, R. K., & Sokoloff, D. R. (1998). Assessing student learning of Newton’s laws: The force and motion conceptual evaluation and the evaluation of active learning laboratory and lecture curricula. American Journal of Physics, 66(4), 338–352.
  59. Trowbridge, D. E., & McDermott, L. C. (1980). Investigation of student understanding of the concept of velocity in one dimension. American Journal of Physics, 48(12), 1020–1028.
  60. Tsamir, P., & Almog, N. (2001). Students’ strategies and difficulties: the case of algebraic inequalities. International Journal of Mathematical Education in Science and Technology, 32(4), 513–524.
  61. Văcăreţu, A. (n.d.). Using the TIMSS results for improving mathematics learning. Cluj-Napoca, Romania: Romanian Reading and Writing for Critical Thinking Association. Retrieved from http://directorymathsed.net/montenegro/Vacaretu.pdf.
  62. Yamaguchi, K., & Okada, K. (2018). Comparison among cognitive diagnostic models for the TIMSS 2007 fourth grade mathematics assessment. PLoS ONE, 13(2), e0188691.
  63. Yung, B. H. W. (Ed.). (2006). Learning from TIMSS: Implications for teaching and learning science at the junior secondary level. TIMSS HK IEA Centre. Hong Kong: Education and Manpower Bureau. Retrieved from https://cd1.edb.hkedcity.net/cd/science/is/timss/learningfromtimss.pdf.

Copyright information

© International Association for the Evaluation of Educational Achievement (IEA) 2020

Open Access This chapter is licensed under the terms of the Creative Commons Attribution-NonCommercial 4.0 International License (http://creativecommons.org/licenses/by-nc/4.0/), which permits any noncommercial use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.

The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.

Authors and Affiliations

  • Teresa Neidorf (Email author)
  • Alka Arora
  • Ebru Erberber
  • Yemurai Tsokodayi
  • Thanh Mai

  1. American Institutes for Research, Washington, USA
