5.1 Summary of the Findings

Although the association between family SES* and student achievement has been thoroughly investigated in previous research (see Chap. 2), the extent to which that association has changed in individual education systems over time is less well known. Improving achievement among disadvantaged students and narrowing the achievement gap between students from low- and high-SES* backgrounds are common policy goals for many education systems. However, the lack of quantifiable measures, especially ones that are easy to understand, makes it difficult to track and assess the effect of such efforts.

Twenty years of TIMSS data, from 1995 to 2015, provide researchers with a means to empirically address important research questions regarding changes in educational inequality over time. We used the TIMSS data to examine whether the inequality of educational outcomes due to SES* has changed for education systems over time and to investigate the extent to which disadvantaged students improved their academic performance over time in each education system.

Our first research question was: “How has the inequality of education outcomes due to family socioeconomic status changed for different education systems between 1995 and 2015?” We created a modified version of the TIMSS home educational resources (HER) index that was consistent over the 20-year period and used it to define low- and high-SES* groups of students. For each education system and assessment cycle, we calculated the achievement gap between students in the low- and high-SES* quartile groups. When we examined achievement gaps in either mathematics or science between 1995 and 2015, our results suggested that Hungary, Iran, Lithuania, and Singapore experienced a significantly widening gap between low- and high-SES* students, while Norway, Slovenia, and the United States saw a significantly narrowing gap. Other education systems showed significant changes in one of the two decades of TIMSS (1995–2003 and 2003–2015) but not in the other. For example, Australia experienced a significant decrease in the SES* achievement gap for science between 1995 and 2003, and then a significant increase between 2003 and 2015, resulting in an overall non-significant trend over the 20-year period. Similarly, New Zealand experienced a decrease in the SES* gaps in the first decade of TIMSS, followed by a significant increase in the second. There are many other examples where breaking the trends down into the 1995–2003 and 2003–2015 periods reveals countervailing patterns that warrant a closer look by researchers with a deep understanding of the local contexts.
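To make the gap calculation concrete, the following sketch (not the code used in this study) illustrates the core computation under simplifying assumptions: a single mathematics score per student, no sampling weights or plausible-value combination, and hypothetical column names.

import pandas as pd

def ses_achievement_gap(df: pd.DataFrame) -> pd.Series:
    """Mean score of the high-SES* quartile minus that of the low-SES* quartile,
    computed separately for each education system and TIMSS cycle."""
    def gap(group: pd.DataFrame) -> float:
        # Quartile cut-offs on the modified HER index are defined within each
        # education system and cycle, as in the study.
        q1, q3 = group["her_index"].quantile([0.25, 0.75])
        low = group.loc[group["her_index"] <= q1, "math_score"].mean()
        high = group.loc[group["her_index"] >= q3, "math_score"].mean()
        return high - low
    return df.groupby(["education_system", "cycle"]).apply(gap)

In the operational TIMSS analysis, the plausible values would be combined and replicate weights used to obtain standard errors; this sketch omits both for brevity.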

Our second research question was: “To what extent have education systems managed to increase the academic performance of disadvantaged students between 1995 and 2015?” To answer this, we calculated the percentage of low-SES* students who performed at or above the TIMSS intermediate benchmark in each education system over time. It was important to examine this question in conjunction with the first, because a headline reduction in inequality can also arise from stagnant scores among low-SES* students combined with declining scores among high-SES* students, both of which are undesirable. For example, in the Republic of Korea, the achievement gap in mathematics was 107 points in 2011 but declined to 84 points in 2015; this was due not to an improvement for low-SES* students but rather to a decline in the performance of high-SES* students (Fig. 4.13). In contrast, the United States showed a decreasing achievement gap for science between 1995 and 2015, which corresponded to a continuous improvement in the performance of its low-SES* students (Fig. 4.30). Ideally, education systems should strive for equality by improving the performance of all students while raising the achievement of low-SES* students at a faster rate so as to reduce gaps in achievement (Mullis et al. 2016).
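As an illustration of the benchmark indicator, the sketch below (again with hypothetical column names, and without the plausible-value and weighting machinery of the operational analysis) computes the share of low-SES* students reaching the TIMSS intermediate international benchmark of 475 score points.

import pandas as pd

INTERMEDIATE_BENCHMARK = 475  # TIMSS intermediate international benchmark

def pct_low_ses_at_benchmark(df: pd.DataFrame) -> pd.Series:
    """Percentage of low-SES* (bottom-quartile HER) students scoring at or
    above the intermediate benchmark, per education system and cycle."""
    def pct(group: pd.DataFrame) -> float:
        q1 = group["her_index"].quantile(0.25)
        low_ses = group[group["her_index"] <= q1]
        return 100 * (low_ses["math_score"] >= INTERMEDIATE_BENCHMARK).mean()
    return df.groupby(["education_system", "cycle"]).apply(pct)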

5.2 Relating the Findings to Country-Level Indicators in the Educational Systems and the Macroeconomic Context

To better understand our findings in the larger context in which education systems operate, we obtained macroeconomic and other indicators from the TIMSS encyclopedias, as well as data from external sources (see Table 3.6 for all sources). Our goal was to explore changes in country-level indicators over time and contrast them with changes in the SES* achievement gap (Table 5.1). A few tentative patterns emerged, which merit further investigation.

Table 5.1 Change in achievement gap and macroeconomic characteristics: 2003–2015

5.2.1 Tentative Pattern 1: Reductions in the Achievement Gap Tend to Accompany Improvements in Overall TIMSS Performance

In the second decade of TIMSS, we identified an inverse relationship between changes in the SES* achievement gaps and changes in the TIMSS national averages for both mathematics and science. This finding was consistent with previous literature using other cycles of TIMSS, which suggested a prominent inverse relation between the within-country dispersion of scores and the average level of scores by country (Freeman et al. 2010; Mullis et al. 2016). In other words, greater reductions in the achievement gap between low- and high-SES* students tended to accompany greater increases in overall TIMSS performance over the 2003–2015 period (see Figs. 5.1 and 5.2). This is not a trivial finding: as discussed previously, overall performance can grow continuously without any reduction in the SES* achievement gap if the low- and high-SES* groups improve (or decline) at the same rate over time. If that were the case for all the studied education systems, the regression line would be flat.
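The fit behind Figs. 5.1 and 5.2 can be sketched as follows; the input is assumed to be one row per education system with its 2003–2015 change in average score and in the SES* achievement gap (hypothetical column names), and, as noted in the figures, the Islamic Republic of Iran is excluded before the line is fitted.

import numpy as np
import pandas as pd

def fit_gap_vs_score_trend(changes: pd.DataFrame) -> tuple[float, float]:
    """Slope and intercept of the change in SES* gap regressed on the change
    in average score (simple least squares, one point per education system)."""
    # Exclude the outlier, as in Figs. 5.1 and 5.2.
    kept = changes[changes["education_system"] != "Islamic Republic of Iran"]
    slope, intercept = np.polyfit(kept["d_avg_score"], kept["d_ses_gap"], deg=1)
    return float(slope), float(intercept)

A slope near zero would correspond to the scenario described above, in which changes in overall performance are not accompanied by changes in the achievement gap.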

Fig. 5.1 Difference in average mathematics score and SES* achievement gap, by education system, 2003–2015 (Note: The Islamic Republic of Iran was treated as an outlier and not included when fitting the regression line)

Fig. 5.2 Difference in average science score and SES* achievement gap, by education system, 2003–2015 (Note: The Islamic Republic of Iran was treated as an outlier and not included when fitting the regression line)

5.2.2 Tentative Pattern 2: Education Systems That Observed Increases in Achievement Gaps Tend to Be Decentralized

While education systems that were able to reduce the SES* achievement gap could be either centralized or decentralized systems, almost all the education systems that observed increases in their SES* achievement gaps were decentralized systems, with the exception of the Islamic Republic of Iran (Table 5.1); note that this education system was also an outlier in the previous analysis.

5.2.3 Tentative Pattern 3: Education Systems That Reduced Investment in Education Tended to Observe an Increased Mathematics Achievement Gap

When we examined changes in the percentage of GDP spent on education (an indicator based mostly on 2003–2015 figures; see Table 5.1), the results suggested that education systems that reduced their investment in education over time also observed a significant increase in the SES* mathematics achievement gap in the second decade of TIMSS (Fig. 5.3).

Fig. 5.3 Change in the percent of GDP spent on education and in the mathematics achievement gap, by education system, 2003–2015 (Notes: The indicator “Percent of GDP Spent on Education” was obtained from the World Bank Open Data and the UNESCO Institute for Statistics (see Table 3.6). The change in the percent of GDP spent on education between 2003 and 2015 was calculated as the difference between the percentages in 2003 and 2015 divided by the percentage in 2003)
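Written out, and assuming the numerator is the 2015 value minus the 2003 value, the change measure in Fig. 5.3 is the relative change in the GDP share:

\[
\Delta_{\mathrm{GDP}} = \frac{p_{2015} - p_{2003}}{p_{2003}} \times 100,
\]

where \(p_{t}\) denotes the percentage of GDP spent on education in year \(t\), so a negative value indicates a reduction in investment.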

5.3 Limitations of the Study and Recommendations for Future Research

This study has several limitations that should be addressed by future research. The first limitation is that this study did not examine all potential factors that might explain the observed trends. Although we collected information on macro-level indicators for every corresponding education system over 20 years, we did not investigate empirically whether those factors contributed to the changes in educational inequality that we observed. The broad connections made between macro-level changes and changes in educational inequality are descriptive. Future research exploring the factors driving these changes would be important for understanding why some education systems were able to reduce the SES* achievement gaps and what can be learned by others. For example, multilevel modeling could be employed to test hypotheses about the association between educational inequality and macro-level factors, as sketched below. However, some of the factors may defy easy categorization and may be specific to individual education systems or operate only in the presence of other factors. Thus, there is also a place for a more contextual and qualitative understanding of the findings. Researchers with a deep understanding of local context would be in a better position to examine why and how these changes took place in their own education system.
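As one possible direction (a sketch under assumed variable names, not the approach taken in this study), a two-level model with students nested in education systems could relate achievement to student SES* and a macro-level indicator such as education spending:

import statsmodels.formula.api as smf

def fit_two_level_model(df):
    # Random intercept per education system; fixed effects for the student-level
    # HER index and a system-level indicator. Plausible values, sampling weights,
    # and the trend (cycle) structure are ignored in this sketch.
    model = smf.mixedlm(
        "math_score ~ her_index + gdp_pct_education",
        data=df,
        groups=df["education_system"],
    )
    return model.fit()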

Second, measurement invariance of common items across years could be a concern for the SES* index used in this study (a modified version of the TIMSS HER index). For example, possession of a computer in the 1990s may carry a very different meaning and value from possession of a computer in the 2010s with respect to what it says about a student’s SES* background. We are uncertain which items drift in meaning, how much drift there is over time, and how such drift manifests itself in different countries. Despite these concerns, we believe this was not a critical problem for our study because we compared students in the highest and the lowest SES* quartile for each country and in each cycle separately. In other words, the meaning of the items, or even of the scale, may change slightly, but this should not have had a strong influence on the comparison of achievement gaps calculated from the distribution of students’ SES* in a particular education system and cycle. Nevertheless, future research should analyze the measurement invariance of the SES* index itself, or even reconstruct an item response theory (IRT)-scaled version of the HER index for the years prior to 2011, so that analyses with that index would be possible across all TIMSS administrations.

Third, it is important to recognize that the meaning of high and low SES* differs across societies. We decided to use education system-specific cut-offs to define the SES* groups because the current study focused on trends in educational inequality within a society. Therefore, in interpreting comparisons between societies, it should be understood that high-SES* students in one country can be, in an absolute sense, very different from high-SES* students in another.

Finally, our analyses showed relatively distinct patterns of change in educational inequality in the first and second decades of TIMSS across countries (see Table 4.1). Future research should focus especially on the second decade of TIMSS, namely the period 2003–2015, as many significant changes in SES* achievement gaps occurred in this decade. Because the number of countries participating in TIMSS has expanded since its inception, this would have the added value of allowing more education systems to be included in the analyses. Moreover, comparable country-level macroeconomic indicators would be more readily available if 2003 were taken as the base year instead of 1995.

5.4 What Have We Learned from Twenty Years of TIMSS Data?

Over the 20 years of TIMSS, we found that only a few education systems were able to significantly reduce the achievement gap between high- and low-SES* students, improve the performance of their most disadvantaged students, and increase their national average score. Most of the education systems that we studied did not show such a promising three-fold trend. Among the 13 education systems studied, only Slovenia showed such a trend in mathematics and only the United States in science. This further reflects how difficult it is to foster positive change in academic performance, or to maintain high performance, for all students over time while also counteracting a general rise in inequality through education policies that close the SES* achievement gap and effectively address the needs of disadvantaged students.

By contrast, some education systems observed significant changes in one of the two decades of TIMSS (1995–2003 and 2003–2015) but not the other. For example, Australia experienced a significant decrease in the SES* achievement gap for science between 1995 and 2003, followed by a significant increase between 2003 and 2015, resulting in an overall non-significant trend over the 20-year period. There are many other examples where a more detailed study of trends broken down into the 1995–2003 and 2003–2015 periods would be of interest. Researchers with a deep understanding of local contexts should take a closer look at such countervailing trends.

For the second decade of TIMSS (2003–2015), three tentative patterns emerged when changes in country-level indicators over time were contrasted with changes in the SES* achievement gaps. First, there was an inverse relationship between changes in the SES* achievement gaps and changes in the TIMSS national averages for both mathematics and science. Second, almost all the education systems with an increase in their SES* achievement gaps were categorized as “decentralized” education systems in this study. Third, the education systems that reduced their investment in education also observed a significant increase in the SES* mathematics achievement gap. Although these patterns are preliminary, we encourage further investigation into these country-level changes with additional countries included in the analyses.