International Journal of Science and Mathematics Education

https://doi.org/10.1007/s10763-020-10133-2

The Differential Role of Socioeconomic Status in the Relationship between Curriculum-Based Mathematics and Mathematics Literacy: the Link Between TIMSS and PISA

Hana Kang 1 & Leland Cogan 2

Received: 7 January 2020 / Accepted: 28 September 2020
© Ministry of Science and Technology, Taiwan 2020

Abstract
Students’ socioeconomic status (SES) is not only directly related to mathematics
achievement but also indirectly related to mathematics achievement through their
unequal opportunity to learn (OTL). Despite a plethora of studies about how
mathematics achievement is associated with SES and OTL, few systematic
studies have examined the differential role SES plays in the relationship between
curriculum-based mathematics performance and application-based mathematics
literacy. To bridge the gap in the literature, this study aims at addressing the
relationship between curriculum-based mathematics and application-based math-
ematics while considering the OTL associated with students at the two different
assessment time points. Using the unique data set from the Russian Federation in
which a subset of the 2011 Trends in International Mathematics and Science
Study (TIMSS) student sample also took the Program for International Student
Assessment (PISA) assessment 1 year later, we found that lower SES students
had difficulties in transferring their curriculum-based mathematics knowledge
and skills to mathematics literacy, which plays an important role in meeting present
society’s highly complex demands. These results suggest that students with a
lower SES need to receive greater attention and focus on their mathematics
learning so as to improve their capacity to interpret and evaluate real-world
problems mathematically and to make better decisions using this competency.

Keywords Mathematics education · Opportunity to learn · PISA · Socioeconomic status · TIMSS

* Hana Kang
kang.hana800@gmail.com

Extended author information available on the last page of the article



Introduction

In the current information society, emphasis on mathematics literacy has increased (Yore, Pimm, & Tuan, 2007). Mathematics literacy is defined as “an individual’s
capacity to identify and understand the role that mathematics plays in the world, to
make well-founded judgments, and to engage in mathematics in ways that meet the
needs of that individual’s current and future life as a constructive, concerned and
reflective citizen” (Organisation for Economic Co-operation and Development
[OECD], 1999, p.41). Societies are increasingly complex and dynamic, requiring those
in the workforce to understand and interpret mathematical concepts and apply this
knowledge in diverse settings. Such application of mathematics literacy is increasingly
needed and reflects more than mere knowledge of school mathematics
(English, 2002; OECD, 2013a).
Although mathematics literacy has a strong relationship with exposure to the formal
mathematics of algebra and geometry (OECD, 2013b), mathematics teachers have often
reported that students need experience with and opportunities to apply the learned mathemat-
ical concepts in different contexts in order for them to solve problems in various situations (De
Lange, 1987; Ojose, 2011). Since both the content covered in school and how the in-school
instruction is provided can affect students’ mathematical literacy (Ojose, 2011), different
groups of students can perform differently when applying content-based mathematical
knowledge in real-life situations, depending on their educational environments.
Previous studies have found that the frequency with which children encounter
formal mathematical concepts during class (i.e., Opportunity to Learn, OTL) differs
according to their Socioeconomic Status (SES; Schmidt, Burroughs, Zoido, & Houang,
2015), which affects student performance regarding mathematics literacy. However,
little research has been done on relationships between the curriculum-based mathemat-
ics performance and application-based mathematics literacy, as well as how these
relationships vary by SES. As SES is one of the main factors that are closely related
to the quality of instruction and the amount of content that students receive (Duncan &
Murnane, 2014; Schmidt & McKnight, 2012), educational experiences that involve
transferring content-based mathematics knowledge into the ability to actually apply
mathematics knowledge in real-life situations may be shaped by a student’s SES.
Several associated studies have also found that the focus of mathematics instruction in
schools differs by students’ SES, with lower SES students receiving instruction mainly
consisting of basic computation and number concepts (Anyon, 1981; Bachman, Votruba-
Drzal, El Nokali, & Castle Heatly, 2015; Ladson-Billings, 1997; Means & Knapp, 1991)
instead of the more conceptual, analytic, and higher-order mathematics activities that higher
SES students receive (Desimone & Long, 2010; Georges, 2009; Means & Knapp, 1991).
Further, after examining nationally representative data on instruction from 1987 to 2005,
Camburn and Han (2011) found that lower SES students were offered lessons involving fewer
conceptual problems and problem-solving opportunities. Those students with a higher SES
were given more conceptual problems and problem-solving opportunities, which would, as
discussed above, most likely affect their ability to apply their mathematics knowledge. These
differences in the instructional focus provided to students of different SESs can engender
differential relationships between assessments of content-based mathematics (e.g., Trends in
International Mathematics and Science Study [TIMSS]) and assessments of application-based
mathematics competency (e.g., Program for International Student Assessment [PISA]).

To comprehensively understand the relationship between curriculum-based mathematics performance and mathematics literacy in consideration of OTL, this study
examines whether there are differential relationships among mathematics literacy,
curriculum-based mathematics performance, and OTL according to SES group in a
large-scale setting. As the curriculum-based mathematics performance and mathematics
literacy in this study were measured by the TIMSS and PISA mathematics assessments,
respectively, our results also shed light on the direct relationship between these two
major international assessments.

Conceptual Framework

Comparison Between TIMSS and PISA Mathematics Assessment

Both TIMSS and PISA were developed to address the desire to compare national
education systems employing nationally representative student samples. Both programs
provide a perspective on mathematics education yet they have different focuses and
goals and thus have different designs detailed in their unique assessment frameworks
(Neidorf, Binkley, Gattis, & Nohara, 2006; Schmidt, Houang, Cogan, & Solorio, 2019).
The assessment goals of TIMSS and PISA mathematics assessments are distinguish-
able. PISA mathematics assessment aims at measuring the outputs of various countries’
education systems regarding the skills and competencies that adolescents can apply to
real-world contexts. Consequently, PISA’s student population definition focuses on
students who are 15 years old, the point at which compulsory education ends in most
countries (Neidorf et al., 2006; OECD, 2013a). This age-based student population
definition yields samples of students for most countries that are in different grades,
most typically either in Grade 9 or Grade 10. Also, the emphasis on mathematics
literacy means that the learning PISA measures may occur outside of school as well as
through curricula and instruction in school (OECD, 2013a). On the other hand, the
TIMSS mathematics assessment concen-
trates on measuring students’ acquisition of school-based curricular mathematics
knowledge. Hence, TIMSS’s sample is based on students’ grade levels (fourth and
eighth graders in each country) to understand the individual education system and
obtain information on mathematics curricula and classroom practices across countries
at the two grade levels.
Regarding measured content and cognitive skills, TIMSS and PISA have similarities
and differences. The TIMSS mathematics assessment comprises two different sets of
content domains, indicating the topics expected to be taught, depending on the
assessed grade level. For example, the mathematics content domains for the Grade 4
assessment consist of three topics: number, geometric shapes and measures, and
data display, whereas the content domains for the Grade 8 mathematics assessment,
which was used in our study in comparison with the PISA assessment, comprise
four domains: number, algebra, geometry, and data and chance (Kastberg, Roey,
Ferraro, Lemanski, & Erberber, 2013). In addition, the TIMSS mathematics assessment has
three common cognitive domains (cognitive skills) of knowing, applying, and
reasoning across the two mathematics assessments for fourth and eighth graders
(Kastberg et al., 2013).

Similarly, PISA also has content and cognitive dimensions. PISA’s four overarching
content ideas are space and shape, change and relationships, quantity, and uncertainty
and data (OECD, 2014). Its three process categories (corresponding to the TIMSS
cognitive domains) are formulate, employ, and interpret (OECD, 2014). In addition to
content and cognitive dimensions, PISA incorporates a context dimension (personal,
occupational, societal, and scientific), which refers to situations in which students can
use and apply their mathematics knowledge (OECD, 2014).
Despite the similarities between the two assessments in mathematics contents, the
differences in their focus are demonstrated in how their domains and dimensions are
defined: while TIMSS focuses on academic content-based mathematics knowledge and
skills, PISA focuses on literacy and real-world relevancy of mathematics. The released
items presented in Table 1 illustrate the differences between the two assessment frameworks
(“PISA Test,” n.d.; “TIMSS 2011,” n.d.). Each pair of items in Table 1 was
derived from their respective corresponding content area. The first pair of items were from
algebra in TIMSS and change and relationship in PISA, and the second pair of items were
derived from geometry in TIMSS and space and shape in PISA. In addition, the third pair of
items were from number in TIMSS and quantity in PISA, and the fourth pair of items were
from data and chance in TIMSS and uncertainty and data in PISA. These items show that
PISA assessments were more geared toward assessing students’ capability to apply math-
ematical knowledge in real-world contexts.

Measures of Students’ SES Used in TIMSS and PISA

SES is defined as an individual’s combined economic and social standing (Baker, 2014).
As a latent construct, SES is measured in somewhat different ways based on how the
construct is operationalized (Baker, 2014; Schmidt et al., 2019). An operational
definition of a student’s SES that is generally utilized in educational studies includes
parent’s educational attainment, occupation, and household income or wealth
(Buchmann, 2002; National Center for Education Statistics [NCES], 2012). As TIMSS
and PISA are designed for international comparison, the two assessments constructed
indicators for SES that function effectively across different social and cultural contexts
(Schmidt et al., 2019). With only slight variations, both TIMSS and PISA have
common sets of items, namely parent’s education, home possession, and educational
resources including books in the home, for measuring SES.
In addition to the common items, the SES measures in TIMSS and PISA differ based
on each assessment’s view of the operationalized definition of and issues in measuring
SES. For instance, TIMSS does not include parent’s occupation because those answers
provided by students’ self-report, even in eighth grade, would not be reliable
(Buchmann, 2002; Schmidt & Cogan, 1996; Schmidt et al., 2019). Also, PISA’s
measure for SES is different from the SES-related items in TIMSS in that PISA utilizes
a single index for SES, the index of students’ economic, social, and cultural
status (ESCS). The ESCS index is computed based on the combination of students’
response to the highest parent occupation, the highest parent educational background,
and items on home possession that represent home wealth (Schmidt et al., 2019). This
index is standardized with a mean of zero and a standard deviation of one to help
provide information that can be compared at the same scale across participating
countries in PISA (Causa & Chapuis, 2009).
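
As an illustration only, the sketch below shows how a composite of SES-related indicators can be rescaled to a mean of zero and a standard deviation of one. The column names are hypothetical, and PISA's actual ESCS index uses a more elaborate weighting scheme than this simple average.

```python
import pandas as pd


def standardized_ses_index(df: pd.DataFrame, columns: list) -> pd.Series:
    """Combine several SES indicators into one index scaled to M = 0, SD = 1."""
    # Standardize each component first so that no single indicator dominates.
    z = (df[columns] - df[columns].mean()) / df[columns].std()
    composite = z.mean(axis=1)  # simple average; PISA's ESCS weights components differently
    return (composite - composite.mean()) / composite.std()


# Hypothetical usage with made-up column names:
# escs_like = standardized_ses_index(
#     students, ["parent_occupation", "parent_education", "home_possessions"]
# )
```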

Table 1 Examples of TIMSS and PISA mathematics items

TIMSS item (content area: algebra): If t is a number between 6 and 9, then t + 5 is between what two numbers?
PISA item (content area: change and relationship): The Gotemba walking trail up Mount Fuji is about 9 kilometres (km) long. Walkers need to return from the 18 km walk by 8 pm. Toshi estimates that he can walk up the mountain at 1.5 kilometres per hour on average, and down at twice that speed. These speeds take into account meal breaks and rest times. Using Toshi’s estimated speeds, what is the latest time he can begin his walk so that he can return by 8 pm?

TIMSS item (content area: geometry): The perimeter of a square is 36 cm. What is the area of this square?
PISA item (content area: space and shape): This is the plan of the apartment that George’s parents want to purchase from a real estate agency. To estimate the total floor area of the apartment (including the terrace and the walls), you can measure the size of each room, calculate the area of each one, and add all the areas together. However, there is a more efficient method to estimate the total floor area where you only need to measure 4 lengths. Mark on the plan above the four lengths that are needed to estimate the total floor area of the apartment.

TIMSS item (content area: number): Ann and Jenny divide 560 zeds between them. If Jenny gets 3/8 of the money, how many zeds will Ann get?
PISA item (content area: quantity): The door makes 4 complete rotations in a minute. There is room for a maximum of two people in each of the three door sectors. What is the maximum number of people that can enter the building through the door in 30 minutes?

TIMSS item (content area: data and chance): A machine has 100 candies and dispenses a candy when a lever is turned. The machine has the same number of blue, pink, yellow, and green candies mixed together. Megan turned the lever and obtained a pink candy. Peter turned the lever next. How likely is it that Peter will get a pink candy?
PISA item (content area: uncertainty and data): In January, the new CDs of the bands 4U2Rock and The Kicking Kangaroos were released. In February, the CDs of the bands No One’s Darling and The Metalfolkies followed. The following graph shows the sales of the bands’ CDs from January to June. The manager of The Kicking Kangaroos is worried because the number of their CDs sold decreased from February to June. What is the estimate of their sales volume for July if the same negative trend continues?

The Relationships Among SES, OTL, and Mathematics Performance

Empirical evidence buttresses the idea that SES is associated with student mathematics
achievement not only in the USA but also in other countries that participated in TIMSS
and PISA (Chudgar & Luschei, 2009; Schmidt et al., 2015). High SES students are
more likely to have more educational support from parents (Putnam, 2015) and more
qualified teachers (Boyd, Grossman, Lankford, Loeb, & Wyckoff, 2009). Moreover,
high SES students have more access to mathematics in terms of instruction time and
scope of content coverage (Schmidt & McKnight, 2012; Wenglinsky, 2002). The
unevenly distributed educational opportunities among students from different SES
exacerbate existing achievement gaps.
Specifically, OTL varies by SES, and these discrepancies are related to different
mathematics performance levels (Schmidt et al., 2001; Schmidt et al., 2015). OTL

refers to students’ exposure to studying particular topics in classroom instruction (OECD, 2014) and is an important measure indicating the quality of the education that
students receive (Wang, 1998). To address this important aspect of quality of school-
ing, PISA emphasized the measurement of OTL in their 2012 assessment (She, Stacey,
& Schmidt, 2018).
The relationships among SES, OTL, and mathematics performance are consistently
observed in both curriculum-based mathematics performance (TIMSS) and mathemat-
ics literacy (PISA). A study using TIMSS found that SES is positively associated with
OTL (i.e., exposure to mathematics content; Schmidt, Burroughs, & Houang, 2012).
Schmidt et al. (2001) found that the relationship between TIMSS mathematics score
and OTL are positively correlated. Schmidt et al. (2015) also found that OTL in
mathematics mediates the impacts of SES on different achievement levels in PISA
mathematics assessments. Using longitudinal data from Russia TIMSS and PISA
assessments on the same set of students, Carnoy, Khavenson, Loyalka, Schmidt, and
Zakharov (2016) found a positive relationship between OTL and mathematics literacy.
Even so, the magnitude of that relationship was smaller than the magnitude of the
relationship between OTL and PISA that has been found in studies when cross-
sectional data was used (Carnoy et al., 2016).

Research Questions

Our goal in this study was to examine comprehensively the structural relationships
between curriculum-based mathematics performance, mathematics literacy, and OTL
over time as well as whether, and how, SES plays a role in students’ later application-
based mathematics literacy. More specifically, our research questions are as follows:

Research question 1: For both curriculum-based mathematics performance and application-based mathematics performance, to what degree does the variation in student OTL explain the relationship between mathematics performance and SES compared to the extent that SES variability explains the relationship between mathematics performance and OTL?

Research question 2: Does the relationship between students’ performance in curriculum-based mathematics in year 1 and application-based mathematics in year 2, in which students’ OTL in each corresponding school year is controlled for, differ by students’ SES?

Data

Sample

To investigate the relationship between content-based mathematics achievement and knowledge of applying mathematics in real contexts, we used the unique Russian
Federation longitudinal data in which the same group of students participated in both
the TIMSS 2011 and PISA 2012 mathematics assessments. This longitudinal sample is

not identical to either the official TIMSS 2011 or the official PISA 2012 Russian
Federation samples included in these respective reports. The nationally representative
longitudinal sample in this study was identified initially employing the same TIMSS
2011 administration rules and followed the PISA 2012 administration guidelines 1 year
later. Consequently, this unique longitudinal data set is distinguished from other
TIMSS and PISA data sets and provides a unique opportunity to examine the relation-
ship between these two assessments. In addition, as the same students took both the
TIMSS and the PISA assessments, students’ demographic backgrounds which are
reflected in the SES measures are held constant. This consistency in the data ensures
that the SES grouping used in this study is reliable across two time points and,
consequently, enables a truly longitudinal examination of the SES relationships at the
student level.
The students in the Russian Federation longitudinal data set were initially identified
using the guidelines and administration rules for the official Russian Federation eighth-
grade TIMSS 2011 student sample. Then, 1 year later, the PISA 2012 mathematics
assessment was distributed to this same group of students. As this nationally represen-
tative group of students was selected based on grade level (i.e., eighth graders in 2011),
this group of students was not the same as the official Russia PISA 2012 sample which
was nationally representative of 15-year-old adolescents.
Given that PISA samples 15-year-olds, participating students in each country and
economy vary in their grade levels, ranging from Grade 7 to Grade 12, except in Japan
and Iceland, where all 15-year-old students who participated in the assessment were in
Grade 10 (OECD, 2014). In the case of the official Russia PISA 2012, when decomposed by
grade levels, the majority of the sample was composed of ninth graders (73.8%),
followed by 10th (17.4%), eighth (8.1%), seventh (0.6%), and 11th graders (0.1%;
OECD, 2014). On the other hand, in our unique data set, the sample of eighth graders
for TIMSS 2011 also responded to the PISA 2012, 1 year later while they were in
Grade 9. Hence, this data set distinguished itself from the official Russia PISA 2012
and is true longitudinal data that enables the examination of the relationship between
content-based mathematics represented as TIMSS and mathematics literacy represented
as PISA at two points in time.
Also, more than 80% of students in Grade 8 studied with the same teacher when
they were in Grade 9, and almost all students were in the same classroom in Grade 9 as
they were in Grade 8 (Carnoy et al., 2016). Specifically, in the spring of 2011, 4893
students in 231 classrooms from 210 schools participated in TIMSS mathematics
assessment. Then, in the spring of 2012, about 90% of the original group (4399
students in 229 classes from 209 schools) took the PISA 2012 assessment (Carnoy
et al., 2016). Despite the attrition rate, a sensitivity analysis comparing the means of
student characteristics and TIMSS test scores between the baseline and end samples
suggests that attrition is not a major concern for bias in the results of the analysis (Carnoy et al.,
2016).

Variables

In the analysis, we used student-level variables in the two assessments. For SES, we
utilized TIMSS’s SES variables of parent’s education, home possession, and educa-
tional resources and constructed a standardized composite indicator. Then, the upper

25%, lower 25%, and middle 50% of students on this composite were defined as the high, low,
and middle SES groups. Hence, 1348, 2700, and 845 students were categorized into
low, middle, and high SES groups, respectively. We used this SES measure because
TIMSS was conducted in year 1 (at Grade 8) in this longitudinal study; hence, this
variable in TIMSS better indicates the baseline SES conditions for students (Carnoy
et al., 2016; Schmidt et al., 2015).
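
A minimal sketch of the grouping step just described, assuming a pandas Series holding the standardized TIMSS-based SES composite; cut points at the 25th and 75th percentiles define the low, middle, and high SES groups.

```python
import numpy as np
import pandas as pd


def assign_ses_groups(ses: pd.Series) -> pd.Series:
    """Label the bottom 25% as low, the top 25% as high, and the rest as middle SES."""
    low_cut, high_cut = ses.quantile([0.25, 0.75])
    labels = np.select(
        [ses <= low_cut, ses >= high_cut],
        ["low", "high"],
        default="middle",
    )
    return pd.Series(labels, index=ses.index, name="ses_group")


# Hypothetical usage with a simulated standardized composite:
# ses = pd.Series(np.random.standard_normal(4893))
# groups = assign_ses_groups(ses)
# groups.value_counts()
```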
Second, OTL in PISA assessment is measured according to students’ familiarity
with formal mathematics items (topics). Each OTL item is measured on a 5-point Likert
scale ranging from 0 to 4 to indicate the degree of familiarity with a certain topic
encountered in school as represented by a particular item. This measure better reflects
the conception of OTL covered in previous studies (Schmidt et al., 2015), and it is more
closely related to student achievement than the measures labeled as OTL by PISA.
Also, this measure of familiarity with formal mathematics topics demonstrated greater
variation among students in each of the participating PISA 2012 countries (OECD,
2013b).
Based on the topics covered in the classroom at different grade levels, we were able
to construct OTL in year 1 (topics covered before Grade 8 and in Grade 8) and OTL in
year 2 (topics covered post Grade 8) that capture students’ different OTL levels in the
different time points that were included in our research model. Out of 13 items, the four
topics of exponential function, quadratic function, vectors, and cosine are the topics
covered after Grade 8 (OTL in year 2), and the remaining nine items are topics related
to before and in Grade 8 (OTL in year 1). Additionally, students’ responses to each
item were weighted by the item’s corresponding international grade placement (IGP)
weight. The IGP was constructed from the international curriculum analysis data
collected for the 1996 TIMSS and represents the grade level at which each topic has
the most instructional focus from an international perspective. Given the hierarchical
nature of school mathematics, the IGP may be considered to reflect the curricular rigor
of each concept (Cogan, Schmidt, & Guo, 2019; Schmidt et al., 2001). Because more
advanced mathematics topics are introduced in later grades, after more basic topics
have been covered as stepping stones, more advanced topics have higher IGP estimates,
reflecting the cumulative rigor associated with the content difficulty.
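
To make the OTL construction concrete, here is a sketch under stated assumptions: the familiarity responses (0 to 4) sit in a data frame with one column per topic, the year 2 topic list follows the paper, and the IGP weights passed in are placeholders rather than the published values. Because the paper does not specify whether a weighted sum or a weighted average was used, the weighted average below is an assumption.

```python
import pandas as pd

# Topics the paper assigns to OTL in year 2 (covered after Grade 8);
# the remaining nine topics form OTL in year 1.
YEAR2_TOPICS = ["exponential_function", "quadratic_function", "vectors", "cosine"]


def igp_weighted_otl(familiarity: pd.DataFrame, igp_weights: dict, topics: list) -> pd.Series:
    """Average each student's 0-4 familiarity ratings over `topics`,
    weighting each topic by its international grade placement (IGP)."""
    weights = pd.Series({topic: igp_weights[topic] for topic in topics})
    return familiarity[topics].mul(weights, axis=1).sum(axis=1) / weights.sum()


# Hypothetical usage (placeholder IGP values, not the published ones):
# igp = {"exponential_function": 10.2, "quadratic_function": 9.8,
#        "vectors": 10.5, "cosine": 9.9}  # ...plus the nine earlier topics
# otl_year2 = igp_weighted_otl(responses, igp, YEAR2_TOPICS)
# otl_year1 = igp_weighted_otl(responses, igp,
#                              [t for t in responses.columns if t not in YEAR2_TOPICS])
```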

Analytic Methods

We first conducted a regression analysis to explore the relationships among mathematics
performance, OTL, and SES and to see how much variation in TIMSS and PISA
mathematics performance is explained by OTL, SES, and the interaction between
OTL and SES. Then, to examine the moderation effect of students’ SES on the relationships
among curriculum-based mathematics (TIMSS), application-based mathematics (PISA),
and OTL, multigroup structural equation modeling (SEM) was used. Multigroup SEM is a
useful way to test a moderation effect while also considering complex relations among
variables in a model, and it provides information on similar and different patterns among
different groups (Kline, 1998).
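
A rough sketch of the regression step, assuming a data frame `df` with columns `timss`, `pisa`, `otl`, and `ses`; it uses ordinary least squares via statsmodels and ignores survey weights and the plausible-value structure of the achievement scores, which the full analysis would have to handle.

```python
import statsmodels.formula.api as smf


def fit_regression_models(df, outcome):
    """Fit the three nested specifications reported for each outcome:
    OTL only, SES only, and OTL + SES + their interaction."""
    return {
        "otl_only": smf.ols(f"{outcome} ~ otl", data=df).fit(),
        "ses_only": smf.ols(f"{outcome} ~ ses", data=df).fit(),
        "full": smf.ols(f"{outcome} ~ otl * ses", data=df).fit(),  # otl, ses, otl:ses
    }


# timss_models = fit_regression_models(df, "timss")
# pisa_models = fit_regression_models(df, "pisa")
# print(timss_models["full"].params)
```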
The multigroup SEM analysis proceeded as follows. First, a research model showing
the relationships among TIMSS, PISA, and OTL was developed based on previous

studies (Fig. 1). As the TIMSS mathematics test was measured in time 1 and the PISA
mathematics test was measured in time 2, OTL at the student level was divided into two
parts that each represent OTL in time 1 and time 2. The two OTL variables relevant to
the different time points when the assessments were administered provided the preced-
ing OTL conditions for the TIMSS performance and allowed us to obtain better
estimates for the relationship between OTL and the two types of mathematics perfor-
mance across time.
Next, a chi-square difference test was conducted to compare the model fit for the
null model that constrained all five coefficients equal across SES groups with that of an
alternative model that allowed the path 1 in the research model (Fig. 1) to be freely
estimated for each SES group. The test provides results about whether the null model
significantly fits the data better or worse than the nested alternative model. When the
chi-square difference test is significant, it indicates that the alternative model fits the
data better than the null model.
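
The chi-square difference test itself reduces to simple arithmetic plus an upper-tail chi-square probability. The sketch below assumes a standard maximum-likelihood chi-square; robust estimators would require a scaled difference test instead.

```python
from scipy.stats import chi2


def chi_square_difference(chi2_null, df_null, chi2_alt, df_alt):
    """Compare nested SEMs: the constrained (null) model has the larger
    chi-square and more degrees of freedom than the alternative model."""
    delta_chi2 = chi2_null - chi2_alt
    delta_df = df_null - df_alt
    p_value = chi2.sf(delta_chi2, delta_df)  # upper-tail probability
    return delta_chi2, delta_df, p_value


# With the values reported later in Table 5:
# chi_square_difference(22.28, 13, 11.72, 11)  # -> (10.56, 2, p ≈ .005)
```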

Results

The descriptive statistics for the variables and their correlations are presented in
Table 2. As seen in the table, the standardized SES measure has a mean of 0 and a
standard deviation of 1. Based on the percentiles of the distribution of this continuous
SES variable, we were able to identify low, middle, and high SES groups of students
for the multigroup SEM analysis. In addition, all the correlation coefficients in Table 2
were statistically significant. TIMSS and PISA have a strong and positive correlation
(r = .65), and OTL variables in years 1 and 2 have positive correlations with both
TIMSS and PISA, ranging from .24 to .31.
Table 3 indicates means and standard deviations for variables used in this study by
different SES groups. Higher SES groups have higher performance levels in both
TIMSS and PISA mathematics scores as well as increased OTL across time. For
example, the average TIMSS mathematics test scores for low, middle, and high SES
students are 517.7, 545.3, and 575.9, respectively, showing a statistically significant difference (F =
150.2, p < .05). PISA scores also differ by SES groups (F = 144, p < .05), and the

Fig. 1 Research model showing the relation of OTL with curriculum- and application-based mathematics

Table 2 Descriptive statistics for variables and the correlations among the variables

             M      SD     TIMSS   PISA   OTL year 1   OTL year 2   SES
TIMSS        543    79.6   –
PISA         492    85.1   .65*    –
OTL year 1   2.76   0.70   .24*    .27*   –
OTL year 2   2.74   0.66   .25*    .31*   .78*         –
SES          0      1      .28*    .28*   .21*         .20*         –

*p < .05

average PISA mathematics scores for low, middle, and high SES students are 463.6,
495.3, and 527.9, respectively. In addition, regarding OTL’s relevance to topics
covered before and in Grade 8, the three SES groups have significantly different levels
(F = 55.09, p < .05)—the low SES group has OTL scores of 2.55, while the middle
(M = 2.81) and high SES groups (M = 2.92) have higher levels of OTL scores. Students
in low (M = 2.57), middle (M = 2.78), and high (M = 2.91) SES groups also have
different levels of OTL for the topics covered post Grade 8 (F = 47.09, p < .05).
To answer research question 1, we present a regression analysis to examine the
overall relationships between OTL, SES, and mathematics test scores (TIMSS and
PISA). As seen in Models 1 and 3 in Table 4, a one unit increase in OTL was associated
with a 31.69 (β = 31.69, p < .05) increase in TIMSS mathematics test scores, and the
regression coefficient decreased to 26.06 (β = 26.06, p < .05) after including SES and
the interaction between OTL and SES. On the other hand, the regression coefficient for
SES decreased from 22.46 to 11.81 after including OTL and the interaction term. The
results show that about 17.79% of the initial association between OTL and curriculum-
based mathematics performance (TIMSS) was explained by SES, while approximately
47.42% of the association between SES and curriculum-based mathematics perfor-
mance was accounted for by OTL.
In addition, a one-unit increase in OTL corresponds to an increase of 40.39 in PISA
mathematics scores (β = 40.39, p < .05), and the association was reduced to 35.22 after
controlling for SES (β = 35.22, p < .05) as presented in Models 4 and 6. Also, the
regression coefficient that represents the relationship between SES and PISA (β =
24.33, p < .05) in Model 5 became statistically nonsignificant after controlling for OTL

Table 3 Means and standard deviations for TIMSS, PISA, and OTL scores by SES group

                        TIMSS mathematics    PISA mathematics    OTL year 1     OTL year 2
                        M       SD           M       SD          M      SD      M      SD
Low SES (n = 1348)      517.7   78.82        463.6   80.57       2.55   0.77    2.57   0.71
Middle SES (n = 2700)   545.3   77.87        495.3   82.80       2.81   0.67    2.78   0.63
High SES (n = 845)      575.9   72.61        527.9   84.23       2.92   0.61    2.91   0.60

Table 4 Results of multiple regression analysis

TIMSS (outcome)   Model 1   Model 2   Model 3
  OTL             31.69*              26.06*
  SES                       22.46*    11.81*
  OTL × SES                           2.37

PISA (outcome)    Model 4   Model 5   Model 6
  OTL             40.39*              35.22*
  SES                       24.33*    6.50
  OTL × SES                           4.23

*p < .05

and the interaction between SES and OTL (β = 6.50, p > .05). The results suggest that
approximately 13% of the initial relationship between OTL and mathematics literacy
(PISA) is explained by SES, and the association between SES and PISA was fully
explained by OTL.
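
The reduction percentages reported above follow directly from the Table 4 coefficients; a small sketch of the arithmetic:

```python
def percent_reduction(baseline: float, adjusted: float) -> float:
    """Share of the baseline association explained away once the other
    predictor (and the interaction) is added to the model."""
    return 100 * (baseline - adjusted) / baseline


# percent_reduction(31.69, 26.06)  # OTL -> TIMSS: ~17.8% explained by SES
# percent_reduction(22.46, 11.81)  # SES -> TIMSS: ~47.4% explained by OTL
# percent_reduction(40.39, 35.22)  # OTL -> PISA:  ~12.8% explained by SES
```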
Table 5 indicates a comparison of the chi-square difference test between the null
model, which constrains all coefficients to be invariant among the low, middle, and
high SES groups, and the alternative model, which permits path 1 in Fig. 1 to be freely
estimated across different groups. The chi-square difference test, which tests the
difference between the chi-square statistic for the null and alternative models, was
statistically significant (Δχ2 = 10.56, Δdf = 2, p < .01). The significant difference
indicates that a model with more free parameters (alternative model in this study) is a
better fit to the data than a model with fixed parameters (null model in this study). In
other words, the alternative model with a smaller chi-square value is favored over the
null model. In addition, the alternative model has better model fit indices (RMSEA =
.008, SRMR = .02) compared with the model fit indices for the null model (RMSEA =
.03, SRMR = .05).1 The results of the chi-square difference test and model fit statistics
support the rejection of the null hypothesis, in which all coefficients are equal across
groups, and the selection of the alternative model as the final model in our study.
In other words, comparison of the two models suggests that the relationship between
performance in curriculum-based mathematics and performance in application-based
mathematics literacy in the model differs by SES (research question 2). As seen
in Fig. 2, the unstandardized path coefficients from curriculum-based mathematics
(TIMSS) to application-based mathematics literacy (PISA) are 0.58 for low SES,
0.64 for middle SES, and 0.74 for high SES students. That is, the relationship between
the previous year’s curriculum-based mathematics knowledge and the next year’s
application-based mathematics literacy grows stronger with higher SES.

1
Root Mean Square Error of Approximation (RMSEA) and Standardized Root Mean Square Residual
(SRMR) are model fit indices widely used for assessing structural equation models. In these two model fit
statistics, values close to 0 indicate a better model fit. In general, RMSEA values lower than .06 and SRMR
values lower than .08 are considered a good model fit (Hu & Bentler, 1999).

Table 5 Chi-square difference test comparison between null and alternative models

Model         Coefficient freely estimated across groups        χ²      df   RMSEA   SRMR
Alternative   TIMSS → PISA (path 1)                             11.72   11   .008    .02
Null          All path coefficients constrained to be equal     22.28   13   .03     .05

Discussion

Congruent with previous studies, the findings in this study support the conclusion that an increase
in students’ OTL in mathematics is closely connected to their improved mathematics
performance. Also, students’ OTL explains much variability in both curriculum-based
and application-based mathematics performance. The regression coefficients for SES
were reduced proportionally more than those for OTL for both TIMSS and PISA,
indicating that OTL explained more of the association between SES and mathematics
performance than SES accounted for in the association between OTL and mathematics
performance. Our findings also offer new evidence on the
differential relationship between TIMSS and PISA that depends on students’ SES.
The results suggest that students in higher SES groups have the advantage of higher
improvement in application-based mathematics literacy over time even when they had
the same level of curriculum-based mathematics knowledge and skills.
Because the focuses of TIMSS and PISA are different, how students perform in the
PISA and TIMSS assessments can differ depending on whether their teachers concen-
trate on formal mathematics or applied mathematics. In light of this condition, some
countries ranked differently among participating countries on the TIMSS and PISA
assessments owing to their curricular and instructional focuses (Loveless, 2013). For
instance, New Zealand and Finland performed better on the PISA assessment, whose
application-based orientation aligns more closely with their curricula. New Zealand
scored 27 points (about ¼ standard deviation) lower than Korea on the PISA
mathematics assessment, while on the TIMSS the two countries showed a substantial
125-point difference, approximately one and ¼ standard deviations (488 for New
Zealand and 613 for Korea). Also, Finland scored only 5 points lower than Korea on
the PISA, but 99 points lower than Korea on the TIMSS (Loveless, 2013).

Fig. 2 Multigroup SEM model to test the moderation effect of SES. Note: L: low SES group; M: middle SES group; H: high SES group. Coefficients are unstandardized. Standard errors are in parentheses. All coefficients are statistically significant at p < .05

Comparisons between TIMSS and PISA scores within a country generally involve a
2-year gap, because TIMSS’s population comprises eighth graders while
PISA’s population comprises 15-year-olds, most of whom are in Grade 10 in many
PISA-participating countries (OECD, 2014). On the other hand, our study was able to
investigate how students’ performance in TIMSS and PISA change over a 1-year
period depending on their SES, comprehensively considering the OTL corresponding
to each time point when students took the test. In this sense, our study is distinguished
from these simple comparisons and contributes to the field.
Specifically, this study found that students’ SES moderates the relationship between
curriculum-based mathematics performance (TIMSS) and application-based mathemat-
ics literacy (PISA). For higher SES students, their previous curriculum-based mathe-
matics knowledge had stronger associations with their ability to apply knowledge in
real-life situations. The differential relationships occurred across all three SES groups
of students. Low SES students show a weaker association between TIMSS and PISA
scores than middle and high SES students do. This finding indicates that equivalent
curriculum-based knowledge may transfer to the application of mathematics knowledge
in real-life contexts at different magnitudes across SES groups.
These intensified performance gaps developing among students by SES over
time should be interpreted with caution. Given that OTL plays an important role in
predicting student performance in both curriculum-based mathematics performance and
application-based mathematics performance, the differential relationships by SES
should not be attributed to the nature of students within each SES group because these
are often allegedly thought of as deficiencies of minority students (McKay & Devlin,
2016). Rather, this deficit view hinders mathematics learning of the students with low
SES backgrounds as the students are hampered in receiving appropriate support for
their needs which is likely due, in part, to educators’ lowered expectations of the
capabilities of these groups of students (McKay & Devlin, 2016). In this sense, school
leaders and teachers should identify and tackle the aspects of schooling (e.g., distribu-
tion of educational resources and quality of teachers) that cause systematic inequalities
rooted in this deficit perspective and that work against ensuring all students’ mathematics learning.
Moreover, as mathematics literacy is essential in the twenty-first century for indi-
viduals to prepare for their role in society and to lead an independent life, every student
should have the opportunity to develop their mathematics literacy competencies. To
that end, educators and policy makers need to identify factors that hinder lower SES
students’ advancement in mathematics literacy as well as structural barriers that cause
their increased inequality in mathematics literacy over time, even when they previously
have equivalent levels of curriculum-based mathematics knowledge and skills. In
addition, teachers need to be aware of their role in developing students’ mathematical
thinking and skills and cautiously pay attention to how they deliver their daily
instruction using their language and discourse to promote learning goals in mathematics
(Frade, Acioly-Régnier, & Jun, 2012).

This research illuminates avenues of research for future studies to ensure mathemat-
ics literacy for all students regardless of their SES backgrounds, such as which factors
may explain this moderation effect of SES on predicting mathematics literacy (PISA)
with content-specific mathematics performance (TIMSS). To further examine the
differential relationship between TIMSS and PISA among SES groups, subsequent
analyses including other factors related to student mathematics performance and OTL
(i.e., teacher quality, instructional focus, and student motivation in relation to teachers’
expectation to specific groups of students) could be done in future studies. In addition,
future studies can comprehensively examine whether and to what degree these differ-
ential relations by SES are explained by different educational opportunities in class-
room and home environments for different SES groups, as well as the interaction
between these two environments.

Acknowledgments This paper was prepared under the tutelage of William H. Schmidt at Michigan State
University.

References

Anyon, J. (1981). Social class and school knowledge. Curriculum Inquiry, 11(1), 3–42. https://doi.org/10.
1080/03626784.1981.11075236.
Bachman, H. J., Votruba-Drzal, E., El Nokali, N. E., & Castle Heatly, M. (2015). Opportunities for learning
math in elementary school: Implications for SES disparities in procedural and conceptual math skills.
American Educational Research Journal, 52(5), 894–923. https://doi.org/10.3102/0002831215594877.
Baker, E. H. (2014). Socioeconomic status, definition (pp. 2210–2214). The Wiley Blackwell Encyclopedia of
Health, Illness, Behavior, and Society. https://doi.org/10.1002/9781118410868.wbehibs395.
Boyd, D., Grossman, P. L., Lankford, H., Loeb, S., & Wyckoff, J. (2009). Teacher preparation and student
achievement. Educational Evaluation and Policy Analysis, 31(4), 416–440. https://doi.org/10.3386/
w14314.
Buchmann, C. (2002). Measuring family background in international studies of education: Conceptual issues
and methodological challenges. In A. C. Porter & A. Gamoran (Eds.), Methodological advances in cross-
national surveys of educational achievement (pp. 150–197). Washington, DC: National Academies Press.
Camburn, E. M., & Han, S. W. (2011). Two decades of generalizable evidence on US instruction from
national surveys. Teachers College Record, 113(3), 561–610.
Carnoy, M., Khavenson, T., Loyalka, P., Schmidt, W. H., & Zakharov, A. (2016). Revisiting the relationship
between international assessment outcomes and educational production: Evidence from a longitudinal
PISA-TIMSS sample. American Educational Research Journal, 53(4), 1054–1085. https://doi.org/10.
3102/0002831216653180.
Causa, O., & Chapuis, C. (2009). Equity in student achievement across OECD countries: An investigation of
the role of policies. (OECD Economics Department Working Papers, No. 708). Paris, France: OECD
Publishing.
Chudgar, A., & Luschei, T. F. (2009). National income, income inequality, and the importance of schools: A
hierarchical crossnational comparison. American Educational Research Journal, 46(3), 626–658. https://
doi.org/10.3102/0002831209340043.
Cogan, L. S., Schmidt, W. H., & Guo, S. (2019). The role that mathematics plays in college-and career-
readiness: Evidence from PISA. Journal of Curriculum Studies, 51(4), 530–553. https://doi.org/10.1080/
00220272.2018.1533998.
De Lange, J. (1987). Mathematics, insight and meaning. Utrecht, the Netherlands: OW & OC, Utrecht
University.
Desimone, L. M., & Long, D. (2010). Teacher effects and the achievement gap: Do teacher and teaching
quality influence the achievement gap between Black and White and high-and low-SES students in the
early grades? Teachers College Record, 112(12), 3024–3073.

Duncan, G. J., & Murnane, R. J. (2014). Restoring opportunity: The crisis of inequality and the challenge for
American education. Cambridge, MA: Harvard Education Press.
English, L. D. (2002). Priority themes and issues in international research on mathematics education. In L. D.
English (Ed.), Handbook of international research in mathematics education (pp. 3–16). Mahwah, NJ:
Lawrence Erlbaum.
Frade, C., Acioly-Régnier, N., & Jun, L. (2012). Beyond deficit models of learning mathematics: Socio-
cultural directions for change and research. In M. A. (Ken) Clements, A. J. Bishop, C. Keitel, J.
Kilpatrick, & F. K. S. Leung (Eds.), Third International Handbook of Mathematics Education (vol. 27,
pp. 101–144). New York, NY: Springer.
Georges, A. (2009). Relation of instruction and poverty to mathematics achievement gains during kindergar-
ten. Teachers College Record, 111(9), 2148–2178.
Hu, L.-t., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional
criteria versus new alternatives. Structural Equation Modeling, 6(1), 1–55. https://doi.org/10.1080/
10705519909540118.
Kastberg, D., Roey, S., Ferraro, D., Lemanski, N., & Erberber, E. (2013). US TIMSS and PIRLS 2011
technical report and user’s guide. NCES 2013-046. Washington, DC: National Center for Education
Statistics.
Kline, R. B. (1998). Principles and practice of structural equation modeling. New York, NY: Guilford.
Ladson-Billings, G. (1997). It doesn’t add up: African American students’ mathematics achievement. Journal
for Research in Mathematics Education, 28(6), 697–708. https://doi.org/10.2307/749638.
Loveless, T. (2013). International tests are not all the same. Retrieved February 8, 2019, from https://www.
brookings.edu/research/international-tests-are-not-all-the-same/.
McKay, J., & Devlin, M. (2016). ‘Low income doesn't mean stupid and destined for failure’: Challenging the
deficit discourse around students from low SES backgrounds in higher education. International Journal
of Inclusive Education, 20(4), 347–363. https://doi.org/10.1080/13603116.2015.1079273.
Means, B., & Knapp, M. S. (1991). Cognitive approaches to teaching advanced skills to educationally
disadvantaged students. Phi Delta Kappa, 73(4), 282–289.
National Center for Education Statistics. (2012). Improving the measurement of socioeconomic status for the
national assessment of educational progress: A theoretical foundation. Retrieved March 15, 2019, from:
https://nces.ed.gov/nationsreportcard/pdf/researchcenter/Socioeconomic_Factors.pdf.
Neidorf, T. S., Binkley, M., Gattis, K., & Nohara, D. (2006). Comparing mathematics content in the National
Assessment of Educational Progress (NAEP), Trends in International Mathematics and Science Study
(TIMSS), and Program for International Student Assessment (PISA) 2003 Assessments (NCES 2006-
029). U.S. Department of Education. Washington, DC: National Center for Education Statistics. Retrieved
March 21, 2019, from http://nces.ed.gov/pubsearch.
Ojose, B. (2011). Mathematics literacy: Are we able to put the mathematics we learn into everyday use.
Journal of Mathematics Education, 4(1), 89–100.
Organisation for Economic Co-operation and Development (1999). Measuring student knowledge and skills:
A new framework for assessment. Paris, France: OECD Publishing.
Organisation for Economic Co-operation and Development (2013a). PISA 2012 assessment and analytical
framework: Mathematics, reading, science, problem solving and financial literacy. Paris, France: OECD
Publishing.
Organisation for Economic Co-operation and Development (2013b). PISA 2012 results: Excellence through
equity: Giving every student the chance to succeed (Volume II). Paris, France: OECD Publishing.
Organisation for Economic Co-operation and Development (2014). PISA 2012 results: What students know
and can do: Student performance in mathematics, reading and science (Volume I, Revised edition,
February 2014). Paris, France: OECD Publishing.
PISA Test (n.d.). Retrieved May 10, 2019, from http://www.oecd.org/pisa/test/PISA%202012%20items%
20for%20release_ENGLISH.pdf.
Putnam, R. D. (2015). Our kids: the American dream in crisis (First Simon & Schuster hardcover ed.). New
York, NY: Simon & Schuster.
Schmidt, W. H., Burroughs, N. A., & Houang, R. T. (2012, May). Examining the relationship between two
types of SES gaps: Curriculum and achievement. In Paper presented at the income, inequality, and
educational success: new evidence about socioeconomic outcomes conference, Stanford, CA.
Schmidt, W. H., Burroughs, N. A., Zoido, P., & Houang, R. T. (2015). The role of schooling in perpetuating
educational inequality: An international perspective. Educational Researcher, 44(7), 371–386. https://doi.
org/10.3102/0013189X15603982.

Schmidt, W. H., & Cogan, L. S. (1996). Development of the TIMSS context questionnaires. In M.O. Martin &
D.L. Kelly (Eds.), Third International Mathematics and Science Study: Technical Report (Volume I:
Design and Development, pp. 5-1–5-22). Chestnut Hill, MA: Boston College.
Schmidt, W. H., Houang, R. T., Cogan, L. S., & Solorio, M. L. (2019). Schooling across the globe: What we
have learned from 60 years of mathematics and science international assessments. Cambridge, England:
Cambridge University Press.
Schmidt, W. H., & McKnight, C. C. (2012). Inequality for all: the challenge of unequal opportunity in
American schools. New York: Teachers College Press.
Schmidt, W. H., McKnight, C. C., Houang, R. T., Wang, H., Wiley, D. E., Cogan, L. S., & Wolfe, R. G.
(2001). Why schools matter: A cross-national comparison of curriculum and learning. San Francisco,
CA: Jossey-Bass.
She, H. C., Stacey, K., & Schmidt, W. H. (2018). Science and mathematics literacy: PISA for better school
education. International Journal of Science and Mathematics Education, 16(1), 1–5. https://doi.org/10.
1007/s10763-018-9911-1.
TIMSS 2011 Released Items (n.d.). Retrieved May 15, 2019, from https://timssandpirls.bc.edu/timss2011/international-released-items.html.
Wang, J. (1998). Opportunity to learn: The impacts and policy implications. Educational Evaluation and
Policy Analysis, 20(3), 137–156. https://doi.org/10.3102/01623737020003137.
Wenglinsky, H. (2002). How schools matter: The link between teacher classroom practices and student
academic performance. Education Policy Analysis Archives, 10(12). https://doi.org/10.14507/epaa.v10n12.
2002.
Yore, L. D., Pimm, D., & Tuan, H. L. (2007). The literacy component of mathematical and scientific literacy.
International Journal of Science and Mathematics Education, 5(4), 559–589. https://doi.org/10.1007/
s10763-007-9089-4.

Affiliations

Hana Kang 1 & Leland Cogan 2


1 Graduate School of Education, University of California Riverside, Riverside, CA, USA
2 Center for the Study of Curriculum Policy, Michigan State University, East Lansing, MI, USA
