

Chapter 2
The Evolving State of the OECD and PISA

Abstract This chapter positions PISA for Schools in the context of the OECD’s
broader educational policy work and the Organisation’s evolution into the global
expert for education, albeit one with a decidedly economic underpinning. It pro-
vides a useful introduction to the empirical dimensions of my research, and the
following contextualisation of PISA for Schools shows how the OECD has become
an active education policy actor in its own right. This is significant if we understand
PISA for Schools as helping to constitute new spaces and relations of global
educational governance, in which the OECD’s already considerable policy ‘reach’
can be extended beyond national schooling systems to include the local schools that
participate in the assessment. First, I document a brief history of the OECD, and the
changing role and significance of education within the Organisation. I then discuss
the main PISA assessment, and how PISA’s success has enabled the OECD to exert
a more telling influence upon education discourses and policymaking processes.
I close the chapter by providing a comprehensive description of both the assessment
itself and the report received by participating schools.

Introduction

This chapter positions PISA for Schools in the context of the OECD’s broader
educational policy work and the Organisation’s evolution into the global expert, or
éminence grise (Rinne, Kallo, & Hokka, 2004), for education, albeit one with a
decidedly economic underpinning. Indeed, it is well worth pausing to consider that
an organisation with historical roots in the reconstruction of a devastated post-war
Europe has since become, through PISA and its associated policy work, arguably
the world’s leading ‘centre of calculation’ (Latour, 1987) for schooling perfor-
mance. This chapter provides a useful introduction to the empirical dimensions of
my research, and the following contextualisation of PISA for Schools shows how
the OECD has become an active education policy actor in its own right (see Henry,

© Springer Nature Singapore Pte Ltd. 2020 13


S. Lewis, PISA, Policy and the OECD,
https://doi.org/10.1007/978-981-15-8285-1_2
14 2 The Evolving State of the OECD and PISA

Lingard, Rizvi, & Taylor, 2001). This is significant if we understand PISA for
Schools as helping to constitute new spaces and relations of global educational
governance, in which the OECD’s already considerable policy ‘reach’ can be
extended beyond national schooling systems to include the local schools that par-
ticipate in the assessment.
The chapter begins by first documenting a brief history of the OECD, and the
changing role and significance of education within the Organisation more broadly.
I then discuss the development of the main PISA assessment, and how the success
of PISA has contributed to greater prominence for the Directorate for Education and
Skills within the OECD, enabling the Organisation to exert a more telling influence
upon global education discourses and policymaking processes. Given that PISA for
Schools forms the empirical basis for my study, this chapter closes by providing a
comprehensive description of both the assessment itself and the report received by
participating schools.

The OECD and Its Education Policy Work

Since transitioning in September 1961 from the Organisation for European Economic Cooperation (OEEC), a US Marshall Plan-funded intergovernmental
body that sought to facilitate economic reconstruction in post-war Europe, the
OECD has adopted many different forms and functions (Ydesen, 2019). Indeed, it
has been variously described as ‘a geographic entity, an organisational structure, a
policymaking forum, a network of policymakers, researchers and consultants, and a
sphere of influence’ (Henry et al., 2001, p. 7). Despite these shifting attributions, the
self-declared raison d’être of the OECD—formally an intergovernmental organi-
sation of the world’s most developed nation-states—has always retained a decid-
edly economic orientation, helping governments to ‘foster prosperity and fight
poverty through economic growth and financial stability’ (OECD, 2019a, np). For
instance, Article One of the OECD Convention, the foundational document signed
in 1960 to bring the Organisation into existence, stipulates that OECD members
must adopt government policies that promote economic strength and prosperity, in
order to
(a) Achieve the highest sustainable economic growth and employment and a rising
standard of living in Member countries, while maintaining financial stability,
and thus to contribute to the development of the world economy;
(b) Contribute to sound economic expansion in Member as well as non-member
countries in the process of economic development; and
(c) Contribute to the expansion of world trade on a multilateral, non-discriminatory
basis in accordance with international obligations (OECD, 2019b, np).
We can see here the OECD’s dominant economic focus and, in spite of more
contemporary developments, the total omission of any initial reference to education.

It is perhaps telling that the terms ‘school’ and ‘education’ (and their derivatives)
are absent from the OECD Convention, a document that establishes no educational
priorities beyond a cursory commitment for its signatories to promote science and
technology resources through ‘vocational training’ (OECD, 2019b).
Unlike the exclusively European membership of its OEEC predecessor, the OECD has instead sought a geographically varied, and more recently global, participation.1 Since 1961, when the U.S. and Canada were the sole non-European members, multiple non-European countries have acceded to the OECD, including Japan (1964), Australia (1971), New Zealand (1973), Mexico (1994), Korea (1996), Chile and Israel (2010) and, most recently, Colombia (2020). Thirty-seven nation-states presently comprise the OECD membership, with additional 'affiliated' or non-member economies.2 Moreover, a programme of 'enhanced engagement' has been sought with the emerging economies of Brazil, China, India, Indonesia and South Africa (OECD, 2013), effecting a more 'global' OECD to
better align with a ‘globalising’ world economy. Regardless of this recent spatial
and cultural diversity, prospective members are still required to demonstrate an
ideological stance that resonates with that promoted by the OECD, emphasising a
commitment to the free market economy, pluralist democratic institutions and the
guarantee of individual liberties (Henry et al., 2001). This was initially associated
with geopolitical tensions between ‘Western’ countries and the Eastern Bloc during
the Cold War, and the OECD in many respects was considered an ‘economic
equivalent of NATO’ (Woodward, 2009, p. 63), that is, a bulwark of democratic
and capitalist nations against possible encroachment by the communist ‘other’.
However, the existence of a post-Cold War OECD, especially in the context of
neoliberal globalisation, reveals a persisting collective ‘world view’ amongst
member states that is underscored by a belief in democratic government and the free
market, along with a commitment to human rights. This affinity has been described
as a mode of ‘cognitive governance’ (Woodward, 2009), whereby the OECD
engenders a ‘sense of identity and community amongst its members by engineering
and propagating a sense of values, perspectives, expectations and discourses about
their place, and that of the Organisation in the global polity’ (p. 63). One might
therefore consider the OECD as an increasingly diverse community of nations, yet
one that still shares a common belief in liberal democracy as the optimal means by
which to achieve economic and social prosperity.

1. The original membership of the OEEC in 1948 included Austria, Belgium, Denmark, France,
Greece, Iceland, Ireland, Italy, Luxembourg, the Netherlands, Norway, Portugal, Sweden,
Switzerland, Turkey, the United Kingdom and Western Germany (separated into the joint
Anglo-American occupation zone and the French occupation zone).
2. As of June 2020, the 37 OECD members include Australia, Austria, Belgium, Canada, Chile,
Colombia, the Czech Republic, Denmark, Estonia, Finland, France, Germany, Greece, Hungary,
Iceland, Ireland, Israel, Italy, Korea, Japan, Latvia, Lithuania, Luxembourg, Mexico, the
Netherlands, New Zealand, Norway, Poland, Portugal, the Slovak Republic, Slovenia, Spain,
Sweden, Switzerland, Turkey, the UK and the U.S. It is also interesting to note here the OECD’s
interchangeable use of the terms ‘nations’ and ‘economies’ when describing political units, which
further suggests the Organisation’s strong economic leaning.

Despite its ostensible economic focus, the OECD has also embraced education as
a significant contributor to national, and indeed global, social and economic
development. However, education has only more recently emerged as having a
clearly defined location and purpose within the OECD, and it was originally only
accorded an ‘inferred role’ (Papadopoulos, 1994) that derived from the ‘human
capital’ linking of educational investment and economic productivity. The estab-
lishment of the Centre for Research and Innovation (CERI) in 1968 witnessed the
first discrete unit for education within the OECD, followed by the reconstitution of
the Committee for Scientific and Technical Personnel as the Education Committee
in 1970. Such developments reflected a growing awareness that education could
significantly influence economic growth and that education policy was, in turn, a
legitimate policy focus for the OECD, albeit one with a decidedly economistic
inflection. Although CERI and the Education Committee were initially positioned
within the Directorate for Scientific Affairs, responsibility for education was
transferred to the Directorate for Social Affairs, Manpower and Education in 1975,
implying an enhanced concern for broader social issues in relation to education.
Further structural and philosophical changes saw the unit for education renamed
the Directorate for Education, Employment, Labour and Social Affairs (DEELSA).
The constituent bodies of DEELSA sought to achieve concurrent but nonetheless
distinct aims, with the Education Committee conducting thematic reviews and
analyses of national education systems, while CERI was responsible for under-
taking research over a broad range of educational issues. A separate Directorate for Education was only established in 2002, which acknowledged that education was
now markedly more important to the policy agendas of the OECD and its member
nations. In 2012, the Directorate was renamed the Directorate for Education and
Skills in the context of launching a new cross-committee organisational strategy,
the OECD Skills Strategy (OECD, 2012a). This pivot represented a new way of
working across policy areas in which education played a central role, and was an
attempt to ensure coherence across otherwise disparate policy domains.
Significantly, the Skills Strategy’s explicit focus on enhancing workforce partici-
pation, and improving one’s readiness (at the level of government and the indi-
vidual) to respond to the vagaries of a globalised labour market, arguably gestures
towards a particularly economistic understanding of education policy, or what has
been termed the ‘economisation’ of education (Sellar & Lingard, 2014; Spring,
2015). However, this ‘new’ strategy drew on a much longer lineage of the OECD
perceiving education as the means to enhance economic productivity; indeed, the
very first OECD conference on education, held in Washington, D.C. in October
1961 and entitled ‘Economic Growth and Investment in Education’, speaks pre-
cisely to this point.
This economistic understanding of education has since had significant impli-
cations for the types of schooling systems required by national governments, in
which the traditional purposes of education—namely, fostering intellectual, social
and cultural development—are downplayed in favour of increasing productivity
and economic growth. Noting that ‘[g]overnments will need more stress on
upgrading human capital through promoting access to a range of skills, and
especially the capacity to learn’ (OECD, 1996, p. 7), the OECD has since posi-
tioned education as a key economic policy lever that can help determine national
success in the global ‘knowledge economy’ (see Kenway, Bullen, Fahey, & Robb,
2006; OECD, 1996). This is revealed in the Organisation’s work on the knowledge
economy and lifelong learning (Carroll & Kellow, 2011; Martens & Jakobi, 2010),
and via its proclamation that ‘skills have become the global currency of 21st-century
economies’ (OECD, 2012a, p. 10). Lingard and Sellar (2013), and Green (1999),
also argue that the growing importance of education to national economic agendas
coincides, somewhat paradoxically, with a weakening sovereignty over one’s own
economy, so that education is increasingly perceived as one of the few remaining
economic interventions available to individual countries.
Such a linking of education with economic outcomes, and a presumed absence
of ‘valid’ cross-national educational data (McGaw, 2008), led OECD members, and
particularly the U.S. in the wake of A Nation At Risk (The National Commission on
Excellence in Education, 1983), to increasingly call for objective or scientific
measures of their national schooling systems. Although preceded by earlier initiatives, such as the Indicators of Education Systems (INES) and its publication as the annual Education at a Glance report, a watershed moment came with the
OECD’s creation of PISA, which enabled the direct measurement of national stu-
dent performance within a framework of international comparison. First adminis-
tered in 2000, and then every 3 years thereafter, PISA focuses on the abilities of
15-year-olds, the age at which students notionally complete their final year of
compulsory schooling, across the domains of reading, mathematics and science,
thereby serving as a proxy marker for schooling system ‘effectiveness’ and the
production of ‘human capital’. PISA testing instruments are purposely designed to
avoid any alignment with national curricula, concentrating instead on competencies
that reflect the ‘important knowledge and skills needed in adult life’ (OECD, 1999,
p. 8). The rationale for such a framework is twofold: first, to emphasise the
‘real-life’ application of the specific knowledge and general skills acquired in
formal schooling; and second, so that the resulting data can be compared across
participating nations, regardless of differences in national school curricula. More
recently, PISA has also included optional assessments of ‘applied skills’, including
creative problem-solving (2012), financial literacy (2012, 2015 and 2018) and
collaborative problem-solving (2015), which demonstrates the continuing evolution
of the PISA ‘product’ (see also Lewis, 2019, 2020).
It should be noted here that PISA is not the only, nor even the first, international
assessment to evaluate and compare national schooling system performance.
Notable and long-standing alternatives include instruments designed by the
International Association for the Evaluation of Educational Achievement (IEA), a
Netherlands-based international organisation that has conducted numerous inter-
nationally comparative assessments of student ability since 1960. The IEA imple-
mented the Trends in International Mathematics and Science Study (TIMSS) in
1995 and the Progress in International Reading Literacy Study (PIRLS) in 2001,
with the most recent iterations of these assessments occurring via TIMSS 2019 and
PIRLS 2016. These initiatives have been accompanied by other ongoing IEA
surveys, including the International Civic and Citizenship Education Study (ICCS),
the International Computer and Information Literacy Study (ICILS) and the Early
Childhood Education Study (ECES). These IEA assessments have contributed,
along with PISA and related developments, to making the globe a commensurate
space of measurement of education performance. However, I would suggest that the
IEA is perhaps more careful than the OECD in terms of the policy inferences to be drawn from its performance data, insofar as the IEA was formed by academics
interested in comparative national education performance instead of the explicit
policy learning focus of the OECD. Moreover, IEA data are based on assessments
conducted at the grade level (e.g., TIMSS measures student performance at grades 4
and 8), rather than PISA’s focus on age (i.e., 15-year-olds), which raises significant
questions around the competing objectives and methods of the two approaches (for
instance, see Prais, 2003, 2004).
Since its inception, PISA has gone from strength to strength over the course of
seven triennial surveys (2000–2018); some 79 countries and economies were sur-
veyed in PISA 2018, with fewer OECD members (36) than non-members
(43) participating, reflecting the expanding scope, scale and explanatory power of
PISA assessments and data (Sellar & Lingard, 2014). PISA has also been successful
in gaining extensive, if admittedly varied, global media coverage (see Andrews
et al., 2015; Baroutsis & Lingard, 2017; Grey & Morris, 2018; Waldow, Takayama,
& Sung, 2014). This capacity to influence national educational discourses has, in
turn, helped ensure PISA’s prominence as a source of ‘objective’ evidence in
policymaking processes globally (Fischman, Topper, Silova, Goebel, & Holloway,
2019; Lewis & Hogan, 2019; Lewis & Lingard, 2015; Rautalin, Alasuutari, &
Vento, 2019), even if participants in such discussions are potentially limited by the
discursive constraints of ‘seeing like PISA’ (Gorur, 2016). Importantly, the success
of PISA has seen it become a prototype for the OECD’s development of a range of
related educational testing initiatives. These include assessments of system-level
performance, such as PISA for Development, the Assessment of Higher Education
Learning Outcomes (AHELO), the International Early Learning and Child
Well-being Study (IELS), and the Programme for International Assessment of
Adult Competencies (PIAAC), as well as more recent iterations that address school-level (PISA for Schools) and teacher-level (PISA4U) performance and practice.
Together with the more holistic competence framework under the OECD 2030
Learning Framework initiative (OECD, 2020), with its focus on developing student
outcomes that are not currently measured by test instruments (e.g., exercising
agency, taking responsibility and showing empathy), this broad family of PISA
products arguably constitutes the central elements of the OECD’s global policy
ensemble. Through this long, if admittedly contingent, trajectory, we can see the
clear evolution of the OECD and its education work, to the point where the Organisation has
become, arguably, the world’s leading ‘centre of calculation’ (Latour, 1987) of
comparative schooling performance and policy expertise.

PISA for Schools: The Assessment

One of the programmes that best exemplifies the evolution of PISA is the OECD’s
PISA for Schools.3 This instrument is similar in format and design to the main
PISA, comprising a two-hour written test that assesses students’ ability to apply
their acquired knowledge in reading, mathematics and science to ‘real-world’ sit-
uations. Unlike the triennial PISA test undertaken by national and subnational
schooling systems, PISA for Schools is conducted on-demand by individual schools
(up to a maximum of once per year) to assess their performance and compare
themselves against schooling systems assessed by the main PISA.4 Furthermore,
schools volunteer (and pay) to participate in the PISA for Schools assessment,
whereas the relevant national (or subnational) educational authorities may mandate
a school’s inclusion in the national sample for the main PISA. In addition to
assessing student performance, the test contains student and principal question-
naires to generate contextual information about particular ‘in-school’ and
‘out-of-school’ factors that influence student learning. These are construed in terms
of the student population, such as the socio-economic background of students,
parental occupations, and student attitudes towards their learning of reading,
mathematics and science; and the school environment, including school funding
and resourcing, student enrolment, school type (e.g., public, private, Charter) and
the organisation of school governance structures.
When conducting PISA for Schools, eligible 15-year-old students at each participating school are randomly sampled to obtain an ideal testing cohort of between 45 and 85 students, although the test can be implemented in smaller schools with as few as 35 students if necessary (OECD, 2017, p. 30).5

3. Until recently, the school-level test was officially referred to in OECD publications and websites as the 'PISA-based Test for Schools' or, alternatively in the U.S., 'OECD Test for Schools (based on PISA)'. Arguably, the recent 2018 nomenclature shift to PISA for Schools aligns the school-level test to the naming pattern of other PISA-derived OECD programmes: PISA for [insert name]. These include PISA for Development (a system-level PISA for developing economies) and PISA4U (an online professional development and credentialing programme for individual teachers). We can also observe informal references within (and outside) the OECD to its Programme for the International Assessment of Adult Competencies (PIAAC) as 'PISA for Adults', and the International Early Learning and Child Well-being Study (IELS) as 'PISA for Five-Year-Olds'. This naming pattern also suggests the OECD fully apprehends the power of PISA as a brand, and the presumed benefits for new products that invoke the long-established and globally recognised moniker.

4. For instance, a US school that participated in PISA for Schools in 2018 would have its performance benchmarked against 16 schooling systems: Australia, Brazil, B-S-J-G [Beijing, Shanghai, Jiangsu, Guangdong] (China), Canada, Finland, France, Germany, Hong Kong (China), Ireland, Japan, Korea, the Netherlands, Russia, Singapore, the United Kingdom and the United States.

5. For the purpose of PISA for Schools, '15-year-old students' are considered to be those aged from 15 years and 3 completed months to 16 years and 2 completed months at the time of the assessment being administered, with a maximum permissible variance of 1 month (OECD, 2017, p. 28).

All schools within a given national or subnational jurisdiction are eligible to undertake the PISA for Schools
test, provided that they meet the minimum sampling requirements in terms of
student population size. The pool of eligible students is further stratified by gender
(male, female) and school grade to ensure an adequately representative sample is
included in PISA for Schools testing. However, individual schools are permitted to
conduct within-school exclusions of certain students at their discretion, including
students who have ‘a mental or emotional disability’; ‘functionally disabled stu-
dents’ who are ‘permanently physically disabled’; and ‘students with insufficient
assessment language experience' (OECD, 2017, p. 29). PISA for Schools was initially administered as a pencil-and-paper test; in June 2015, the OECD issued a call for tender inviting proposals from prospective accredited providers to develop an online version of the assessment, which followed similar moves to develop an ICT delivery platform for the main PISA.
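The sampling logic described above, drawing a random cohort of 45 to 85 students from the eligible pool, stratified by gender and school grade, can be illustrated with a short sketch. This is not the OECD's operational procedure: the proportional-allocation rule, the roster fields and the `stratified_sample` helper are all assumptions introduced here for illustration only.

```python
import random
from collections import defaultdict

def stratified_sample(students, target=85, seed=None):
    """Illustrative only: draw a proportionally allocated random sample of
    eligible 15-year-old students, stratified by gender and school grade."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for s in students:
        strata[(s["gender"], s["grade"])].append(s)

    n_eligible = len(students)
    sample = []
    for members in strata.values():
        # Proportional allocation; every stratum contributes at least one student.
        k = min(len(members), max(1, round(target * len(members) / n_eligible)))
        sample.extend(rng.sample(members, k))
    return sample[:target]

# Hypothetical roster of 200 eligible students at one school.
roster = [{"id": i, "gender": "F" if i % 2 else "M", "grade": 9 + i % 3}
          for i in range(200)]
cohort = stratified_sample(roster, seed=7)
assert 35 <= len(cohort) <= 85  # within the PISA for Schools cohort bounds
```

The allocation rounds each stratum's share of the target cohort, so both genders and all grades present in the roster are represented in the sample.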
Development of the programme began in 2010, with schools and districts invited
by the OECD in late 2011 to participate in a pilot study. This was designed to equate
the new school-based test with the main PISA so that direct comparisons could be
made between school (PISA for Schools) and schooling system (main PISA) per-
formance. PISA for Schools test items were developed according to the relevant
PISA assessment frameworks for reading, mathematics and science (see OECD,
2013), and equated to the existing PISA scales (Level 1–Level 6) by simultaneously
anchoring them with the main PISA ‘link items’ against a common PISA metric.6
This process enables PISA for Schools scores for reading, mathematics and science
to be reported against the established PISA proficiency scales, and against the
performance of schooling systems as measured by the main PISA. The pilot was
conducted from May to October of 2012, including a total of 126 secondary schools
across the U.S. (105 schools), the UK (18) and the Canadian province of Manitoba
(3), with a later Spanish pilot of 225 schools conducted during 2013. Following
a successful field trial, PISA for Schools was officially launched in the US in April
2013, and made available to all eligible schools and districts throughout the country.
Since this time, PISA for Schools has experienced a significant expansion in terms of
its availability and administration. As of December 2020, PISA for Schools is
available in 12 languages across 14 countries, and it has been cumulatively
administered in more than 2200 schools globally (OECD, 2018b).7

6. The three domains of reading, mathematics and science are assessed in the main PISA and PISA
for Schools via an ascending six-level PISA proficiency scale (Level 1–Level 6), with Level 2
considered to be equivalent to a baseline level of student proficiency in the given subject, whereas
students at Levels 5 and 6 are notionally ‘top performers’ when compared with their global peers.
Given the equating between PISA and PISA for Schools, these PISA proficiency levels and scores
putatively provide a common framework for comparing student performance at the local (school)
and international (schooling system) level.
7. PISA for Schools is now available in the following jurisdictions: Andorra, Brazil, Brunei
Darussalam, China (PRC), Colombia, Japan, Pakistan, Portugal, Russia, Spain, Thailand, the
United Arab Emirates, the U.K. and the U.S. It is deliverable in the following languages: Arabic,
Basque, Catalan, English, Galician, Japanese, Mandarin Chinese, Portuguese, Russian, Spanish,
Thai and Welsh.

Table 2.1 PISA for Schools test items by subject and response type

Subject       Total items  Constructed response  Constructed response  Complex          Simple           Items not
                           (expert)              (manual)              multiple-choice  multiple-choice  scored
Reading       47           17                    4                     7                18               1
Mathematics   40           7                     19                    3                11               0
Science       54           20                    0                     16               18               0
Total         141          44                    23                    26               47               1

Source: OECD (2017, p. 15)
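The composition of the item pool in Table 2.1 can be checked arithmetically: each subject's response-type counts sum to its item total, and the subject totals sum to the 141-item pool. A few lines of Python, using only the figures in the table, confirm this:

```python
# Item counts from Table 2.1 (OECD, 2017, p. 15), ordered as: constructed
# response (expert), constructed response (manual), complex multiple-choice,
# simple multiple-choice, items not scored.
ITEM_POOL = {
    "Reading":     (17, 4, 7, 18, 1),
    "Mathematics": (7, 19, 3, 11, 0),
    "Science":     (20, 0, 16, 18, 0),
}

subject_totals = {subject: sum(counts) for subject, counts in ITEM_POOL.items()}
assert subject_totals == {"Reading": 47, "Mathematics": 40, "Science": 54}
assert sum(subject_totals.values()) == 141  # the full PISA for Schools item pool
```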

The OECD’s (2017) Technical Report reveals how the PISA for Schools
reading, mathematics and science items are arranged in the question booklets
received by students. The question booklets draw upon a pool of 141 possible test
items (47 reading items, 40 mathematics items and 54 science items) developed by
the Australian Council for Educational Research (ACER), with items being of
either a ‘selected response’ (e.g., simple/complex multiple-choice questions) or
‘constructed response’ (e.g., manual/expert short response) format (see Table 2.1).
These 141 test items are then formed into different units aligned around a
common theme or stimulus (e.g., text, graphs, tables and diagrams), with between
one and five test items assigned per unit. In total, 13 units of reading assessment, 25
units of mathematics assessment and 25 units of science assessment comprise the
PISA for Schools item pool. Finally, these units (and their items) are divided into
seven possible item clusters—two each of solely reading (R1, R2), mathematics
(M1, M2) and science (S1, S2) items, and one cluster combining items from all three subjects (RSM)—that are arranged into seven different booklet configurations (see Table 2.2).
Each of the three-cluster test booklets thus has a unique combination of the three
subject clusters (reading, mathematics, science) and test items, with these seven
possible booklets randomly assigned to the students selected as part of the school
sample. This matrix sampling technique means that all of the 141 PISA for Schools assessment items are completed across the breadth of the 85-student sample, although with each student answering far fewer questions in significantly less time (i.e., 2 h of testing rather than the 490 min needed to complete every item).

Table 2.2 Possible combinations of subject clusters in the PISA for Schools test booklets

Test booklet ID  Cluster one  Cluster two  Cluster three
1                R1           RSM          M1
2                RSM          M2           S2
3                M2           M1           R2
4                M1           S2           S1
5                S2           R2           R1
6                R2           S1           RSM
7                S1           R1           M2

Source: OECD (2017, p. 13)
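The booklet rotation in Table 2.2 can be sketched in a few lines of Python. This is purely illustrative, since the OECD's actual assignment procedure is not published as code, but it shows the matrix-sampling property the text describes: each booklet carries only three of the seven clusters, yet a school sample collectively covers every cluster (and hence all 141 items). The `assign_booklets` helper and its rotation rule are assumptions for the sketch.

```python
import random

# The seven three-cluster booklets from Table 2.2 (OECD, 2017, p. 13).
BOOKLETS = {
    1: ("R1", "RSM", "M1"),
    2: ("RSM", "M2", "S2"),
    3: ("M2", "M1", "R2"),
    4: ("M1", "S2", "S1"),
    5: ("S2", "R2", "R1"),
    6: ("R2", "S1", "RSM"),
    7: ("S1", "R1", "M2"),
}

def assign_booklets(n_students, seed=None):
    """Illustrative booklet assignment: cycle through a shuffled booklet
    order so the seven booklets spread evenly across the sampled students."""
    rng = random.Random(seed)
    order = list(BOOKLETS)
    rng.shuffle(order)
    return [order[i % len(order)] for i in range(n_students)]

assignments = assign_booklets(85, seed=1)

# Each cluster appears in three different booklets, so the sample as a
# whole covers the entire seven-cluster (141-item) pool:
covered = {cluster for b in assignments for cluster in BOOKLETS[b]}
assert covered == {"R1", "R2", "M1", "M2", "S1", "S2", "RSM"}
```

With 85 students and seven booklets, each booklet is sat by 12 or 13 students, which is the balance that lets every item be answered while each individual student faces only a 2-hour subset.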

PISA for Schools: The School Report

All schools participating in PISA for Schools receive a 160-page report from the
national accredited provider containing analyses of their students’ performance and
contextual data, 17 examples of best practices from high-performing international
schooling systems (e.g., Shanghai-China, Finland, Singapore), and excerpts from
the OECD’s broader educational policy research. The inclusion of these policies,
practices and evidence is notionally to ‘encourage school staff and local educators
to look beyond their classrooms in search of national and global excellence’
(OECD, 2012b, p. 4), and to help them identify ‘what works’ (OECD, 2013, p. 5).
Regardless of the country in which PISA for Schools is implemented, the OECD
retains the final authority over the structure and layout of the school reports, and the
national accredited provider must agree to follow the authorised report template.
However, and aside from graphs and tables representing a school’s specific data
around student performance or local contextual factors, the remainder of the
160-page report is otherwise entirely identical for all participating schools within
the same national jurisdiction (e.g., the U.S.).
For instance, the 17 prominent ‘breakout boxes’ (i.e., physically distinct and
highlighted sections of text within the PISA for Schools report) of best practice
within the report, as well as the excerpts from other OECD research publications,
are identical for all US schools, and there are no modifications to the report contents
to acknowledge a school’s specific context (e.g., whether a school is high/low
performing on PISA for Schools). Although the issue of ‘best practice’ will be
addressed more substantively in Chap. 6, it is significant to note here that the same
17 best practice ‘breakout boxes’ are present in the reports received by participating
schools in the U.S. and the UK (see Lewis, 2017a). This arguably promotes the
logic that all schools—irrespective of their local context, including whether they are
in the U.S. or the UK—require the same OECD policy lessons, simply by virtue of
their having participated in PISA for Schools. Although the OECD has more
recently sought to discourage teachers from identifying and importing ‘prefabri-
cated solutions’,8 I would still argue that the very inclusion of these practices,
let alone the encouragement to compare oneself against high-performing systems
via PISA for Schools, incentivises schools to look around for, and presumably
borrow, examples of effective policies.

8
For instance, a statement by Andreas Schleicher of the OECD in a 2018 brochure advertising
PISA for Schools notes that ‘this is not about copying prefabricated solutions from other places; it
is about looking seriously and dispassionately at good practice in our own environment and
elsewhere to become more knowledgeable about what works and in which contexts’ (OECD,
2018a, p. 1; emphasis added).
PISA for Schools: The School Report 23

Moreover, any adaptations to the school report template, including translation
into languages other than English or the modification of tables or figures, must first
be submitted by the national accredited provider to the OECD for approval before
being released to participating schools. This form of oversight and control by the
OECD reflects a significant development in its ‘ways of working’, given that PISA
for Schools is the result of collaborations between the OECD and its various partner
organisations, including philanthropic foundations, edu-businesses and various
not-for-profit agencies. However, while such networked, or ‘heterarchical’ (Ball,
2012; Ball & Junemann, 2012) modes of policy-making position the OECD as but
one node in a transnational policy network, the Organisation still arguably retains
the dominant steering role, and ultimate control of message, amongst its partners
(Lewis, 2017b). This is in terms of the OECD determining both how PISA for
Schools is administered, and, perhaps more importantly in terms of governance
implications, how schooling and performance are discursively constructed within
the assessment and report.
As such, all school reports in all countries and economies must adhere to the
same general report structure:
• Summary of your school’s results
• Reader’s guide
• Section 1: Introduction: Understanding your school’s results
• Section 2: What students at your school know and can do in reading, mathematics and science
• Section 3: The learning environment and student engagement at your school
• Section 4: Your school’s results in an international context
• Section 5: Excellence and equity at your school (OECD, 2017, p. 93).
The first section of the report—Summary of your school’s results—provides an
overview of local student performance in reading, mathematics and science as
measured by PISA for Schools, with these data represented in a variety of ways.
These include the school’s mean score (with standard error) for each subject; the
distribution of the school’s students across the six PISA proficiency levels, divided
into ‘below baseline level’ (Level 1 and below), ‘intermediate levels’ (Levels 2–4)
and ‘top levels’ (Levels 5 and 6); and a comparison of the school’s mean perfor-
mance score for reading, mathematics and science against the performance of
different international schooling systems on PISA 2012. For instance, a US high
school currently participating in PISA for Schools is compared against all of the
schools in the US national sample, as well as against 11 other schooling systems
assessed in PISA 2012. Importantly, all of these comparative schooling systems are
designated as ‘strong performers’ or ‘successful reformers’ (OECD, 2012b) on the
basis of their performance on the main PISA test.
These individual school-level results are further elaborated in subsequent sections of the PISA for Schools report to provide detailed comparisons of student
performance, and student and school contextual factors (e.g., socio-economic
background, student attitudes to learning), against similar measurements made of
national schooling systems on the main PISA. The comparisons with international
schooling systems contained within the PISA for Schools report have been
notionally included to prompt educators and school leaders to initiate local school
improvement agendas. Given that so many of the participating US schools to date
are within affluent communities and are themselves (relatively) high performing, a
key motivation to undertake PISA for Schools has arguably also been to emphasise
these schools’ high levels of student (and hence school) performance. While these
motivations and school-level responses are admittedly not my substantive focus
here, it is still worth acknowledging the diverse motivations for schools to engage
with PISA for Schools, ranging from being more reform-oriented and educative to,
arguably, the more normative promotional intentions of some high-performing
schools and districts.
Despite the main PISA clearly informing the development of PISA for Schools,
the two assessments still retain significant distinctions. Table 2.3 summarises the
main similarities and differences between system and school-based PISA. Funding
for the main PISA is from the contributions of participating member (and
non-member) nations via the Part II OECD budget, while the development of PISA
for Schools was exclusively paid for by US philanthropic foundations (also see
Chap. 4). Furthermore, PISA for Schools participants must pay a fee to nationally
accredited organisations (national service providers, or NSPs) to administer the test,
analyse the data and produce the school report. Although the main PISA is conducted every three years, PISA for Schools can instead be administered annually, at
the discretion of individual schools and districts, so long as this does not
interfere with their participation in PISA as part of a national sample. Although the
OECD organises a globally choreographed release of performance data and rank-
ings for the main PISA, known as ‘PISA Day’ in the U.S., PISA for Schools
participants make their own decisions relating to the release (or otherwise) of their
performance data. Moreover, the data generated by the main PISA are intended to inform
public discussions and debates around national schooling performance, whereas
PISA for Schools is much more locally focused, with school-level data
largely intended to drive school-level reflection and reform processes. This means,
in theory at least, that no school-level performance ‘league tables’ can be con-
structed and publicised from the assessment, which implies that PISA for Schools is
less likely to be used to publicly shame schools and hold them to account.
Table 2.3 Comparing the OECD’s main PISA and PISA for Schools
• Modes of accountability. Main PISA: ‘top-down’; possible punitive policy effects for
‘low-performing’ national and subnational schooling systems. PISA for Schools:
‘bottom-up’; potential reward form of accountability for ‘high-performing’ schools
and districts.
• Sample. Main PISA: national (and subnational) representative sample, randomly
selected from suitable national schools. PISA for Schools: school-level representative
sample; 85 students will generally be the sample size for each participating school,
and no fewer than 45 students in the case of smaller schools.
• Funding. Main PISA: OECD Part II funding, via voluntary payments from
participating nations and economies. PISA for Schools: scoping study and pilot
funded by US philanthropic foundations; ongoing administration on a ‘user pays’
basis (schools and systems), at $5000 (US) per school in provider payment;
philanthropic funding to develop and replenish test instruments, and to subsidise
participation.
• Purposes. Main PISA: national comparison; national policy decision-making; ‘taking
the national temperature’; externalisation. PISA for Schools: school-level comparison;
implementation-level/local (school/district) policy decision-making and reform;
‘taking the (domestic) school-level temperature’; establishing global standards and
reputation via school participation; externalisation and indigenisation.
• Analysis and reporting. Main PISA: OECD Secretariat and contracted international
consortia. PISA for Schools: providers accredited by the OECD, mostly private
for-profit or not-for-profit organisations.
• Putative usage. Main PISA: public; national policy decision-making; some
subnational comparisons within nations (e.g., Australia, Canada). PISA for Schools:
private; school-level policy decision-making.
• Performance comparisons. Main PISA: nation-to-nation and subnational. PISA for
Schools: school-to-school within district, state and nation; school-to-national and
subnational (domestic and international); future possibility of school-to-school
(international) comparisons.
• Frequency. Main PISA: every three years, with scheduling determined by the OECD.
PISA for Schools: as desired by school and district users, provided there is no direct
overlap or interference with the timing of the main PISA in their country or economy.
• Release of test data and reports. Main PISA: coordinated global release determined
by the OECD, with public release of data (e.g., ‘PISA Day’ in the U.S.). PISA for
Schools: as determined by the user, although the conditions of some philanthropic
funding to subsidise participation mandate the public release of school data and
reports.
• Tests and contextual questionnaires. Made equivalent for comparative purposes, in
terms of the PISA scales and proficiency levels and the contextual questionnaires
(student and principal surveys). While the main PISA focuses on a different ‘major’
domain every three years, PISA for Schools represents all three domains (reading,
math, science).
Conclusion

This chapter has provided an overview of the OECD, and especially how it has
more recently evolved to become a ‘global expert’ on matters of education policy
and comparative assessment. It has documented the origins of PISA, its dominant
positioning within education discourses and policymaking globally, and how PISA
has become, in effect, a prototype for the development of other PISA-related
products, including PISA for Schools. This has enabled the continuing evolution of
PISA and the ‘PISA brand’, and the expansion of the OECD into new audiences
and markets, both within and (importantly) beyond its traditional clientele of
member states. Such developments have helped position the OECD as a dominant
education policy actor in its own right, with PISA for Schools extending the
OECD’s already considerable policy ‘reach’ to new school-level spaces and actors.

References

Andrews, P., Atkinson, L., Ball, S., Barber, M., Becket, L., Beradi, J. … Zhao, Y. (2014, May 6).
OECD and PISA tests are damaging education worldwide – academics. The Guardian.
Retrieved from http://www.theguardian.com/education/2014/may/06/oecd-pisa-tests-
damaging-education-academics. Access date: 19/01/2020.
Ball, S. (2012). Global education inc.: New policy networks and the neo-liberal imaginary. New
York: Routledge.
Ball, S., & Junemann, C. (2012). Networks, new governance and education. Bristol: Policy Press.
Baroutsis, A., & Lingard, B. (2017). Counting and comparing school performance: An analysis of
media coverage of PISA in Australia, 2000-2014. Journal of Education Policy, 32(4), 432–
449. https://doi.org/10.1080/02680939.2016.1252856.
Carroll, P., & Kellow, A. J. (2011). The OECD: A study of organisational adaptation.
Cheltenham: Edward Elgar.
Fischman, G. E., Topper, A. M., Silova, I., Goebel, J., & Holloway, J. (2019). Examining the
influence of international large-scale assessments on national education policies. Journal of
Education Policy, 34(4), 470–499. https://doi.org/10.1080/02680939.2018.1460493.
Gorur, R. (2016). Seeing like PISA: A cautionary tale about the performativity of international
assessments. European Educational Research Journal, 15(5), 598–616. https://doi.org/10.
1177/1474904116658299.
Green, A. (1999). Education and globalisation in Europe and East Asia: Convergent and divergent
trends. Journal of Education Policy, 14(1), 55–71. https://doi.org/10.1080/026809399286495.
Grey, S., & Morris, P. (2018). PISA: Multiple ‘truths’ and mediatised global governance.
Comparative Education, 54(2), 109–131. https://doi.org/10.1080/03050068.2018.1425243.
Henry, M., Lingard, B., Rizvi, F., & Taylor, S. (2001). The OECD, globalisation and education
policy. Oxford: IAU Press.
Kenway, J., Bullen, E., Fahey, J., & Robb, S. (2006). Haunting the knowledge economy. Oxon:
Routledge.
Latour, B. (1987). Science in action: How to follow scientists and engineers through society.
Cambridge, MA: Harvard University Press.
Lewis, S. (2017a). Governing schooling through ‘what works’: The OECD’s PISA for Schools.
Journal of Education Policy, 32(3), 281–302. https://doi.org/10.1080/02680939.2016.1252855.
Lewis, S. (2017b). Policy, philanthropy and profit: The OECD’s PISA for Schools and new modes
of heterarchical educational governance. Comparative Education, 53(4), 518–537. https://doi.
org/10.1080/03050068.2017.1327246.
Lewis, S. (2019). Historicising new spaces and relations of the OECD’s global educational
governance: PISA for Schools and PISA4U. In C. Ydesen (Ed.), The OECD’s historical rise in
education: The formation of a global governing complex (pp. 269–289). Cham: Palgrave
Macmillan. https://doi.org/10.1007/978-3-030-33799-5_13.
Lewis, S. (2020). Providing a platform for ‘what works’: Platform-based governance and the
reshaping of teacher learning through the OECD’s PISA4U. Comparative Education, 1–19.
https://doi.org/10.1080/03050068.2020.1769926.
Lewis, S., & Hogan, A. (2019). Reform first and ask questions later? The implications of (fast)
schooling policy and ‘silver bullet’ solutions. Critical Studies in Education, 60(1), 1–18.
https://doi.org/10.1080/17508487.2016.1219961.
Lewis, S., & Lingard, B. (2015). The multiple effects of international large-scale assessment on
education policy and research. Discourse: Studies in the Cultural Politics of Education, 36(5),
621–637. https://doi.org/10.1080/01596306.2015.1039765.
Lingard, B., & Sellar, S. (2013). Globalisation, edu-business and network governance: The policy
sociology of Stephen J. Ball and rethinking education policy analysis. London Review of
Education, 11(3), 265–280. https://doi.org/10.1080/14748460.2013.840986.
Martens, K., & Jakobi, A. (2010). Expanding and intensifying governance: The OECD in
education policy. In K. Martens & A. Jakobi (Eds.), Mechanisms of OECD governance:
International incentives for national policy-making? (pp. 163–179). Oxford: Oxford University
Press.
McGaw, B. (2008). The role of the OECD in international comparative studies of achievement.
Assessment in Education: Principles, Policy & Practice, 15(3), 223–243. https://doi.org/10.
1080/09695940802417384.
OECD. (1996). The knowledge-based economy. Paris: OECD Publishing.
OECD. (1999). Measuring student knowledge and skills: A new framework for assessment. Paris:
OECD Publishing.
OECD. (2012a). Better skills, better jobs, better lives: A strategic approach to skills policies. Paris:
OECD Publishing.
OECD. (2012b). How your school compares internationally: OECD Test for Schools (based on
PISA) pilot trial [US version]. Paris: OECD Publishing.
OECD. (2013). The OECD Test for Schools (based on PISA): Questions and answers (US
version). Paris: OECD Publishing.
OECD. (2017). PISA-based Test for Schools: Technical report 2016. Paris: OECD Publishing.
OECD. (2018a). PISA for Schools brochure. Retrieved from http://www.oecd.org/pisa/pisa-for-
schools/PISA%20for%20Schools%20Brochure.pdf. Access date: 19/01/2020.
OECD. (2018b). PISA-based Test for Schools: FAQs. Retrieved from http://www.oecd.org/pisa/
aboutpisa/pisa-based-test-for-schools-faq.htm. Access date: 19/01/2020.
OECD. (2019a). 7th OECD roundtable of mayors and ministers. Retrieved from https://www.oecd.
org/urban/roundtable/partners/. Access date: 19/01/2020.
OECD. (2019b). Convention on the Organisation for Economic Co-operation and Development.
Retrieved from http://www.oecd.org/general/conventionontheorganisationforeconomicco-
operationanddevelopment.htm. Access date: 19/01/2020.
OECD. (2020). OECD Future of Education and Skills 2030. Retrieved from http://www.oecd.org/
education/2030-project/teaching-and-learning/learning/faq/. Access date: 04/06/2020.
Papadopoulos, G. (1994). Education 1960-1990: The OECD perspective. Paris: OECD Publishing.
Prais, S. J. (2003). Cautions on OECD’S recent educational survey (PISA). Oxford Review of
Education, 29(2), 139–163. https://doi.org/10.1080/0305498032000080657.
Prais, S. J. (2004). Cautions on OECD’s recent educational survey (PISA): Rejoinder to OECD’s
response. Oxford Review of Education, 30(4), 569–573. https://doi.org/10.1080/
0305498042000303017.
Rautalin, M., Alasuutari, P., & Vento, E. (2019). Globalisation of education policies: Does PISA
have an effect? Journal of Education Policy, 34(4), 500–522. https://doi.org/10.1080/
02680939.2018.1462890.
Rinne, R., Kallo, J., & Hokka, S. (2004). Too eager to comply? OECD education policy and the
Finnish response. European Educational Research Journal, 3(2), 454–485. https://doi.org/10.
2304/eerj.2004.3.2.3.
Sellar, S., & Lingard, B. (2014). The OECD and the expansion of PISA: New global modes of
governance in education. British Educational Research Journal, 40(6), 917–936. https://doi.
org/10.1002/berj.3120.
Spring, J. (2015). Economisation of education: Human capital, global corporations, skills-based
schooling. New York: Routledge.
The National Commission on Excellence in Education. (1983). A nation at risk: The imperative for
educational reform. Retrieved from https://www2.ed.gov/pubs/NatAtRisk/risk.html. Access
date: 19/01/2020.
Waldow, F., Takayama, K., & Sung, Y.-K. (2014). Rethinking the pattern of external policy
referencing: Media discourses over the ‘Asian Tigers’ PISA success in Australia, Germany and
South Korea. Comparative Education, 50(3), 302–321. https://doi.org/10.1080/03050068.
2013.860704.
Woodward, R. (2009). The Organisation for Economic Co-operation and Development (OECD).
Oxon: Routledge.
Ydesen, C. (Ed.). (2019). The OECD’s historical rise in education: The formation of a global
governing complex. Cham: Palgrave Macmillan.