Learning Potential and Cognitive Modifiability
Alex Kozulin
To cite this article: Alex Kozulin (2011) Learning potential and cognitive modifiability,
Assessment in Education: Principles, Policy & Practice, 18:2, 169-181, DOI:
10.1080/0969594X.2010.526586
The relationship between thinking and learning constitutes one of the fundamental
problems of cognitive psychology. Though there is an obvious overlap between
the domains of thinking and learning, it seems more productive to consider
learning as predominantly acquisition and thinking as the application of existing
concepts and strategies to new tasks. This distinction
acquires a new meaning in the context of the dynamic assessment (DA) approach,
which has often used the notions of learning potential and cognitive modifiability
interchangeably. An illustrative assessment study of 88 primary school students
shows that not all students who demonstrate high learning potential reveal a
propensity toward cognitive modifiability and vice versa. Educational
implications for the design, selection and application of DA procedures are
discussed.
Keywords: cognition; learning; dynamic assessment; Raven Coloured Matrices
The relationship between thinking and learning constitutes one of the fundamental
problems of cognitive psychology. The analysis of this relationship has been compli-
cated, however, both by a multitude of overlapping concepts such as ‘intelligence’,
‘cognition’, ‘experience’, and ‘problem solving’, and by the tendency of researchers
to use many of these terms interchangeably. For example, Slavin (1994, 134) defines
intelligence as a ‘general aptitude for learning’, while Beckmann (2006) vacillates
between defining learning as one of the important aspects of intelligence and defin-
ing intelligence as the ability to learn. At the same time Sternberg (2010) in his
Encyclopaedia Britannica entry suggests that: ‘In psychology, the term [intelligence]
may more specifically denote the ability to apply knowledge to manipulate one’s
environment or to think abstractly as measured by objective criteria’. In the same
Encyclopaedia Britannica, Berlyn (2010) offers yet another perspective in his entry on thought.
The problem becomes even more complicated because in practice we derive our
judgments about thinking, learning or intellective processes from data obtained with
the help of usually rather restrictive assessment techniques. For example, a child’s
performance with WISC-R tasks (Wechsler 1974) is interpreted as a measure of his or
her intelligence because WISC-R is defined as an intelligence test. At the same time,
when a student's score is taken as evidence of learning, we rarely know how much
time he or she invested in that learning. As a result the final product
– vocabulary score – may then be erroneously interpreted as reflecting the student’s
learning ability.
The fact that thinking and learning processes are interconnected does not mean
that there is no advantage in differentiating between them. Moreover, I believe and
will attempt to show that such a distinction may have both a theoretical significance
and a direct bearing on our interpretation of various psychological tests and assess-
ment procedures. This distinction seems to be particularly important when we leave
the traditional psychometric approach with its reliance on static tests and advance
toward the emerging paradigm of the so-called dynamic assessment (DA) (Feuerstein,
Rand, and Hoffman 1979; Lidz 1987; Sternberg and Grigorenko 2002; Haywood and
Lidz 2007). In DA, students’ learning experience becomes an integral part of the
assessment procedure. One may then ask whether the introduction of the learning
phase shifts the goal of assessment to the identification of learning processes, or
whether these processes serve as a means for better assessment of students’ thinking.
In other words, the question is whether the DA aims at discovering the modifiability
of students’ thinking or their learning potential.
In standard psychometric tests, an example is usually given, but it is neither analysed nor
explained. Models are absent and cues, when existent, must usually be discovered by
the problem solvers themselves. The following item is quite typical of a standard
psychometric school ability test (e.g. Otis and Lennon 1996): which number,
multiplied by two, gives a quarter of 40?
Though the mathematical content of the task is quite basic, a student needs a consid-
erable integrative ability to solve it. First of all, linguistic skills should be deployed so
that the text is decoded and translated into the numerical modality. Then the problem
itself should be identified as that of equivalence between a certain (unknown) number
multiplied by two and a quarter of 40. The student is then expected to find a relevant
strategy for finding this number. One of the strategies would be to start by finding out
how much is a quarter of 40 (40:4 = 10 or 1/4 x 40 = 10) and then going systematically
over all the given answers, multiplying each one of them by 2 while checking which
of the results equals 10. A good problem solver may actually substitute the operation
of multiplication (x 2) with that of division (10:2 = 5) and in this way arrive at the
correct answer in a more efficient way. A poor problem solver may start by multiply-
ing by two all given answers one by one and writing down the intermediate results,
then finding out how much is a quarter of forty, and then comparing the intermediate
results with 10. At the end of the problem-solving process students are expected to
check their answers.
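The contrast between the two strategies described above can be sketched in code. The item asks which number, multiplied by two, equals a quarter of 40; since the printed answer options are not reproduced in the text, the options below are hypothetical.

```python
# Two solution strategies for the item "Which number multiplied by two
# equals a quarter of 40?". The answer options are hypothetical.

OPTIONS = [4, 5, 8, 10, 20]
TARGET = 40 // 4  # a quarter of 40 = 10

def solve_systematically(options):
    """Poor-solver strategy: double every option, keeping intermediate
    results, then compare each against the target."""
    doubled = [(x, 2 * x) for x in options]
    for x, d in doubled:
        if d == TARGET:
            return x

def solve_efficiently(options):
    """Good-solver strategy: replace multiplication by division (10 : 2 = 5)
    and merely verify that the candidate is among the options."""
    candidate = TARGET // 2
    return candidate if candidate in options else None

print(solve_systematically(OPTIONS))  # 5
print(solve_efficiently(OPTIONS))     # 5
```

Both strategies reach the same answer; the difference lies in the number of operations performed and intermediate results to be tracked, which is exactly what distinguishes the good from the poor problem solver here.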
Let us now check how the same type of task appears in the acquisition context, e.g.
a mathematics textbook. Students are usually given a general model or a series of
examples and a direct explanatory text. For example, the introduction of the notion of
ratios often appears in the following way:
A ratio is a quotient of one number (or variable) A to another number (or variable) B. It
can be expressed in three ways: 1) A:B; 2) A/B; 3) A÷B. The ratio A:B is read as A to B.
Example 1.
Write down the ratio 2 hours : 30 minutes in simplest form. Answer: 2 hours : 30 min
= 120 min : 30 min = 4:1.
Example 2.
A rectangle has a length of 2 meters and a width of 1 meter 20 cm. Find the ratio of
length to width. Answer: Length : Width = 200 cm : 120 cm = 5:3.
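The procedure behind both textbook examples — convert the quantities to a common unit, then reduce by the greatest common divisor — can be sketched as follows (the function name is illustrative):

```python
from math import gcd

def simplify_ratio(a: int, b: int) -> tuple[int, int]:
    """Reduce the ratio a:b to simplest form; a and b must share a unit."""
    g = gcd(a, b)
    return a // g, b // g

# Example 1: 2 hours : 30 minutes -> convert hours to minutes first.
print(simplify_ratio(2 * 60, 30))   # (4, 1)

# Example 2: 2 m : 1 m 20 cm -> convert both lengths to centimetres first.
print(simplify_ratio(200, 120))     # (5, 3)
```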
The distinction between learning and problem solving was strongly emphasised by
Brown and Ferrara (1985) in their study of primary school children. The children were
offered a series completion learning task, e.g. N G O H P I Q J _ _ _ _ (answer: R K
S L). Standardised prompts were provided by the tester, starting with general hints
regarding the need to look for patterns, which then became more explicit in directing
the child’s attention to the important parameters of the task. The final hints effec-
tively provided the child with the solution of the problem. Brown and Ferrara opera-
tionalised the speed of learning as the number of standardised prompts needed for
reaching the criterion. They then classified the students as ‘slow’ or ‘fast’ learners on
the basis of the median number of prompts required. The authors also obtained
students’ IQ scores and on the basis of them classified the students into ‘average IQ’
(mean 101) and ‘high IQ’ (mean 122) categories. If problem-solving ability as
reflected in the IQ scores were indistinguishable from the learning ability, then all
‘high IQ’ students would have been ‘fast learners’, and all ‘average IQ’ students
‘slow learners’. The results, however, demonstrated that ‘a good third of the children
had learning speed not predictable from their IQ scores’ (Brown and Ferrara 1985,
288). In other words, though the majority of ‘high IQ’ students indeed turned out to
be fast learners, some demonstrated slow learning. On the other hand, some of the
fast learners turned out to have a lower IQ.
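The structure of the series task itself can be made explicit: the sequence N G O H P I Q J interleaves two alphabetic progressions (N O P Q … and G H I J …), and continuing each progression yields the criterion answer R K S L. A minimal sketch:

```python
# The series interleaves two letter sequences that each advance one
# alphabet position; continuing both yields the expected answer.

def continue_series(series, n_more=4):
    """Extend an interleaved pair of single-step letter sequences."""
    first, second = list(series[0::2]), list(series[1::2])
    result = []
    for _ in range(n_more // 2):
        first.append(chr(ord(first[-1]) + 1))   # next letter of sequence 1
        result.append(first[-1])
        second.append(chr(ord(second[-1]) + 1)) # next letter of sequence 2
        result.append(second[-1])
    return result

print(continue_series("NGOHPIQJ"))  # ['R', 'K', 'S', 'L']
```

The standardised prompts in the study progressively directed the child's attention to exactly this hidden two-sequence pattern.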
For Vygotsky, higher mental functions are social in nature and their development depends on the involvement of children in joint activity
with adults and the appropriation of symbolic tools available in a given society. In such
a theoretical context the assessment of emerging mental functions requires the situation
of joint activity of children and adults. The difference between children’s already
established mental functions and the emergent ones is conceptualised through the
notion of the Zone of Proximal Development – ZPD (see Chaiklin 2003). Vygotsky
thus introduced the following aspects that play an important role in the DA approach:
(1) active interaction between adults and children during the assessment; (2) emphasis
on emerging rather than already established mental functions; and (3) comparison
between individual and aided performance as a measure of the child’s ZPD.
The use of ZPD as a theoretical basis for DA, however, was complicated by the
fact that ZPD had been described by Vygotsky (1934/1986; 1935/1996) in three
different albeit interrelated contexts. The first context is developmental and focuses on
the question of how to investigate the emergent mental functions which are not fully
developed and as such cannot be observed in the child’s independent activity. The
second context is related to assessment and points to the fact that children who have
the same standard intelligence scores may demonstrate different ZPDs explored under
conditions of joint activity with adults. The third context is educational and focuses
on the interaction between adults’ ‘academic’ concepts and children’s spontaneous
concepts. Vygotsky believed that for education to be effective the learning process
should take place in the children’s ZPD; learning and instruction should be used as a
‘motor’ of the child’s mental development. An additional complication was created by
the fact that Vygotsky made several general suggestions on how to explore ZPD but
left no specific assessment technique. All these difficulties notwithstanding, the
concept of ZPD seems to offer a perspective that may help to identify mental devel-
opment and cognitive modifiability as a target of DA.
In Vygotsky’s (1935/1996) description of ZPD research, children’s joint activity
with adults usually assumes the form of presenting a model or worked-out example
and providing cues, or starting the problem-solving process and then asking children
to continue. All these activities more or less correspond to the learning phase of DA
as it is currently practised (see Haywood and Lidz 2007). Vygotsky’s goal, however,
was not to determine the child’s facility with these learning prompts, but to use them
as a means of viewing the child’s emergent mental functions. Functions emergent at
one developmental age under favourable conditions are actualised during the next
age (see Chaiklin 2003). In other words, evaluation of the child’s ZPD allows one to
imagine his or her thinking as it will appear two to three years later. In this sense
ZPD is related to the task of exploring children’s mental development or cognitive
modifiability, rather than immediate learning potential. The latter point in no way
detracts from the value of the concept of ZPD for the theory of ‘learning activity’
developed by Vygotsky’s followers in Russia (see Zuckerman 2003). The fact that
specially designed learning activities promote students’ mental development does not
mean that there is no need to distinguish between learning and thinking components
of intelligence.
Already in these early attempts of Rey and Vygotsky one can discern two perspec-
tives: one aimed at creating procedures that would allow the evaluation of children’s
learning potential, and the other attempting to establish a new approach to the assess-
ment of thinking, an approach that instead of testing the existent problem-solving
skills, aims to determine the degree of modifiability of thinking. The learning potential
assessment sets as its goal the evaluation of children’s ability to benefit from models,
cues, and examples during the performance of the learning tasks. In the above-
mentioned Brown and Ferrara (1985) study, learning potential was defined by a mini-
mal number of cues that students needed to solve the tasks. Learning potential estab-
lished in this way may help to predict students’ performance in future learning
situations. For example, the efficiency of learning the rules of an artificial language
through a series of samples may serve as an indicator of the students’ potential for
learning foreign languages (Sternberg and Grigorenko 2002).
On the other hand, DA of thinking focuses on the modifiability of a wide range of
cognitive functions of students and/or their readiness for the transition from one cogni-
tive-developmental stage to the next one. Thus Feuerstein, Rand, and Hoffman (1979)
explicitly set up as one of the major goals of their DA: ‘the extent of examinee’s modi-
fiability in terms of levels of functioning made accessible to him by the process of
modification, and the significance of the levels attained by him in the hierarchy of
cognitive functions’ (91). The latter point refers to the question of whether the
observed change enables the student to acquire abstract thinking and logical operations
in a variety of domains. For example, in their study of analogical reasoning Feuerstein,
Rand, and Hoffman (1979, 384–99) inquired to what extent a systematic mediation of
comparison, differentiation, and classification of verbal and figural material helps
students to ascend to the level of analogical reasoning in various modalities.
Karpov and Gindis (2000) also seem to view the transition to a higher cross-
domain cognitive level as a criterion of cognitive modifiability. In their research such
a criterion was operationalised as children’s transition from visual motor to visual
imaging to the symbolic level of problem solving. In this way Karpov and Gindis
emphasised the cross-domain nature of cognitive modifiability and its interpretation
as a readiness for transition to a higher cognitive-developmental level.
It thus seems appropriate to identify the restructuring of a wide range of cognitive
functions as a sign of cognitive modifiability, and the efficiency of using such learning
devices as models, prompts, and cues for the solution of a more restricted range of
tasks, as indicative of students’ learning potential.
In spite of the above-mentioned tendencies for a differential approach to the task
of learning potential assessment and the task of assessing modifiability of thinking,
the majority of DA research deliberately or unwittingly obliterated this distinction.
For example Budoff (1987) started his argument in favour of DA by showing that IQ
tests cannot serve as a reliable tool for the identification of mentally retarded children
and the evaluation of their intellectual abilities. He then proceeded to demonstrate
that the inclusion of the learning phase in the assessment procedure, employing such
standard intelligence tests as Kohs cubes and Raven Matrices, helps to distinguish
between ‘gainers’ who benefited from learning-within-the-test opportunity and ‘non-
gainers’ who did not. By any standard Budoff’s procedure should be defined as aimed
at discovering the students’ learning potential. Nevertheless, Budoff interpreted his
results as bearing on the impaired or non-impaired intelligence of his subjects rather
than on their potential for learning. It is possible that such confusion originated in
Budoff’s very general definition of intelligence as ‘the ability to benefit from experi-
ence’ (1987, 55).
More recently Beckmann (2006, 36) claimed that dynamic testing, ‘primarily
focuses on psychometric attempts to obtain diagnostic information about a person’s
learning potential, learning ability, intellectual change potential, reserve capacity, and
so on’. Here all possible targets of DA are included without any discrimination. Little
wonder that DA researchers and practitioners still lament the lack of clarity in DA’s
objectives and methods (Karpov and Tzuriel 2009; Hessels-Schlatter and Hessels
2009).
Haywood and Lidz (2007) claimed that DA data may respond to a variety of ques-
tions. Some of them, according to the present point of view, are related to learning
while some are related to thinking. Thus, when discussing the question ‘What is the
response to intervention?’, Haywood and Lidz seem to focus on students’ learning
potential. On the other hand their discussion of ‘How much investment, of what kinds,
may be required to promote long-term gains in performance?’ seems to be related to
the question of cognitive modifiability (2007, 14).
The present author (Kozulin, Kaufman, and Lurie 1997) also did not escape the
same confusion. In a study of new immigrant students he made practically no distinc-
tion between their learning potential as revealed through performance with learning
tasks and the students’ cognitive modifiability as inferred from the cognitive change
observed after implementation of a year-long cognitive enrichment programme.
In other words, though a theoretical basis has existed for distinguishing between
DA of learning potential and DA of cognitive modifiability, DA practitioners have
systematically conflated these two goals and as a result obscured the objectives and
methods of DA.
In this procedure students first take the Raven Coloured Matrices test without
mediation. Then mediation of Set Variations is performed and students are tested a
week later, again using Raven Coloured Matrices. The pre- to post-test change in
Raven scores is considered to reflect students’ cognitive modifiability. Because only
5 out of 36 of the Raven Coloured Matrices tasks serve as prototypes of the Set Vari-
ations, the students’ success in Raven Coloured Matrices as a whole cannot be attrib-
uted to learning these tasks. To succeed in solving Raven Coloured Matrices as a
whole the students should be able to generalise from the skills acquired during the
learning phase and transfer them to the tasks of a different nature. These are the defin-
ing features of cognitive modifiability. It should be admitted though that the graphic
modality of presentation remains the same, thus narrowing the modifiability scope.
If learning potential and cognitive modifiability are the same phenomenon then
students more successful in Set Variations should also be more successful in the
Raven Coloured Matrices post-test. On the other hand, if these two phenomena are
different we should find a considerable number of ‘mixed cases’ – students with good
learning potential but relatively low modifiability and vice versa. The following
experimental study was aimed at illustrating a possible empirical way to answer this
question.
Results
The correlations between the Raven Coloured Matrices pre- and post-test scores and
the Set Variations scores are presented in Table 1.
On the basis of their performance all students were defined as belonging to one of
the following groups: High Learning/High Modifiability; High Learning/Low Modi-
fiability; Low Learning/High Modifiability; and Low Learning/Low Modifiability.
Students were assigned to High or Low groups on the basis of their performance
being above or below the median score in the Set Variations and the Raven Coloured
Matrices post-test. Thus, for example, a student who performed above the median
score in both Set Variations and the Raven Coloured Matrices post-test is defined as
belonging to the High Learning/High Modifiability group, while a student who
performed above the median score in the Raven Coloured Matrices post-test but
below the median score in Set Variations is defined as belonging to Low Learning/
High Modifiability (see Table 2).

Table 1. Correlations between Raven Coloured Matrices pre- and post-test scores and the
Set Variations scores (N = 88).

              Raven Pre   Set Variations
Raven Pre         –            0.49
Raven Post      0.54           0.54

Table 2. Number of students belonging to high or low learning and modifiability groups
(N = 88).

                      High Learning   Low Learning
High Modifiability         32              12
Low Modifiability          12              32
A Chi Square test was performed, testing the hypothesis that no more than
10% of the students would belong to the mixed groups, i.e. ‘high learning with low
modifiability’ and ‘low learning with high modifiability’. The Chi Square = 22.12,
df = 3, p < 0.001; the hypothesis should be rejected. In fact, 27.3% of the
students belong to these mixed groups. Moreover, in some students their learning
ability seems to be highly independent of both their initial performance level and
their post-training performance. Six students (6.8% of the group) belonging to the
High Learning group showed Low performance results in both pre- and post- Raven
Coloured Matrices tests.
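A minimal reconstruction from the Table 2 cell counts confirms the reported share of mixed profiles (the group labels used as dictionary keys are mine):

```python
# Cell counts from Table 2 (N = 88); the 'mixed' groups are the two
# off-diagonal cells of the learning x modifiability classification.
counts = {
    ("high_learning", "high_modifiability"): 32,
    ("high_learning", "low_modifiability"): 12,
    ("low_learning", "high_modifiability"): 12,
    ("low_learning", "low_modifiability"): 32,
}

n = sum(counts.values())
mixed = (counts[("high_learning", "low_modifiability")]
         + counts[("low_learning", "high_modifiability")])

print(n)                           # 88
print(round(100 * mixed / n, 1))   # 27.3  (% of students in mixed groups)
```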
One may conclude that both the absence of a stronger correlation between the Set
Variations scores and the post-test scores and the presence of a considerable number
of students in the mixed groups indicate that learning potential and cognitive modifi-
ability represent two different aspects of the students’ intelligence. The results
reported above indicate that even in the case of tasks of the same type (i.e. matrices),
learning potential appears to be distinct from cognitive modifiability. A student may
be quite efficient in learning from worked-out examples mediated by the DA assessor
and yet not capable of generalising and applying the thus learned strategies to a differ-
ent type of task. These results allow us to further elaborate the topic of learning trans-
fer discussed by Brown and Ferrara (1985). In their study prompts were used as a tool
for learning how to solve both the ‘near-’ and ‘far-transfer’ cognitive tasks. The
authors concluded that some students, efficient in learning from prompts when ‘near’
tasks were presented, proved to be less efficient in using prompts with ‘far’ tasks. In
other words, Brown and Ferrara identified different levels of learning potential. In the
present study, learning from worked-out examples (Set Variations) was contrasted
with independent transfer to ‘far’ tasks (Raven Coloured Matrices post-test). In this
way the difference between learning potential and cognitive modifiability rather than
a degree of learning potential was identified.
Feuerstein, Rand, and Hoffman proposed a model of DA aimed at revealing:

The ability of the examinee to grasp the principle underlying the initial problem and to
solve it; the extent to which the newly acquired principle is successfully applied in solv-
ing problems that become progressively more different from the initial task; the differ-
ential preference of the examinee for various modalities of presentation… (Feuerstein,
Rand, and Hoffman 1979, 94)
The path from the model to actual methodology has not been easy. Even in
Feuerstein’s own DA system (LPAD) the above model has been realised only
partially – many of the LPAD tasks do not have sufficient systematic variability in
terms of complexity, modality and operations. The lack of uniformity in different
tasks also greatly complicates their use as elements of the total modifiability system
(for a critical analysis see Buchel and Scharnhorst 1993).
One may conclude that the design of learning potential assessment procedures, espe-
cially of a purely cognitive nature, proved to be much easier than the design of the cogni-
tive modifiability system, especially if the latter is expected to respond to the question
of students’ propensity toward progressive movement to a higher cognitive level.
However, the main problem seems to be not in methodological difficulties associated
with the design of such systems, but in the conceptual decision regarding the goals of
DA. Developers and practitioners should be more conscious of whether they pursue
the goal of learning potential assessment or that of evaluation of cognitive modifiability.
Their tools and procedures should clearly reflect one or the other of these orientations
and be sufficient for achieving the chosen objective. Learning potential assessment
will probably become more closely aligned with curriculum-based formative assess-
ment, while assessment of cognitive modifiability will align with psycho-develop-
mental assessment reformed along more dynamic lines.
Acknowledgements
The assistance of Lea Yosef and Anat Cagan in the collection of data is gratefully acknowledged.
Notes on contributor
Alex Kozulin is research director of the International Center for the Enhancement of Learning
Potential in Jerusalem.
References
Beckmann, J. 2006. Superiority: Always and everywhere? On some misconceptions in the
validation of dynamic testing. Educational and Child Psychology 23, no. 3: 35–50.
Berlyn, D.E. 2010. Thought. In Encyclopaedia Britannica. http://www.britannica.com/
EBchecked/topic/593468/thought (accessed March 5, 2010).
Brown, A., and R. Ferrara. 1985. Diagnosing zones of proximal development. In Culture,
communication and cognition, ed. J. Wertsch, 273–305. New York: Cambridge University
Press.
Buchel, F., and U. Scharnhorst. 1993. The Learning Potential Assessment Device: Discussion
of theoretical and methodological problems. In Learning potential assessment:
Theoretical, methodological and practical issues, ed. J.H.M. Hamers, K. Sijtsma, and
A.J.J.M. Ruijssenaars, 83–111. Amsterdam: Swets and Zeitlinger.
Budoff, M. 1987. The validity of learning potential assessment. In Dynamic assessment, ed.
C. Lidz, 52–81. New York: Guilford Press.
Chaiklin, S. 2003. The zone of proximal development in Vygotsky’s analysis of learning and
instruction. In Vygotsky’s educational theory in cultural context, ed. A. Kozulin, B.
Gindis, V. Ageyev, and S. Miller, 39–64. New York: Cambridge University Press.
Feuerstein, R. 1967. The learning potential assessment device. In Proceedings of the First
Congress of the International Association for the Scientific Study of Mental Deficiency,
ed. B.W. Richards, 562–5. Reigate, UK: Michael Jackson.
Feuerstein, R., Y. Rand, and M. Hoffman. 1979. Dynamic assessment of the retarded
performer. Baltimore, MD: University Park Press.
Fuchs, L., D. Compton, D. Fuchs, K. Hollenbeck, C. Craddock, and C. Hamlett. 2008.
Dynamic assessment of algebraic learning in predicting third graders’ development of
mathematical problem solving. Journal of Educational Psychology 100, no. 4: 829–50.
Grigorenko, E., and R. Sternberg. 1998. Dynamic testing. Psychological Bulletin 124, no. 1:
75–111.
Haywood, C., and C. Lidz. 2007. Dynamic assessment in practice. New York: Cambridge
University Press.
Hessels-Schlatter, C., and M. Hessels. 2009. Clarifying some issues in dynamic assessment.
Journal of Cognitive Education and Psychology 8, no. 3: 246–51.
Karpov, Y., and B. Gindis. 2000. Dynamic assessment of the level of internalisation of elemen-
tary school children’s problem-solving activity. In Dynamic assessment: Prevailing models
and applications, ed. C. Lidz and J. Elliott, 133–54. Oxford: Elsevier Science.
Karpov, Y., and D. Tzuriel. 2009. Dynamic assessment: Progress, problems, and prospects.
Journal of Cognitive Education and Psychology 8, no. 3: 228–37.
Kozulin, A. 2009. New reference points for dynamic assessment: A commentary on Karpov
and Tzuriel. Journal of Cognitive Education and Psychology 8, no. 3: 242–5.
Kozulin, A., and E. Garb. 2004. Dynamic assessment of literacy. European Journal of
Psychology of Education 19, no. 1: 65–77.
Kozulin, A., R. Kaufman, and L. Lurie. 1997. Evaluation of the cognitive intervention with
immigrant students from Ethiopia. In The ontogeny of cognitive modifiability, ed. A.
Kozulin, 89–130. Jerusalem: ICELP Press.
Lidz, C., ed. 1987. Dynamic assessment: An interactional approach for evaluating learning
potential. New York: Guilford Press.
Otis, A., and R. Lennon. 1996. Otis-Lennon school ability test. San Antonio, TX: Harcourt
Brace.
Poehner, M. 2008. Dynamic assessment: A Vygotskian approach to understanding and
promoting L2 development. New York: Springer.
Raven, J. 1956. Coloured progressive matrices. London: H.K. Lewis and Co.
Rey, A. 1934. D’un procédé pour évaluer l’éducabilité. Archives de Psychologie 24: 297–337.
Slavin, R. 1994. Educational psychology. Boston, MA: Allyn and Bacon.
Sternberg, R. 1985. Beyond IQ: A triarchic theory of intelligence. Cambridge: Cambridge
University Press.
Sternberg, R. 2010. Human intelligence. In Encyclopaedia Britannica. http://www.britan-
nica.com/EBchecked/topic/289766/human-intelligence (accessed March 5, 2010).
Sternberg, R., and E. Grigorenko. 2002. Dynamic testing. New York: Cambridge University
Press.
Vygotsky, L. 1934/1986. Thought and language. Rev. ed. Cambridge, MA: MIT Press.
Vygotsky, L. 1935/1996. Problema obuchenija i umstvennogo razvitija v shkolnom vozraste
[The problem of education and mental development at the school age]. In Pedagogicheskaja
psikhologija [Pedagogical psychology], 321–55. Moscow: Pedagogika Press.
Wechsler, D. 1974. Wechsler intelligence scale for children – revised. San Antonio, TX: The
Psychological Corporation.
Zuckerman, G. 2003. The learning activity in the first years of schooling. In Vygotsky’s
educational theory in cultural context, ed. A. Kozulin, B. Gindis, V. Ageyev, and
S. Miller, 177–99. New York: Cambridge University Press.