
International Journal of Mathematical Education in Science and Technology

ISSN: 0020-739X (Print) 1464-5211 (Online) Journal homepage: https://www.tandfonline.com/loi/tmes20

Students’ reasoning in mathematics textbook task-solving

Johan Sidenvall, Johan Lithner & Jonas Jäder

To cite this article: Johan Sidenvall, Johan Lithner & Jonas Jäder (2015) Students’ reasoning in mathematics
textbook task-solving, International Journal of Mathematical Education in Science and Technology, 46:4, 533-
552, DOI: 10.1080/0020739X.2014.992986

To link to this article: https://doi.org/10.1080/0020739X.2014.992986



Students’ reasoning in mathematics textbook task-solving

Johan Sidenvall (a)*, Johan Lithner (b) and Jonas Jäder (a)

(a) Department of Social and Welfare Studies, Linköping University, Sweden; (b) Umeå Mathematics Education Research Centre, Umeå University, Sweden

(Received 21 July 2014)

This study reports on an analysis of students’ textbook task-solving in Swedish upper secondary school. The relations between the types of mathematical reasoning required, the reasoning used, and the rate of correct task solutions were studied. Rote learning and superficial reasoning were common, and 80% of all attempted tasks were correctly solved using such imitative strategies. In the few cases where mathematically founded reasoning was used, all tasks were correctly solved. The study suggests that student collaboration and dialogue do not automatically lead to mathematically founded reasoning and deeper learning. In particular, in the common case where a student simply copies a solution from another student without receiving or asking for mathematical justification, collaboration may even be a disadvantage for learning. The results also show that the textbooks’ worked examples and theory sections were not used as an aid by the students in task-solving.
Keywords: mathematical reasoning; task-solving; mathematics textbook; upper
secondary school

1. Introduction
Reasoning is a fundamental aspect of mathematics.[1–3] Research has shown that there is
too much emphasis on rote learning and superficial reasoning in mathematical education.[4]
According to Hiebert [4] ‘One of the most reliable findings from research on teaching and
learning is that students learn what they are given opportunities to learn’ (p.10). To learn
both procedures and mathematically founded reasoning students must practise how to
solve both routine and non-routine tasks.[5] The textbook is an important source for giving
students an opportunity to learn.[6] This study does not primarily examine the textbook itself as an opportunity to learn, nor does it focus on what is actually being learned. Instead, it focuses on students’ reasoning when solving textbook tasks, examining how the type of reasoning affects the ability to solve the tasks. Rote learning is related
to students’ tendency to use sometimes inefficient and mathematically superficial imitative
strategies rather than creating their own solutions through reasoning.[7–10] Tasks are a
cornerstone of students’ work with mathematics and according to Doyle [11] ‘influence
students by directing their attention to particular aspects of content and specifying ways of
processing information’ (p. 161). The overall purpose of this study is to examine to what extent students create their own solutions through mathematically founded reasoning when solving textbook tasks, and to relate the students’ reasoning to their ability to complete the tasks.


Corresponding author. Email: johan.sidenvall@liu.se

© 2014 Taylor & Francis

This paper is structured as follows. The Background section discusses the importance of the textbook and its tasks, followed by a background on the reasoning competence. The Research framework section presents the research framework. The Methods section describes both data collection and data analysis. Results are presented in the Results section, and the paper ends with a Discussion of the results.

2. Background
2.1. The textbook and textbook tasks
Textbooks have been shown to have a great impact on classroom work and to form the
backbone of mathematical teaching both in Sweden [12–14] and internationally.[6,15–18]
The textbook tasks that students engage in largely determine what they learn about
mathematics and how they learn it.[19] By studying data from the Third International
Mathematics and Science Study, Valverde et al. [18] found that a majority of the textbooks
for 9- and 13-year-olds and students in the final year of secondary school mainly focused on
practising routine skills. A low proportion of the textbooks’ content aimed at investigation,
problem-solving or making mathematical generalizations. An international textbook task
analysis [20] showed that around 75% of the tasks in a common Swedish secondary
school mathematics textbook were solvable by mimicking previously presented algorithmic
templates. Twenty-five per cent of the tasks required at least some mathematically founded
reasoning to be solved. The study also showed that a majority of the tasks requiring
mathematically founded reasoning were classified by the textbook authors as more difficult
tasks.

2.2. Reasoning
Ball and Bass [21] state that ‘mathematical reasoning is no less than a basic skill’ (p.28).
In the following, a broad definition of reasoning is applied. This means that reasoning can
be found in all levels of mathematical understanding. In the literature ‘reasoning’ is often
defined as a skill of high deductive-logical quality.[22] This is not the case in this study;
our overall assumption is that mathematically founded reasoning can be found and used
at all levels of difficulty in solving mathematical tasks. This is also in line with national
curricula and research concerning what competences should be developed.[1,2,4,23] The
definition of reasoning used here is: ‘reasoning is the line of thought adopted to produce
assertions and reach conclusions in task-solving. It is not necessarily based on formal logic,
thus not restricted to proof, and may even be incorrect as long as there are some kind of
sensible (to the reasoner) reasons backing it’.[24,p.257] Empirical studies [7,10,25–29] using this wider definition of reasoning have led to a division into imitative reasoning and creative mathematically founded reasoning (CMR). To use imitative reasoning is to solve
tasks by methods that are known or provided by someone else (e.g. a teacher or a textbook).
To use CMR is to use a new, mathematically founded and intrinsic line of argument (see
Research framework section).[24] When opportunities to learn CMR are given, students also gain opportunities to develop other related important mathematical competences: problem-solving and conceptual understanding.[24] Such competences are not developed by imitative reasoning alone.[4,5,24]
A framework of analysis will be used to categorize the reasoning used by students to
solve a task, and whether or not they succeed. Another framework of analysis will be used
to categorize the anticipated reasoning required for a task according to the level of the
mathematical course. Both frameworks use the theoretical framework by Lithner.[24]

3. Aim and research questions


The aim of the study is to examine how the type of reasoning affects the possibilities to solve
textbook tasks in a Swedish upper secondary school context. In order to do this we will (1)
categorize what kind of reasoning is required in order to solve the tasks, (2) categorize the
students’ reasoning, and (3) analyse how (1), (2) and students’ correct task solution rate are
related. The aim of the study is specified in the following research questions:

• What is the relationship between the reasoning required and the reasoning used when
solving textbook tasks?
• What is the relationship between the reasoning used and the rate of correct solutions
when solving textbook tasks?

4. Research framework
This study is based on a research framework formulated by Lithner [24] in which reasoning
in task-solving is seen as a product of a student’s thinking process. Reasoning is seen as
data in empirical studies, which can be documented through text, graphical representation,
video recordings, etc., and these data are seen as traces of the students’ thoughts.[10]
The thinking processes are dependent on the students’ mathematical competences. In turn,
these competences are formed by the sociocultural milieu.[24] The task-solving process
according to the framework comprises the following four steps:

(1) A task is met.
(2) A strategy choice is made, where ‘strategy’ ranges from local procedures to general approaches, and ‘choice’ is seen in a wide sense (choose, recall, construct, discover, guess, etc.). The strategy choice can be supported by predicative argumentation: Why will the strategy solve this task?
(3) The strategy is implemented, which can be supported by verificative argumentation: Why did the strategy solve the task?
(4) A conclusion is obtained.

The characterization of reasoning types is based on analyses of the explicit or implicit arguments in the strategy choice and implementation. The two main reasoning types, imitative reasoning and CMR, are defined below.

4.1. Imitative reasoning
Imitative reasoning is often suitable for solving routine tasks. The six different types of
imitative reasoning are as follows.
Memorized reasoning is characterized by the strategy of recalling a complete answer
(e.g. a proof) and the implementation of this choice consists of merely writing it down.
Many school tasks normally require calculations where it is more appropriate to recall
an algorithm rather than the answer. This provides a basis for the other main type of
imitative reasoning, algorithmic reasoning (AR). In this regard, ‘An algorithm is a finite
sequence of executable instructions which allows one to find a definite result for a given
class of problems’.[30,p.129] The nth transition does not depend on any circumstance
unforeseen in the (n − 1)th transition; nor on finding new information, any new decision,
any interpretation, and thus on any meaning that one could attribute to them. Thus, an
algorithm can be determined in advance. AR is characterized by a strategy of searching
for or recalling a certain algorithm that is thought to be suitable and then implementing
this algorithm in task-solving.[24] ‘The remaining parts of the strategy implementation are
trivial for the reasoner, only a careless mistake can prevent a correct answer from being
reached. How to identify a suitable algorithm is fundamental, and the rest is relatively
straightforward’.[29,p.225] AR can be further divided into five subgroups according to how the algorithm is obtained: familiar AR, delimited AR, and three types of guided AR (peer-guided AR, teacher-guided AR and text-guided AR).[7,24] The different AR types are as follows.
Familiar AR is connected to a strategy of searching for familiar (perhaps superficial)
clues or leads that in turn trigger a strategy choice of which algorithm to choose. A student
might use familiar AR when solving an equation where the student knows the corresponding
algorithms.
Delimited AR may be employed when a student cannot connect a familiar algorithm
to the task. Instead, an algorithm is chosen from a set that is delimited by the student
through the algorithms’ surface relations to the task. If the implementation does not lead
to a reasonable conclusion (to the reasoner) the reasoning sequence is simply terminated
and another algorithm may be chosen from the set. An example from Bergqvist et al. [7] illustrates a student’s use of delimited AR in Sally’s attempt to solve ‘Find the largest and smallest values of the function y = 7 + 3x − x² on the interval [1, 5]’. She differentiates y, solves y′(x) = 0 (x = 1.5) and evaluates y(1.5) = 9.25. She hesitates: ‘I think that I should have got two values, and I don’t know why I didn’t, what have I done wrong’. She abandons this method without reflection since it did not produce an expected answer. Instead she moves on to two different methods using a graphic calculator. These subsequent attempts are also abandoned since they do not produce an expected answer. Sally finally makes the incorrect strategy choice to solve 7 + 3x − x² = 0 and obtains two values, x1 ≈ 4.54 and x2 ≈ −1.54, which she regards as a correct solution even though this does not solve the task.
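As a contrast to Sally’s imitative attempts, the mathematically founded solution to this task can be condensed into a few lines. The following sketch is ours, not from Bergqvist et al. [7], and follows the standard closed-interval method:

```python
# Find the largest and smallest values of y = 7 + 3x - x^2 on [1, 5].
def y(x):
    return 7 + 3 * x - x ** 2

# y'(x) = 3 - 2x = 0 gives the single stationary point x = 1.5.
stationary_point = 3 / 2

# On a closed interval, extrema occur at stationary points or endpoints.
candidates = [1, stationary_point, 5]
values = [y(x) for x in candidates]   # [9, 9.25, -3]

largest = max(values)    # 9.25, attained at x = 1.5
smallest = min(values)   # -3, attained at x = 5
```

Note that this founded solution does produce exactly the two values Sally expected: a largest value of 9.25 and a smallest value of −3.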
Peer-guided AR occurs when the reasoner is guided through the task by a peer who
describes the solution procedure. The strategy choice is to follow the peer’s guidance, and the implementation consists simply of executing the algorithm without verificative argumentation.
Teacher-guided AR is similar to peer-guided AR with the difference that the teacher serves
as a guide.
Text-guided AR is used when, during his/her strategy choice, a student identifies surface
similarities between the task and an example, definition, theorem, rule, or some other
situation in a text source. The algorithm is implemented without verificative argumentation.

4.2. Creative mathematically founded reasoning


In imitative reasoning the task-solver applies a recalled or externally provided solution
method. In CMR the solver constructs a solution method. The empirical studies forming
the basis of the reasoning framework [24] have identified three central aspects distinguishing
CMR from imitative reasoning:

• Novelty. A new (to the reasoner) reasoning sequence is created, or a forgotten one is
re-created.

Table 1. Types of reasoning defined in the reasoning framework of Lithner.[24]

• Creative mathematically founded reasoning (CMR)
  – Global CMR
  – Local CMR
• Imitative reasoning
  – Memorized reasoning
  – Algorithmic reasoning (AR)
    • Familiar AR
    • Delimited AR
    • Peer-guided AR
    • Teacher-guided AR
    • Text-guided AR

• Plausibility. There are arguments supporting the strategy choice and/or strategy im-
plementation motivating why the conclusions are true or plausible.
• Mathematical foundation. The arguments are anchored in the intrinsic mathematical
properties of the components involved in the reasoning.

Imitative reasoning contains no CMR. Local CMR always contains considerable parts
of imitative reasoning (e.g. recalling facts and using algorithmic subprocedures) and minor
parts of CMR. Global CMR may contain large parts of imitative reasoning but always
contains significant parts of CMR.
Table 1 shows schematically CMR and imitative reasoning with their subcategories.

4.3. Textbook task analysis

An analysis of the textbook tasks is performed considering what type of reasoning is required in order to solve each task. The essence of the method is that the reasoning required by a task depends on the previous information in the textbook. Empirical studies [20,26,27]
have used this method to categorize the reasoning requirements of a task. If a task can be
solved by using algorithmic templates previously presented in the textbook’s theory section,
a worked example, or in previous tasks, it is categorized as only requiring text-guided AR
to be solved. If the task has no algorithmic templates and CMR is needed for the overall
solution strategy then the task is categorized as requiring global CMR. If minor changes
have to be made in the presented algorithmic templates to solve a task, the task is categorized
as requiring local CMR to be solved.

5. Methods
5.1. Method of data collection
Data were collected from two Swedish upper secondary schools of about average size.
Upper secondary school is not a compulsory part of the Swedish school system, but 98%
of students from the compulsory lower secondary school continue to upper secondary
school. The overall data collection method adopted in this study is based on Långström and Lithner’s [28] method, which enables collection of data during normal classroom work. Data were collected from two classes from the least mathematically intense programme, one class from the intermediately intense programme, and two classes from the most mathematically intense programme. This was done to account for the possibility that students participating in different tracks might have different mathematical competences. All students attended
the first year of upper secondary school, equivalent to the 10th year of schooling. The
teachers were professionally trained and had at least 10 years of teaching experience.
The lessons (50–60 minutes) were alike in structure and execution. A typical lesson
commenced with a teacher’s presentation (10–25 minutes) followed by student work with
textbook tasks (30–45 minutes). The teacher directed the students to a specific section of the book, but it was largely up to the students to decide which tasks, and at what level of difficulty, to do within that section. The teacher moved around the classroom helping
students in their task-solving. Following the teachers’ presentations, the video recordings
began. Two cameras were placed in the classroom. During the first minutes of the students’
work with textbook tasks, two groups of students were chosen for the data collection. A
camera and microphone were placed in close vicinity to each group. The criteria used when
choosing groups of students were that they seemed to be mathematically active and that a
mathematical dialogue was in progress. The students’ notebooks and textbooks were also a
part of the data. Field notes were taken throughout the lessons. The work from seven groups
of students from four classes was analysed. Six groups consisted of 2 students and one group
consisted of 3 students, giving a total of 15 students. A class from the least mathematically
intense track was excluded from the analysis due to the lack of mathematical task-solving
observed in the groups.

5.2. Method of data analysis


5.2.1. Reasoning analysis procedure (reasoning used)
The analysis procedure was based on previous empirical studies.[10,26,28] In applying the framework to analyse what reasoning a student used when solving a task, the first step was to structure the data by studying video recordings, transcripts, students’ notebooks and field notes. To conduct a more fine-grained analysis, the second step was to identify subtasks in the students’ task-solving, and to subdivide the subtasks using the four-step reasoning sequence. Following this, any observed predicative and verificative arguments were identified in each subtask situation. The reasoning sequence was then classified according to the reasoning types of the framework. Finally, each subtask solution attempt was classified as correct or incorrect; an attempt was classified as correct if any incorrectness was due only to a careless mistake and the solution was otherwise correct.
In order to categorize the reasoning of a solution for a whole task, an analysis of
the subtasks was made to investigate which of the subtasks had the decisive reasoning
sequence. The decisive reasoning sequence was then considered as representative of the
overall reasoning for the whole task. The overall solution attempt was also classified as
correct or incorrect.

5.2.1.1. Example of data and analysis of familiar AR. Data: Adam solves the task below (Figure 1) straightforwardly by identifying the algorithm connected to a familiar clue or lead, here a geometrical figure where the centre angle is given and the boundary angle is sought. ‘It was like this, I think that it [angle v] is half [of the 190◦ angle]. Isn’t that right?’ ‘Calculate angle v’ [31,p.159] (Figure 1).
Analysis: Since Adam states: ‘I think that it [angle v] is half [of the 190◦ angle]’ it is clear that he knows the inscribed angle theorem and thus an algorithmic solution method. If he had not already been familiar with the theorem he would have had to construct it during the task-solving session, which is very difficult. This would have taken much longer than the time he used, and there would be some traces of this construction in the data, for example
drawings and probably also some explicit arguments justifying the construction. Therefore the reasoning is categorized as familiar AR.

Figure 1. Figure for task solved using familiar AR.
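Adam’s familiar algorithm amounts to a single application of the inscribed angle theorem. As a sketch (the 190◦ value is taken from the dialogue above):

```python
# Inscribed angle theorem: an inscribed (boundary) angle is half of the
# central angle subtending the same arc.
central_angle = 190      # degrees, given in the task figure
v = central_angle / 2    # the sought boundary angle: 95 degrees
```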

5.2.1.2. Example of data and analysis of delimited AR. Data: Molly is trying to solve the task ‘What are the zeroes of the function y = x² − 6x + 5?’.[32,p.108] Molly chooses a suitable method and does the following calculations, but makes the careless mistake of adding and subtracting 4 instead of 2 in the final step:

y = x² − 6x + 5, y = 0, 0 = x² − 6x + 5, x = 3 ± √(9 − 5), x1 = 7, x2 = −1
(1)

Molly then compares her result with the answer in the answer section. She sees that
she has produced an incorrect answer. Molly then tries, without any further reflection, an
algorithm that the teacher has presented on the blackboard to calculate a function’s value.
This algorithm can be used to check if given x-values are zeros of the function or not,
but it is not suitable for finding such values (i.e. to solve the task). However, she uses the
algorithm presented by the teacher in the hope that it might solve the task. She arrives at
the following result:

y1 (7) = 12, y2 (−1) = −18 (2)

There is no indication that Molly reflected that the calculated x-values should give y = 0, and not the two function values as in (2), if they were solutions to the task. When the result of calculation (2), which she sees as an answer to the task, does not match the answer in the answer section, she is confused and seems to lose track of what she is doing. She states: ‘Well, this did not go so well’ and abandons the task.

Figure 2. Figure for task solved using text-guided AR.
Analysis: The first calculation (1) is categorized as familiar AR since Molly chooses a
known algorithm, although she makes a computational error. Her second calculation (2) is
categorized as delimited AR. The reason for this classification is that instead of analysing
the outcome of the chosen algorithm (1) she merely abandoned it without reflection and
continued trying to solve the task by using a new algorithm (2) that yields two numbers as
an answer. She hopes that this algorithm might solve the task. When the first algorithm does
not provide her with an answer that is the same as in the answer section, the second algorithm
is selected on the basis that it is considered somehow connected to finding function values,
but Molly does not understand how or why. The strategy implementation is then carried out by following the algorithm, without verificative argumentation.
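The verificative argumentation Molly skipped is cheap to carry out: substituting candidate x-values back into the function immediately separates zeroes from non-zeroes. A minimal sketch (ours, not part of the study’s data):

```python
def y(x):
    return x ** 2 - 6 * x + 5

# Correct final step of the quadratic formula: x = 3 +/- sqrt(9 - 5) = 3 +/- 2.
correct_roots = [3 + 2, 3 - 2]    # [5, 1]

# Molly added and subtracted 4 instead of 2, giving 7 and -1.
mollys_values = [3 + 4, 3 - 4]    # [7, -1]

# A zero of the function must satisfy y(x) = 0: the correct roots pass
# this check, while Molly's candidates do not.
assert all(y(x) == 0 for x in correct_roots)
assert all(y(x) != 0 for x in mollys_values)
```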

5.2.1.3. Example of data and analysis of text-guided AR. Data: The task below is solved
by David by searching in the textbook for a suitable algorithm. The bisector theorem is
found and used to solve the task. ‘CD is a bisector. Determine AD’ [33,p.160] (Figure 2).
Analysis: The strategy choice concerns identifying surface similarities between the task
and the theorem. In this case the figure beside the theorem is similar to the figure in the
task. The algorithm is implemented without verificative argumentation.

5.2.1.4. Example of data and analysis of peer-guided AR. Data: Simon is trying to solve
task (d) below.

When Jesper buys a TV for 15,000 crowns he is offered a yearly payment plan over five years instead of paying cash. Every year a fifth of the debt is paid off. The interest rate is 5.75%.

(a) How much must Jesper pay to the bank the first year?
(b) How much must Jesper pay to the bank the second year?
(c) How much more will the TV cost if it is paid through the payment plan compared to paying in cash?
(d) How much more expensive (in percentage) is the TV when it is paid off in instalments? [34,p.107]

Figure 3. Figure for task solved using teacher-guided AR.

Simon seems to know that he must compare the cash price with the sum of the cash
price plus interest, but does not know how to make the comparison himself. He then turns
to his peer and asks: ‘Or should I take the new [value] divided by the old [value]?’ The
peer then responds with: ‘The new by the old. That divided by that [pointing in Simon’s
notebook]’. Simon then writes the correct computation without asking for an explanation
as to why this algorithm leads to the correct answer.
Analysis: The strategy choice that was problematic for Simon was already made by a peer. The strategy implementation consisted of following the guidance and executing the remaining routine transformations.

5.2.1.5. Example of data and analysis of teacher-guided AR. Data: Adam is trying to
solve the task ‘Determine angle AOB in [Figure 3]’.[33,p.159] Adam asks the teacher for
help. The teacher reads the inscribed angle theorem to Adam and states that the centre angle
is always double the angle on the circumference. Adam completes the task.
Analysis: The strategy choice is to ask the teacher, which leads to a strategy implemen-
tation guided by the teacher. No predicative or verificative arguments are visible.

5.2.1.6. Example of data and analysis of (local) CMR. Data: Lars is working on the task
‘Determine x’ [31,p.265] (Figure 4) with his peer John. Lars and John have the following
dialogue:

Figure 4. Figure for task solved using CMR.

Lars 5... 3x, 5,1 [x]
John Yeah, I was wondering.
Lars If you like add all the x:es and divide by 180. I don’t know.
John Eeh, what...
Lars Or, how should I solve it?
John You have to determine x. That means that you have to determine each angle.
Lars Yeah.
John Yeah.
Lars That was what I meant. But look, maybe it’s 5 times x [pointing at 5x in the
figure] and that [pointing at 3x in the figure] is 3 times x.
John Yes, but...
Lars That [pointing at x in the figure] is x, maybe. Something like that, maybe.
John The only thing we know...
Lars Yeah?
John . . . is that the triangle is 180.
Lars Yes, that is what I meant.
John Yeah, ehh. And then, all there is, is a . . .
Lars What if, all you have to do is like add all the x:es? Then that is 8x [pointing at
3x and 5x in the figure], and with that [pointing at x in the figure] it becomes
9x. And then you divide that by 180. The answer you get, you multiply by 3.
And that should be [angle with the 3x]... there you take times 5. And there
[angle with the x] you only take x itself.

The task is then correctly solved.


Analysis: First, it is evident that Lars constructs, at least partially, a novel reasoning sequence: the sum of the angles is 9x, and 180 is then divided by 9. Second, his arguments for his strategy choice to add 5x, 3x and x are implicit and not explicitly communicated. His interpretation of 5x as meaning 5 times the size of the angle x is strengthened by the verification that the sum of the angles adds up to 180◦. Lars finds his solution graphically plausible: the angles seem to be as large as calculated. As a whole, this makes his reasoning sequence plausible. Finally, the arguments used are: the sum of the angles of a triangle is 180◦, and the properties of a variable [x] provide a relation between the angles. Even though Lars is unsure, the arguments used are grounded in intrinsic mathematical properties. The component of the reasoning sequence that Lars has to construct, thereby using CMR, is representing the relationship between the angles by using a variable. If each step in the solution is considered individually, adding the angles and dividing by the number of x’s, the solution can be considered as comprising known algorithms at a year 10 mathematics level. This reasoning sequence is categorized as local CMR because Lars uses CMR to solve the task, although the solving process contains large portions of AR. If a larger part of the solution had consisted of CMR, it could potentially have been categorized as global CMR.
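The solution Lars constructs can be written out in a few lines. A sketch (ours) of the completed reasoning, with 180 divided by 9 rather than the other way around:

```python
# Triangle with angles 5x, 3x and x: the angle sum theorem gives 9x = 180.
x = 180 / 9                  # 20 degrees
angles = [5 * x, 3 * x, x]   # 100, 60 and 20 degrees
assert sum(angles) == 180    # the verification Lars appeals to
```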

5.2.2. Task analysis procedure (reasoning required)


The task analysis procedure adopted to categorize the reasoning required to solve a task
is based on establishing whether text-guided AR is possible or if CMR is required. The
method was introduced by Lithner [24] and Palm et al. [29] and has been further developed
by Jäder et al.[20] The first step is to identify possible algorithms that could solve the task. Second, a search is conducted for text-guided AR information (solved examples, presented rules, theorems, facts, and previous tasks) that is the same as or similar to the task. Finally, the task is categorized as requiring global CMR, local CMR or text-guided AR, depending on possible similarities between the task and the text-guided AR information.

5.2.2.1. Example of data and analysis. Data: The task example (see ‘Example of data and analysis of peer-guided AR’, task (a)) is about a TV that is bought via a payment plan.
Analysis: (1) Possible algorithms are identified for the task; e.g. the algorithm for solving the task could be 15,000/5 = 3000, 1.0575 × 3000 = 3172.5. (2) On the page before the page where the task appears, a worked example containing the same type of task question and information is available, showing the same algorithms that are needed to solve the task with the TV. The only difference is that the worked example is in the context of buying a car instead of a TV. (3) The task is categorized as only requiring text-guided AR because of the close similarities with the worked example.
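Step (1)’s candidate algorithm is a one-line computation. As a sketch of the identified algorithm (following the paper’s figures, where a fifth of the price is paid off and the 5.75% interest is applied to that instalment):

```python
price = 15000                              # crowns
instalment = price / 5                     # a fifth of the debt: 3000 crowns
first_year_payment = 1.0575 * instalment   # instalment plus 5.75% interest
# first_year_payment is 3172.5, matching the algorithm identified in step (1)
```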

5.2.3. Validity
The data analysis, i.e. the categorization of the reasoning used and the reasoning required, was done by two coders to establish interrater reliability. Both coders categorized the required reasoning of all the textbook tasks, and they reached agreement on 37 of the 39 tasks (95%). The categorization of the reasoning used was first done by one coder. The categorizations considered borderline by the first coder (5/122) were then coded by a second coder; in four of these five cases the two coders made the same categorization. The discrepancies in the categorization of both used and required reasoning were resolved through discussion.

6. Results
The textbook task-solving performed by seven groups of students was analysed. Overall,
the students worked on 86 textbook tasks that were divided into 122 subtasks; 106 of these subtasks contained sufficient data in the form of oral communication to be analysed.
The textbook tasks were labelled according to how difficult the textbook authors con-
sidered the tasks to be. All the textbooks pitched the tasks at three levels of difficulty.
Data showed that students rarely attempted the more difficult tasks. Eighty-four per cent
of the encountered tasks belonged to the easiest level of difficulty. Sixteen per cent of the
attempted tasks belonged to the intermediate difficulty. No attempts were made to solve
the most difficult tasks. Since the students mainly worked on the least difficult tasks, the cognitive demand of the attempted tasks can be assumed not to be at the highest level. In addition, none of the students worked on tasks belonging to activity
sections (headings containing e.g. ‘Activity’, or ‘Explore’).

6.1. The relation between reasoning used and reasoning required


Altogether, 15 students attempted 86 tasks, made up of 39 unique tasks. As displayed in Table 2, the analysis of the textbook tasks showed that 84% (72/86) of the tasks only required text-guided AR using algorithmic templates (e.g. solved examples) presented in the textbook. These tasks are referred to as guided AR tasks below. Twelve per cent (10/86) required local CMR and 5% (4/86) required global CMR. The most frequent types of reasoning used were familiar AR, 43% (37/86), and peer-guided AR, 29% (25/86). Local CMR was used in 8% (7/86) of the tasks, while global CMR was used in only one (1/86) of the tasks.
Using AR on CMR tasks occurred on seven occasions, and these solution attempts were not always correct. However, when a student was guided by a peer (i.e. peer-guided AR) who had already solved the task using CMR, the task was correctly solved. Tasks that did not require CMR to be solved were primarily carried out using AR.
The least common type of task (14/86) was the one that required CMR in order to be solved. These tasks seemed to elicit the use of CMR, since all of them were attempted through the use of CMR by at least one student in each group. The tasks were solved on six occasions by global or local CMR, and five were solved by peer-guided AR. A typical way of solving a CMR task in pairs is represented by the dialogue shown previously between Lars and John (see Section 5.2.1.6). Lars solved the task by using local CMR, while John was guided by Lars; thus, John's reasoning about the CMR task was categorized as peer-guided AR. One possible reason why CMR tasks were solved using CMR is that the students had no access to AR. It is not self-evident that a CMR task always results in the use of CMR: in the attempts to solve two CMR tasks, familiar AR was used, without leading to a correct solution. Another possible reason why the students in all groups solved these CMR tasks is that the tasks were not among the most complex ones and were conceptually relatively simple.
Guided AR tasks were almost always solved via AR strategies. Familiar AR was the
most frequent reasoning type adopted to solve guided AR tasks. This can be considered

Table 2. Frequency of occurrences of reasoning required and reasoning used in solving textbook tasks.

| Reasoning required | Global CMR | Local CMR | FAR | DAR | Peer-GAR | Teacher-GAR | Text-GAR | Not categorizable | Total |
|--------------------|-----------|-----------|-----|-----|----------|-------------|----------|-------------------|-------|
| GCMR               | 1         | 0         | 2   | 0   | 1        | 0           | 0        | 0                 | 4     |
| LCMR               | 0         | 5         | 0   | 0   | 4        | 0           | 0        | 1                 | 10    |
| GAR                | 0         | 2         | 35  | 2   | 20       | 8           | 2        | 3                 | 72    |
| Total              | 1         | 7         | 37  | 2   | 25       | 8           | 2        | 4                 | 86    |
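As a check on the proportions quoted above, they can be recomputed from the marginal totals of Table 2. The sketch below is illustrative only (the counts are transcribed from the table; the script is not part of the study):

```python
# Row totals of Table 2: attempted tasks by reasoning required.
required = {"GCMR": 4, "LCMR": 10, "GAR": 72}
# Column totals of Table 2: attempted tasks by reasoning used.
used = {"Global CMR": 1, "Local CMR": 7, "FAR": 37, "DAR": 2,
        "Peer-GAR": 25, "Teacher-GAR": 8, "Text-GAR": 2, "Not cat.": 4}

total = sum(required.values())  # 86 attempted tasks in all

# Print each category's count and its share of all attempted tasks.
for name, count in {**required, **used}.items():
    print(f"{name:12} {count:3}  {count / total:.0%}")
```

Running this reproduces the figures in the text: 84% guided AR tasks, 12% local CMR tasks and 5% global CMR tasks required, and 43% familiar AR and 29% peer-guided AR among the reasoning types used.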

an expected result, since the students had previously worked with the algorithms in question in their textbook and/or the teacher had presented the method to them. One reason for using familiar AR was that the student recognized the task type and could then link the task to a known algorithm via a clue or a lead strategy. An example of this is when Adam solved a task via the inscribed angle theorem (see Section 5.2.1.1).

6.2. The relation between reasoning used and rate of correct solutions
Eighty-two per cent of the tasks were correctly solved. When using global CMR or local
CMR, 100% of the tasks were solved correctly, while 80% of the tasks were solved correctly
when using AR. Table 3 shows that all tasks solved through global CMR, local CMR and
text-guided AR yielded a correct solution. However, these three reasoning types rarely
occurred.
To further explore the relationship between reasoning used and the rate of correct task
solutions a more fine-grained analysis was carried out by examining the relation between the
subtask reasoning used and the rate of correct solutions when solving a subtask. Seventy-two
per cent (76/106) of the subtasks were correctly solved. The most common way of trying to
solve a task was via AR, which was used in 92% (98/106) of the subtask solution attempts.
Memorized reasoning was used only once (1/106); no further analysis was carried out concerning it because it occurred so rarely. The most frequent reasoning type was familiar AR, which was observed in 44% (47/106) of all the task-solving attempts. Peer-guided AR was the second most common type of reasoning and accounted for around a quarter (29/106) of the solution attempts. Teacher-guided AR was the third most common reasoning type and emerged in 12% (13/106) of the task-solving attempts. Table 4 shows
the proportion of correct solutions for each of the reasoning types used.
One possible explanation why tasks were correctly solved when using CMR (100%, 7/106) could be that the limited depth of the tasks made the CMR part very minor. Another possible explanation is that the student engaged himself or herself in some sort of struggle.[35] By 'struggle', Hiebert and Grouws [35] mean 'that students expend effort to make sense of mathematics, to figure something out that is not immediately apparent' (p. 387). This struggle is exemplified in Lars's task solution presented above (see Section 5.2.1.6). Lars could not use peer-guided AR, since his peer, John, was seeking help from Lars; nor did Lars use the answer section, the textbook or the teacher's assistance to reach a solution. Rather, it was solving a task that was conceptually within reach, involving mathematical ideas that were understandable but not yet well formed,[36] that led to the use of CMR and a correct solution. Since the students in the study rarely used CMR, especially global CMR, it was difficult to hypothesize further reasons for using CMR.
Table 3. Students' correctly solved tasks using different types of reasoning.

| GCMR | LCMR | FAR | DAR | Peer-guided AR | Teacher-guided AR | Text-guided AR | Other AR* |
|------|------|-----|-----|----------------|-------------------|----------------|-----------|
| 100% (1/1) | 100% (7/7) | 81% (30/37) | 0% (0/2) | 92% (23/25) | 63% (5/8) | 100% (2/2) | 50% (2/4) |

*Categorized as an AR strategy, but the analysis could not clarify which type of AR.

Table 4. For each reasoning type, the proportion of correctly solved subtasks.

| Reasoning used | Correctly solved subtasks |
|----------------|---------------------------|
| Global CMR | 100% (1/1) |
| Local CMR | 100% (6/6) |
| CMR in total | 100% (7/7) |
| Memorized reasoning | 100% (1/1) |
| Familiar AR | 68% (32/47) |
| Familiar or delimited AR* | 0% (0/3) |
| Delimited AR | 0% (0/3) |
| Peer-guided AR | 90% (26/29) |
| Teacher-guided AR | 46% (6/13) |
| Text-guided AR | 67% (2/3) |
| Imitative reasoning in total | 70% (69/99) |
| Total | 72% (76/106) |

*In three cases it was not possible to distinguish whether the reasoning used was familiar AR or delimited AR.

Whether the use of familiar AR led to a correct solution was connected to whether the student could identify which algorithm to use by searching for familiar clues or leads in the task; such a clue in turn triggered the strategy choice of algorithm. When Adam solved
a task by using the inscribed angle theorem (Section 5.2.1.1.), he identified and used a
familiar algorithm, in a straightforward manner and without verificative arguments. Besides
checking the answer with the answer section, explicit verifications were rare in the cases
where the students believed that they had identified and used the correct algorithm (i.e.
using familiar AR). When the strategy choice was based on surface properties it sometimes
led to an incorrect algorithm choice, which naturally resulted in an incorrect answer. This
superficial way of solving tasks was the main reason for incorrect solutions when using
familiar AR.
The reason why almost all subtasks approached using peer-guided AR were correctly solved was that the peers guiding the students had in most cases already correctly solved the task themselves. Usually whole solutions, or at least essential parts of a solution, were presented to the student under guidance. This is illustrated by Simon's solution of how much more it costs to pay for the TV via instalments (Section 5.2.1.1). In the majority of such cases, the student receiving the information did not ask for explanations, nor did the peer giving the guidance provide any. One may note that this parallels the didactical contract, introduced by Brousseau,[30] between a teacher and her student. The didactical contract is a (mainly implicit) agreement between the teacher and the student determining their respective roles in the mathematics classroom. The students of this study acted as if there was an (implicit) agreement that the guide does not have to provide any justifications for choices and claims, and that the guided student does not ask for such justifications. In some cases of peer-guided AR the students carried out a simple but not mathematically complete verification of the answer. The verifications that did take place often concluded that the answer seemed reasonable in relation to the task context.
Text-guided AR was seldom used (3/106). It was successful in two cases where the
students could connect the task to a presented theorem in the textbook via surface similari-
ties. Text-guided AR might be more common among students who do not work together in
task-solving.[10]
None of the instances where a student requested help from a teacher led to CMR by the student. Instead, teacher assistance often resulted in the use of teacher-guided AR. Teacher-guided AR was one of the reasoning types that led to the fewest correct solutions: seven of the 13 subtasks approached using teacher-guided AR were incorrectly solved. One identified reason was that the teacher did not provide adequate assistance for the students to make progress in their task-solving. For example, the teacher thought that the student needed help to complete one algorithm, while the student actually needed help with a different algorithm, or perhaps with understanding a concept. When teacher-guided AR did generate a correct solution, it was because the teacher led the student through the task.
When a student tried to solve a task by using delimited AR, this by definition involved using the surface properties of the task. The student may have chosen among a number of algorithms at hand, delimited by the specific situation; however, it would be more or less a matter of chance whether the chosen algorithm was the correct one. This is why delimited AR frequently led to incorrect solutions. No task in the study was correctly solved using delimited AR.

6.3. Use of the textbook


Even when algorithmic templates (e.g. solved examples) were available students hardly
used the worked examples of the textbook in their task-solving. Most interesting were the

| Year | Remaining loan | Yearly interest | To pay to the bank |
|------|----------------|-----------------|--------------------|
| 1    | 30 000         |                 |                    |

Figure 5. The table to be completed in the task solved using the answer section.

incorrect solutions where students did not turn to the textbook to seek assistance. Twenty-
nine of 106 subtasks were incorrectly solved. In almost all of these cases (23/29) there was
a corresponding solved example that had an algorithm that could have been applied to solve
the task (i.e. text-guided AR information was readily available, but not used).
Students clearly focused on finding an algorithmic method to produce an answer, rather than on understanding the task and the solution method. This was emphasized by the frequent use of the textbooks' answer sections, which played a paramount role in students' task-solving. The answers in the answer section were used as hints for making progress in solving a task, or as a means of verifying an answer. The answer to a task often helped a student make progress using an AR strategy. Ann worked on the following task: 'A loan of 30,000 crowns will be amortized with three equally large amounts over three years. The interest rate is 5.20%. Complete the table'. See Figure 5.
Ann first attempted to complete the column ‘Remaining loan’, but could then progress
no further. Her strategy to complete the column was to consult the answer section and view
the completed table. Reading the table in the answer section led her to an algorithm of
subtracting 10,000 crowns for every year. A similar process was used for completing the
remaining columns; not knowing how to proceed, she consulted the answer section which
led to an algorithm and a solution.
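For reference, the completed table can be reconstructed with a few lines of arithmetic. The sketch below assumes, as the answer section's table implies, that the yearly interest is charged on the loan remaining at the start of each year:

```python
# A 30,000 crown loan amortized with three equal amounts (10,000 crowns
# per year) at a yearly interest rate of 5.20%. Interest is assumed to be
# charged on the loan remaining at the start of each year.
loan, rate, years = 30_000, 0.052, 3
amortization = loan / years  # the 10,000 crowns per year Ann read off

print("Year  Remaining loan  Yearly interest  To pay to the bank")
for year in range(1, years + 1):
    remaining = loan - (year - 1) * amortization  # loan at the start of the year
    interest = remaining * rate                   # yearly interest on that amount
    payment = amortization + interest             # amortization plus interest
    print(f"{year:>4}  {remaining:>14.0f}  {interest:>15.0f}  {payment:>18.0f}")
```

The loop yields 30 000, 1 560 and 11 560 for year 1; 20 000, 1 040 and 11 040 for year 2; and 10 000, 520 and 10 520 for year 3, so the three payments total 33,120 crowns, i.e. 3,120 crowns of interest on top of the loan.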

6.4. Summary of main results


The students in this study did not attempt to solve more complex or difficult tasks in the
textbook. The relation between the reasoning required and the reasoning used was that AR
was almost always used when students attempt text-guided AR tasks. CMR was seldom
used. One important relationship between used reasoning and the rate of correct solutions
was that peer-guided AR often led to a correct solution since the peer doing the guiding
had often already completed the task. All the tasks that were attempted with CMR were
correctly solved. This was also the case with the use of text-guided AR, even though this
use was very rare. Students often turned to peer-guided AR when solving a task. When
peer-guided AR was used there was little focus on obtaining and providing explanations as
to why a certain solution and answer were correct. None of the occasions where the teacher
assisted a student with a task led to CMR by the student.

7. Discussion
This study shows that there was a discrepancy between what students actually do during their task-solving and the mathematical reasoning competence stated as important by research and by the Swedish national curricula.[1,2,23] The results of this study suggest that students engage in CMR to a very limited extent, or not at all, while solving textbook tasks.

7.1. Worked tasks


Students mainly worked with the least difficult tasks. Following the work by Jäder et al.,[20] if students were to work on more challenging tasks and more on tasks in activity sections, or if there was a larger proportion of CMR tasks among the less difficult tasks, the students would be given more opportunities to learn CMR. Since CMR is a competence that all students are required to develop in order to attain a broad mathematical competence, it is essential for textbooks to include more CMR tasks designed at a less difficult level.

7.2. Guided AR as a hindrance in developing CMR


In solution attempts in which students failed to solve the task on their own, students often turned to peer-guided AR or teacher-guided AR. These strategies often led to a correct solution, but they can be seen as missed opportunities to learn CMR. This was particularly clear when students solved tasks that required CMR by means of peer-guided AR. Schoenfeld [37] states that social interaction is central to individual learning; the present study shows that social interaction is not sufficient on its own. Furthermore, the study's results indicate that when students work together informally and receive help from each other, this might decrease their opportunities to learn CMR, because tasks that are only solvable using CMR are instead solved using peer-guided AR. Research by Fuchs et al. [38] shows that students' collaboration must be carefully organized; otherwise the explanations the students receive from each other tend to be algorithmic rather than conceptual in nature.
The results from the present study suggest that working together during task-solving has
to be organized in such a manner that opportunities to learn CMR are provided. Previous
research on cooperative learning has established a clear connection between the use of
elaborated explanation, elaborated help-seeking and learning.[39–42] Empirical studies
by Webb and Mastergeorge [42] suggest that a student seeking help must (1) ask precise
questions, (2) be persistent in asking until she/he understands the explanation, and (3) apply
the explanation given to the task at hand. The help-giver must give an opportunity for the
help-seeker to solve the task herself or himself.
On none of the occasions where a teacher helped a student did the teacher's assistance lead to the use of CMR in solving the task. Therefore, the way in which the teacher helps a student in his or her task-solving is of great importance to how well the student learns to use task-solving as a means to develop CMR. In this regard, Stein et al. [19] acknowledge that a teacher using inappropriate help-giving strategies will cause a decline in a task's depth and in its usefulness as an opportunity to learn; this manner of help-giving might instead fuel an algorithmic manner of solving tasks. Working on a task appropriately can preserve the intended goals of the task rather than trivializing them. According to Stein et al., the teacher has a central role in maintaining tasks at a high level of cognitive demand. This can be accomplished by (1) scaffolding the student's thinking, (2) modelling high-level thinking, (3) pursuing explanations, (4) selecting tasks that build on prior knowledge, (5) drawing frequently on conceptual connections, and (6) providing sufficient time to explore the task and the mathematics connected to it.

7.3. The textbook and the opportunities to develop CMR


The data suggest that the textbook theory and worked examples are hardly used by students when solving textbook tasks. This might be seen as surprising, especially since almost all of the incorrectly solved tasks had an algorithmic template presented in the textbook. In Lithner's [10] study, which examined university students' reasoning when solving textbook tasks, the textbooks' theory sections and worked examples were used frequently. A reason that the results of the present study do not harmonize with those of Lithner [10] might be the different categories of students, and that the students in Lithner's study worked alone and thus had no access to teacher- or peer-guided AR. Johansson's [43] study of Swedish lower secondary school classes indicates that textbook tasks are central in the teacher's interaction when helping students during their task-solving. Haggarty and Pepin [44] have reported differences in how common mathematics textbooks are composed and how they are used in English, French and German lower secondary classrooms. Their study revealed that French textbooks communicated mathematics that was comprehensive and presented in situations and contexts that were cognitively challenging, and that students in France were encouraged by their teachers to use the book as a resource in their learning. Apart from the studies by Lithner,[10] Johansson [43] and Haggarty and Pepin,[44] little research has examined students' use of textbooks.[16,45] The present study has not aimed primarily at studying students' use of the textbook per se, but aspects of the findings could contribute some insights into how the textbook is used in a certain context. It is pointed out herein that there might be a need for a critical view of how a textbook should be composed, and of how it should be used by teachers and students, in order to assist students as much as possible in their mathematics learning.
The students' task-solving in the present study was answer oriented, and the answer section was frequently consulted; this has also been established previously in a Swedish context.[12,13] Data from the present study provide evidence that the textbooks' answer sections were used as an integral part of students' task-solving. It was common for the answer in the answer section to be used as a means of finding an algorithm that could lead to a correct answer. The results from this study suggest that the way the answer section is used by the students does not scaffold the development of CMR.

7.4. Further research


The findings of this study demonstrate that students get few opportunities to learn CMR, and further research on how this situation can be improved is required. Since students hardly utilize the theory or worked-example sections of current textbooks, one avenue would be to investigate how a textbook should communicate, how it should be structured, and how it should be used to best promote students' acquisition of CMR. The study has also shown that the answer sections of textbooks have an impact on task-solving. Hence, it would be important to examine how the answer section should be constructed and used so as to assist students' development of CMR.

Disclosure statement
No potential conflict of interest was reported by the authors.

References
[1] National Council of Teachers of Mathematics. Principles and standards for school mathematics.
Reston (VA): The Council Teachers of Mathematics; 2000.
[2] Niss M. Mathematical competencies and the learning of mathematics: the Danish KOM project.
Third Mediterranean Conference on Mathematics Education; 2003 January 3–5; Athens,
Greece. p. 115–124.
[3] Ross KA. Doing and proving: the place of algorithms and proofs in school mathematics. Am
Math Mon. 1998;105(3):252–255.
[4] Hiebert J. What research says about the NCTM standards. In: Kilpatrick J, Martin WG, Schifter
D, editors. A research companion to principles and standards for school mathematics. Reston
(VA): National Council of Teachers of Mathematics; 2003. p. 5–23.
[5] Schoenfeld AH. Mathematical problem solving. Orlando (FL): Academic Press; 1985.
[6] Schmidt WH, McKnight CC, Houang RT, Wang H, Wiley DE, Wolfe RG. Why schools matter:
a cross-national comparison of curriculum and learning. San Francisco (CA): Jossey-Bass;
2001.
[7] Bergqvist T, Lithner J, Sumpter L. Upper secondary students’ task reasoning. Int J Math Educ
Sci Technol. 2008;39(1):1–12.
[8] Lithner J. Mathematical reasoning and familiar procedures. Int J Math Educ Sci Technol.
2000;31(1):83–95.
[9] Lithner J. Mathematical reasoning in task solving. Educ Stud Math. 2000;41(2):165–190.
[10] Lithner J. Students’ mathematical reasoning in university textbook exercises. Educ Stud Math.
2003;52(1):29–55.
[11] Doyle W. Academic work. Rev Educ Res. 1983;53(2):159–199.
[12] Swedish National Agency for Education (Skolverket). Lusten att lära–Med fokus på matematik:
Nationella kvalitetsgranskningar 2001–2002 [The desire to learn with a focus on mathemat-
ics: national quality audits 2001–2002] (No. 221). Stockholm: Swedish National Agency for
Education; 2003. Swedish.
[13] Swedish Schools Inspectorate (Skolinspektionen). Undervisningen i matematik i gym-
nasieskolan [Mathematics education in upper secondary school] (No. 2010:13). Stockholm:
Swedish Schools Inspectorate; 2010. Swedish.
[14] Johansson M. Teaching mathematics with textbooks: a classroom and curricular perspective
[doctoral dissertation thesis]. Luleå: Luleå University of Technology; 2006.
[15] Kajander A, Lovric M. Mathematics textbooks and their potential role in supporting miscon-
ceptions. Int J Math Educ Sci Technol. 2009;40(2):173–181.
[16] Love E, Pimm D. 'This is so': a text on texts. In: Bishop AJ, editor. International handbook of mathematics education. Part 1. Dordrecht: Kluwer; 1996. p. 371–409.
[17] Törnroos J. Mathematics textbooks, opportunity to learn and student achievement. Stud Educ
Eval. 2005;31(4):315–327.
[18] Valverde GA, Bianchi LJ, Wolfe RG, Schmidt WH, Houang RT. In: Valverde GA, editor.
According to the book: using TIMSS to investigate the translation of policy into practice
through the world of textbooks. Dordrecht: Kluwer Academic Publishers; 2002.
[19] Stein M, Remillard JT, Smith M. How curriculum influences student learning. In: Lester FK,
editor. Second handbook of research on mathematics teaching and learning: a project of the
National Council of Teachers of Mathematics. Charlotte (NC): Information Age Publishing;
2007. p. 319–369.
[20] Jäder J, Lithner J, Sidenvall J. An international comparison of reasoning requirements in
mathematics textbooks. Forthcoming 2014.
[21] Ball D, Bass H. Making mathematics reasonable in school. In: Kilpatrick J, Martin WG,
Schifter D, editors. A research companion to principles and standards for school mathematics.
Reston (VA): National Council of Teachers of Mathematics; 2003. p. 27–44.
[22] Silver E. Fostering creativity through instruction rich in mathematical problem solving and
problem posing. ZDM. 1997;29(3):75–80.

[23] Swedish National Agency for Education (Skolverket). Läroplan, examensmål och gymnasiege-
mensamma ämnen för gymnasieskola 2011 [Curriculum, exam goals and core subjects in upper
secondary school 2011]. Stockholm: Fritzes; 2011. Swedish.
[24] Lithner J. A research framework for creative and imitative reasoning. Educ Stud Math.
2008;67(3):255–276.
[25] Bergqvist T., Lithner J. Mathematical reasoning in teachers’ presentations. J Math Behav.
2012;31(2):252–269.
[26] Boesen J, Lithner J, Palm T. The relation between types of assessment tasks and the mathemat-
ical reasoning students use. Educ Stud Math. 2010;75(1):89–105.
[27] Lithner J. Mathematical reasoning in calculus textbook exercises. J Math Behav. 2004;23(4):
405–427.
[28] Långström P, Lithner J. Svenska gymnasieelevers matematiska resonemang och hjälpprocesser i klassrumsmiljö [Swedish upper secondary students' mathematical reasoning and aid processes in a classroom environment]. In: Matematikelevers strategier för fel- och hjälpsökning [Mathematics students' strategies for error and help seeking] [licentiate thesis]. Umeå: Umeå University; 2008. p. 51–102. Swedish.
[29] Palm T, Boesen J, Lithner J. Mathematical reasoning requirements in Swedish upper secondary
level assessments. Math Thinking Learn. 2011;13(3):221–246.
[30] Brousseau G, Balacheff N. Theory of didactical situations in mathematics [electronic resource],
1970-1990. Dordrecht: Kluwer Academic Publishers; 1997.
[31] Szabo A. Matematik origo. 1b [Mathematics origo. 1b]. 2nd ed. Stockholm: Bonnier Utbild-
ning; 2011. Swedish.
[32] Alfredsson L. Matematik 5000. kurs 2c blå, lärobok [Mathematics 5000. Course 2c blue,
textbook]. 1st ed. Stockholm: Natur & Kultur; 2011. Swedish.
[33] Szabo A. Matematik origo. 2c [Mathematics origo. 2c]. 2nd ed. Stockholm: Sanoma Utbildning;
2012. Swedish.
[34] Alfredsson L, Erixon P, Heikne H. Matematik 5000. kurs 1a gul, lärobok [Mathematics 5000.
Course 1a yellow, textbook]. 1st ed. Stockholm: Natur & Kultur; 2011. Swedish.
[35] Hiebert J, Grouws DA. The effects of classroom mathematics teaching on students’ learning.
In: Lester FK, editor. Second handbook of research on mathematics teaching and learning: a
project of the National Council of Teachers of Mathematics. Charlotte (NC): Information Age
Publishing; 2007. p. 371–404.
[36] Hiebert J, Carpenter TP, Fennema E, Fuson K, Human P, Murray H, Oliver A, Wearne D.
Problem solving as a basis for reform in curriculum and instruction: the case of mathematics.
Educ Res. 1996;25(4):12–21.
[37] Schoenfeld AH. Ideas in the air: speculations on small group learning, environmental and
cultural influences on cognition, and epistemology. Int J Educ Res. 1989;13(1):71–88.
[38] Fuchs LS, Fuchs D, Hamlett CL, Phillips NB, Karns K, Dutka S. Enhancing students’ helping
behavior during peer-mediated instruction with conceptual mathematical explanations. Elem
School J. 1997;97(3):223–249.
[39] Hiebert J, Wearne D. Instructional tasks, classroom discourse, and students’ learning in second-
grade arithmetic. Am Educ Res J. 1993;30(2):393–425.
[40] Webb NM. Peer interaction and learning in small groups. Int J Educ Res. 1989;13(1):21–39.
[41] Webb NM. Task-related verbal interaction and mathematics learning in small groups. J Res
Math Educ. 1991;22(5):366–389.
[42] Webb NM, Mastergeorge A. Chapter 4: promoting effective helping behavior in peer-directed
groups. Int J Educ Res. 2003;39:73–97.
[43] Johansson M. Mathematical meaning making and textbook tasks. Learn Math. 2007;45(1):45–
51.
[44] Haggarty L, Pepin B. An investigation of mathematics textbooks and their use in English,
French, and German classrooms: who gets an opportunity to learn what? Br Educ Res J.
2002;28(4):567–590.
[45] Rezat S. Interactions of teachers’ and students’ use of mathematics textbooks. In: Gueudet G,
Pipin B, Trouche L, editors. From text to ‘lived’ resources: mathematics curriculum materials
and teacher development [electronic resource]. New York (NY): Springer; 2012. p. 231–245.
