Students' Reasoning in Mathematics Textbook Task-Solving
To cite this article: Johan Sidenvall, Johan Lithner & Jonas Jäder (2015) Students’ reasoning in mathematics
textbook task-solving, International Journal of Mathematical Education in Science and Technology, 46:4, 533-
552, DOI: 10.1080/0020739X.2014.992986
a Department of Social and Welfare Studies, Linköping University, Sweden; b Umeå Mathematics
Education Research Centre, Umeå University, Sweden
(Received 21 July 2014)
1. Introduction
Reasoning is a fundamental aspect of mathematics.[1–3] Research has shown that there is
too much emphasis on rote learning and superficial reasoning in mathematical education.[4]
According to Hiebert [4] ‘One of the most reliable findings from research on teaching and
learning is that students learn what they are given opportunities to learn’ (p.10). To learn
both procedures and mathematically founded reasoning students must practise how to
solve both routine and non-routine tasks.[5] The textbook is an important source for giving
students an opportunity to learn.[6] This study does not primarily examine the textbook
itself as an opportunity to learn, nor does it focus on what is actually being learned. The
study focuses on students’ solving textbook tasks related to their reasoning; examining how
the type of reasoning affects the ability to solve textbook tasks. Rote learning is related
to students’ tendency to use sometimes inefficient and mathematically superficial imitative
strategies rather than creating their own solutions through reasoning.[7–10] Tasks are a
cornerstone of students’ work with mathematics and according to Doyle [11] ‘influence
students by directing their attention to particular aspects of content and specifying ways of
processing information’ (p. 161). The overall purpose of this study is to examine to what
extent students create their own solutions through mathematically founded reasoning to
solve textbook tasks, and to relate the students’ reasoning with their ability in completing
the tasks.
*Corresponding author. Email: johan.sidenvall@liu.se
© 2014 Taylor & Francis
This paper is structured as follows. The Background section shows the importance of the textbook and its tasks, followed by a background on the reasoning competence. The Research framework section presents the research framework. The Methods section describes both the data collection and the data analysis. Results are presented in the Results section, and the paper ends with a Discussion of the results.
2. Background
2.1. The textbook and textbook tasks
Textbooks have been shown to have a great impact on classroom work and to form the
backbone of mathematical teaching both in Sweden [12–14] and internationally.[6,15–18]
The textbook tasks that students engage in largely determine what they learn about
mathematics and how they learn it.[19] By studying data from the Third International
Mathematics and Science Study, Valverde et al. [18] found that a majority of the textbooks
for 9- and 13-year-olds and students in the final year of secondary school mainly focused on
practising routine skills. A low proportion of the textbooks’ content aimed at investigation,
problem-solving or making mathematical generalizations. An international textbook task
analysis [20] showed that around 75% of the tasks in a common Swedish secondary
school mathematics textbook were solvable by mimicking previously presented algorithmic
templates. Twenty-five per cent of the tasks required at least some mathematically founded
reasoning to be solved. The study also showed that a majority of the tasks requiring
mathematically founded reasoning were classified by the textbook authors as more difficult
tasks.
2.2. Reasoning
Ball and Bass [21] state that ‘mathematical reasoning is no less than a basic skill’ (p.28).
In the following, a broad definition of reasoning is applied. This means that reasoning can
be found in all levels of mathematical understanding. In the literature ‘reasoning’ is often
defined as a skill of high deductive-logical quality.[22] This is not the case in this study;
our overall assumption is that mathematically founded reasoning can be found and used
at all levels of difficulty in solving mathematical tasks. This is also in line with national
curricula and research concerning what competences should be developed.[1,2,4,23] The
definition of reasoning used here is: ‘reasoning is the line of thought adopted to produce
assertions and reach conclusions in task-solving. It is not necessarily based on formal logic,
thus not restricted to proof, and may even be incorrect as long as there are some kind of
sensible (to the reasoner) reasons backing it’.[24,p.257] Empirical studies [7,10,25–29]
using this wider definition of reasoning lead to a division into imitative reasoning and
creative mathematically founded reasoning (CMR). To use imitative reasoning is to solve
tasks by methods that are known or provided by someone else (e.g. a teacher or a textbook).
To use CMR is to use a new, mathematically founded and intrinsic line of argument (see
Research framework section).[24] When opportunities to learn CMR are given it will
also lead to opportunities to develop other related important mathematical competences:
problem-solving and conceptual understanding.[24] Such competences are not developed
by imitative reasoning alone.[4,5,24]
A framework of analysis will be used to categorize the reasoning used by students to
solve a task, and whether or not they succeed. Another framework of analysis will be used
to categorize the anticipated reasoning required for a task according to the level of the
mathematical course. Both frameworks use the theoretical framework by Lithner.[24]
• What is the relationship between the reasoning required and the reasoning used when
solving textbook tasks?
• What is the relationship between the reasoning used and the rate of correct solutions
when solving textbook tasks?
4. Research framework
This study is based on a research framework formulated by Lithner [24] in which reasoning
in task-solving is seen as a product of a student’s thinking process. Reasoning is seen as
data in empirical studies, which can be documented through text, graphical representation,
video recordings, etc., and these data are seen as traces of the students’ thoughts.[10]
The thinking processes are dependent on the students’ mathematical competences. In turn, these competences are formed by the sociocultural milieu.[24] The task-solving process according to the framework comprises the following four steps: (1) a (sub)task is met; (2) a strategy choice is made; (3) the strategy is implemented; (4) a conclusion is reached.
4.1. Imitative reasoning
Imitative reasoning is often suitable for solving routine tasks. The six different types of
imitative reasoning are as follows.
Memorized reasoning is characterized by the strategy of recalling a complete answer
(e.g. a proof) and the implementation of this choice consists of merely writing it down.
Many school tasks normally require calculations where it is more appropriate to recall
an algorithm rather than the answer. This provides a basis for the other main type of
imitative reasoning, algorithmic reasoning (AR). In this regard, ‘An algorithm is a finite
sequence of executable instructions which allows one to find a definite result for a given
class of problems’.[30,p.129] The nth transition does not depend on any circumstance
unforeseen in the (n − 1)th transition; nor on finding new information, any new decision,
any interpretation, and thus on any meaning that one could attribute to them. Thus, an
algorithm can be determined in advance. AR is characterized by a strategy of searching
for or recalling a certain algorithm that is thought to be suitable and then implementing
this algorithm in task-solving.[24] ‘The remaining parts of the strategy implementation are
trivial for the reasoner, only a careless mistake can prevent a correct answer from being
reached. How to identify a suitable algorithm is fundamental, and the rest is relatively
straightforward’.[29,p.225] AR can be further divided into five subgroups in which an
algorithm is obtained: familiar AR, delimited AR, and three types of guided AR (peer-guided AR, teacher-guided AR and text-guided AR).[7,24] The different AR types are as
follows.
Familiar AR is connected to a strategy of searching for familiar (perhaps superficial)
clues or leads that in turn trigger a strategy choice of which algorithm to choose. A student
might use familiar AR when solving an equation where the student knows the corresponding
algorithms.
Delimited AR may be employed when a student cannot connect a familiar algorithm
to the task. Instead, an algorithm is chosen from a set that is delimited by the student
through the algorithms’ surface relations to the task. If the implementation does not lead
to a reasonable conclusion (to the reasoner) the reasoning sequence is simply terminated
and another algorithm may be chosen from the set. An example from Bergqvist et al. [7]
illustrates a student’s use of delimited AR in Sally’s attempt to solve ‘Find the largest and smallest values of the function y = 7 + 3x − x² on the interval [1, 5]’. She differentiates y, solves y′(x) = 0 (x = 1.5) and evaluates y(1.5) = 9.25. She hesitates: ‘I think that I should have got two values, and I don’t know why I didn’t, what have I done wrong’. She abandons this method without reflection since it did not produce an expected answer. Instead she moves on to two different methods using a graphic calculator. These subsequent attempts are also abandoned since they do not produce an expected answer. Sally finally makes the incorrect strategy choice to solve 7 + 3x − x² = 0 and obtains two values, x1 ≈ 4.54 and x2 ≈ −1.54, which she regards as a correct solution even though this does not solve the task.
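The task Sally attempted can be checked directly. Below is a minimal sketch, not from the paper, of the algorithm that would have solved it, assuming the function is y = 7 + 3x − x² on [1, 5] as in the task statement:

```python
# Find the largest and smallest values of y = 7 + 3x - x^2 on [1, 5].
def y(x):
    return 7 + 3 * x - x ** 2

# y'(x) = 3 - 2x = 0 gives the only stationary point, x = 1.5.
stationary = 3 / 2

# On a closed interval, extremes lie at stationary points or endpoints.
candidates = [1, stationary, 5]
values = {x: y(x) for x in candidates}

largest = max(values.values())   # y(1.5) = 9.25
smallest = min(values.values())  # y(5)  = -3
```

Solving y = 0 instead, as Sally finally did, yields the zeros of the function, which says nothing about its largest and smallest values on the interval.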
Peer-guided AR occurs when the reasoner is guided through the task by a peer who
describes the solution procedure. The strategy choice is to follow the peer’s guidance and
the implementation is in simply executing the algorithm without verificative argumentation.
Teacher-guided AR is similar to peer-guided AR with the difference that the teacher serves
as a guide.
Text-guided AR is used when, during his/her strategy choice, a student identifies surface
similarities between the task and an example, definition, theorem, rule, or some other
situation in a text source. The algorithm is implemented without verificative argumentation.
4.2. CMR

CMR fulfils the following criteria.[24]

• Novelty. A new (to the reasoner) reasoning sequence is created, or a forgotten one is re-created.
• Plausibility. There are arguments supporting the strategy choice and/or strategy im-
plementation motivating why the conclusions are true or plausible.
• Mathematical foundation. The arguments are anchored in the intrinsic mathematical
properties of the components involved in the reasoning.
Imitative reasoning contains no CMR. Local CMR always contains considerable parts
of imitative reasoning (e.g. recalling facts and using algorithmic subprocedures) and minor
parts of CMR. Global CMR may contain large parts of imitative reasoning but always
contains significant parts of CMR.
Table 1 shows schematically CMR and imitative reasoning with their subcategories.

Table 1. Categories of reasoning.
Creative mathematically founded reasoning (CMR): global CMR, local CMR.
Imitative reasoning: memorized reasoning; algorithmic reasoning (AR): familiar AR, delimited AR, peer-guided AR, teacher-guided AR, text-guided AR.
5. Methods
5.1. Method of data collection
Data were collected from two Swedish upper secondary schools of about average size.
Upper secondary school is not a compulsory part of the Swedish school system, but 98%
of students from the compulsory lower secondary school continue to upper secondary
school. The overall data collection method adopted in this study is based on Långström and
Lithner’s [28] methods, which enable collection of data during normal classroom work.
Data were collected from two classes from the least mathematically intense programme,
one class from the intermediately intense programme and two classes from the most mathe-
matically intense programme. This was done to cater for the case that students participating
in different tracks might have different mathematical competences. All students attended
the first year of upper secondary school, equivalent to the 10th year of schooling. The
teachers were professionally trained and had at least 10 years of teaching experience.
The lessons (50–60 minutes) were alike in structure and execution. A typical lesson
commenced with a teacher’s presentation (10–25 minutes) followed by student work with
textbook tasks (30–45 minutes). The teacher directed the students to a specific section of the book, but it was to a large extent up to the students to decide which tasks, and at what level of difficulty, to do within that section. The teacher moved around the classroom helping
students in their task-solving. Following the teachers’ presentations, the video recordings
began. Two cameras were placed in the classroom. During the first minutes of the students’
work with textbook tasks, two groups of students were chosen for the data collection. A
camera and microphone were placed in close vicinity to each group. The criteria used when
choosing groups of students were that they seemed to be mathematically active and that a
mathematical dialogue was in progress. The students’ notebooks and textbooks were also a
part of the data. Field notes were taken throughout the lessons. The work from seven groups
of students from four classes was analysed. Six groups consisted of 2 students and one group
consisted of 3 students, giving a total of 15 students. A class from the least mathematically
intense track was excluded from the analysis due to the lack of mathematical task-solving
observed in the groups.
5.2.1.1. Example of data and analysis of familiar AR. Data: Adam solves the task ‘Calculate angle v’ [31,p.159] (Figure 1) straightforwardly by identifying the algorithm connected to a familiar clue or lead, here a geometrical figure where the centre angle is given and the boundary angle is sought: ‘It was like this, I think that it [angle v] is half [of the 190° angle]. Isn’t that right?’
Analysis: Since Adam states: ‘I think that it [angle v] is half [of the 190° angle]’ it is clear that he knows the inscribed angle theorem and thus an algorithmic solution method. If he had not already been familiar with the theorem he would have had to construct it during the task-solving session, which is very difficult. This would have taken much longer than the time he used, and there would be some traces of this construction in the data, for example drawings and probably also some explicit arguments justifying the construction. Therefore the reasoning is categorized as familiar AR.
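The inscribed angle theorem reduces Adam's task to a one-line calculation (the 190° centre angle is the value quoted in his dialogue):

```latex
v = \frac{190^{\circ}}{2} = 95^{\circ}
```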
5.2.1.2. Example of data and analysis of delimited AR. Data: Molly is trying to solve the task ‘What are the zeroes of the function y = x² − 6x + 5?’.[32,p.108] Molly chooses a suitable method and does the following calculations, but makes the careless mistake of adding and subtracting 4 instead of 2 in the final step:

y = x² − 6x + 5, y = 0, 0 = x² − 6x + 5, x = 3 ± √(9 − 5), x1 = 7, x2 = −1   (1)
Molly then compares her result with the answer in the answer section. She sees that
she has produced an incorrect answer. Molly then tries, without any further reflection, an
algorithm that the teacher has presented on the blackboard to calculate a function’s value.
This algorithm can be used to check if given x-values are zeros of the function or not,
but it is not suitable for finding such values (i.e. to solve the task). However, she uses the
algorithm presented by the teacher in the hope that it might solve the task. She arrives at
the following result:
There is no indication that the student reflected that the calculated x-values, if they were solutions to the task, should give y = 0 and not the two function values as in (2). When Molly compares her calculation (2), which she sees as an answer to the task, with the answer section, she is confused and seems to lose track of what she is doing because her answer does not match the answer in the answer section. She states: ‘Well, this did not go so well’ and abandons the task.
Analysis: The first calculation (1) is categorized as familiar AR since Molly chooses a
known algorithm, although she makes a computational error. Her second calculation (2) is
categorized as delimited AR. The reason for this classification is that instead of analysing
the outcome of the chosen algorithm (1) she merely abandoned it without reflection and
continued trying to solve the task by using a new algorithm (2) that yields two numbers as
an answer. She hopes that this algorithm might solve the task. When the first algorithm does
not provide her with an answer that is the same as in the answer section, the second algorithm
is selected on the basis that it is considered somehow connected to finding function values,
but Molly does not understand how or why. The strategy implementation is then carried out
by following the algorithm. No verificative argumentation is required.
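Molly's calculation (1) is easy to verify numerically. A short sketch, not from the paper, of the quadratic-formula algorithm she chose, with her careless values substituted back:

```python
import math

# Zeros of y = x^2 - 6x + 5: x = 3 +/- sqrt(9 - 5) = 3 +/- 2.
root_term = math.sqrt(9 - 5)               # = 2.0
correct = (3 + root_term, 3 - root_term)   # (5.0, 1.0)

# Molly instead added and subtracted 4, obtaining 7 and -1.
# Substituting back shows these are not zeros of the function.
f = lambda x: x ** 2 - 6 * x + 5
assert f(correct[0]) == 0 and f(correct[1]) == 0
assert f(7) != 0 and f(-1) != 0   # both evaluate to 12, not 0
```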
5.2.1.3. Example of data and analysis of text-guided AR. Data: The task below is solved
by David by searching in the textbook for a suitable algorithm. The bisector theorem is
found and used to solve the task. ‘CD is a bisector. Determine AD’ [33,p.160] (Figure 2).
Analysis: The strategy choice concerns identifying surface similarities between the task
and the theorem. In this case the figure beside the theorem is similar to the figure in the
task. The algorithm is implemented without verificative argumentation.
5.2.1.4. Example of data and analysis of peer-guided AR. Data: Simon is trying to solve
task (d) below.
When Jesper buys a TV for 15,000 crowns he is offered a yearly payment plan over five years instead of paying cash. Every year a fifth of the debt is paid off. The interest rate is 5.75%.
(a) How much must Jesper pay to the bank the first year?
(b) How much must Jesper pay to the bank the second year?
(c) How much more will the TV cost if it is paid through the payment plan compared to paying in cash?
(d) How much more expensive (in percentage) is the TV when it is paid off in instalments? [34,p.107]
Simon seems to know that he must compare the cash price with the sum of the cash
price plus interest, but does not know how to make the comparison himself. He then turns
to his peer and asks: ‘Or should I take the new [value] divided by the old [value]?’ The
peer then responds with: ‘The new by the old. That divided by that [pointing in Simon’s
notebook]’. Simon then writes the correct computation without asking for an explanation
as to why this algorithm leads to the correct answer.
Analysis: The strategy choice that is problematic for Simon was already made by a peer.
The strategy implementation followed the guidance and execution of the remaining routine
transformations.
5.2.1.5. Example of data and analysis of teacher-guided AR. Data: Adam is trying to
solve the task ‘Determine angle AOB in [Figure 3]’.[33,p.159] Adam asks the teacher for
help. The teacher reads the inscribed angle theorem to Adam and states that the centre angle
is always double the angle on the circumference. Adam completes the task.
Analysis: The strategy choice is to ask the teacher, which leads to a strategy implemen-
tation guided by the teacher. No predicative or verificative arguments are visible.
5.2.1.6. Example of data and analysis of (local) CMR. Data: Lars is working on the task
‘Determine x’ [31,p.265] (Figure 4) with his peer John. Lars and John have the following
dialogue:
step in the solution individually, adding the angles and dividing by the number of x’s, the solution can be considered as comprising known algorithms at a year 10 mathematics level. This reasoning sequence is categorized as local CMR because Lars uses CMR to solve the task, although the solving process contains large portions of AR. If a larger part of the solution had consisted of CMR, it could potentially have been categorized as global CMR.
5.2.2.1. Example of data and analysis. Data: The task example (see ‘Example of data
and analysis of peer-guided AR’, task (a)) is about a TV that is bought via a payment plan.
Analysis: (1) Possible algorithms are identified for the task; e.g. the algorithm for solving the task could be 15,000/5 = 3000 and 1.0575 × 3000 = 3172.50. (2) On the page before the page where the task appears, there is a worked example containing the same type of task question and information, showing the same algorithms that are needed to solve the task with the TV. The only difference is that the worked example is in the context of buying a car instead of a TV. (3) The task is categorized as requiring only text-guided AR because of the close similarities with the worked example.
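The algorithm identified in step (1) can be written out directly. The sketch below mirrors the calculation quoted in the analysis (a fifth of the debt, then the factor 1.0575) rather than an independent model of the payment plan:

```python
# The algorithm the analysis identifies for task (a):
price = 15_000                        # cash price of the TV in crowns
amortization = price / 5              # a fifth of the debt -> 3000.0 per year
first_year = round(1.0575 * amortization, 2)   # as stated -> 3172.5
```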
5.2.3. Validity
The data analysis, categorization of the reasoning used and the reasoning required were
done by two coders to establish interrater reliability. Both coders categorized the required reasoning for all the textbook tasks. They reached agreement on 37 of the 39
tasks (95%). The categorization of the reasoning used was first done by one coder. The
categorizations that were considered borderline-categorizations (5/122) by the first coder
were coded by a second coder. In four of these five cases the two coders had done the same
categorization. The discrepancies in the categorization of both used and required reasoning
were resolved through discussion.
6. Results
The textbook task-solving performed by seven groups of students was analysed. Overall,
the students worked on 86 textbook tasks that were divided into 122 subtasks; 106 of these subtasks contained sufficient data, in the form of oral communication, to be analysed.
The textbook tasks were labelled according to how difficult the textbook authors con-
sidered the tasks to be. All the textbooks pitched the tasks at three levels of difficulty.
Data showed that students rarely attempted the more difficult tasks. Eighty-four per cent
of the encountered tasks belonged to the easiest level of difficulty. Sixteen per cent of the
attempted tasks belonged to the intermediate difficulty. No attempts were made to solve
the most difficult tasks. Since the students mainly worked on the least difficult tasks, the cognitive demand of the attempted tasks can be assumed not to have been at the highest level. In addition, none of the students worked on tasks belonging to activity sections (headings containing e.g. ‘Activity’ or ‘Explore’).
Table 2. Frequency of occurrences of reasoning required and reasoning used in solving textbook tasks.

                              Reasoning used
Reasoning required  GCMR  LCMR  FAR  DAR  PGAR  TGAR  TxGAR  UAR  Total
GCMR                  1     0    2    0     1     0      0     0      4
LCMR                  0     5    0    0     4     0      0     1     10
GAR                   0     2   35    2    20     8      2     3     72
Total                 1     7   37    2    25     8      2     4     86

(FAR = familiar AR; DAR = delimited AR; PGAR = peer-guided AR; TGAR = teacher-guided AR; TxGAR = text-guided AR; UAR = AR of unclear subtype.)
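The marginal totals of Table 2 can be checked against the cell counts; a quick tally (cell values as extracted):

```python
# Table 2 cell counts: tasks by required (rows) x used (columns) reasoning.
rows = {
    "GCMR": [1, 0, 2, 0, 1, 0, 0, 0],
    "LCMR": [0, 5, 0, 0, 4, 0, 0, 1],
    "GAR":  [0, 2, 35, 2, 20, 8, 2, 3],
}
row_totals = {k: sum(v) for k, v in rows.items()}        # 4, 10, 72
column_totals = [sum(col) for col in zip(*rows.values())]
# column totals -> [1, 7, 37, 2, 25, 8, 2, 4]; grand total 86
```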
6.1. The relation between reasoning required and reasoning used

The frequent use of familiar AR was an expected result, since the students had worked with the used algorithms previously in
their textbook and/or the teacher had presented the method for the students. One reason for
using familiar AR was that the student would recognize the task type and could then link
the task to a known algorithm using a clue or a lead strategy. An example of this is where
Adam solved a task via the inscribed angle theorem (see Section 5.2.1.1.).
6.2. The relation between reasoning used and rate of correct solutions
Eighty-two per cent of the tasks were correctly solved. When using global CMR or local
CMR, 100% of the tasks were solved correctly, while 80% of the tasks were solved correctly
when using AR. Table 3 shows that all tasks solved through global CMR, local CMR and
text-guided AR yielded a correct solution. However, these three reasoning types rarely
occurred.
To further explore the relationship between reasoning used and the rate of correct task
solutions a more fine-grained analysis was carried out by examining the relation between the
subtask reasoning used and the rate of correct solutions when solving a subtask. Seventy-two
per cent (76/106) of the subtasks were correctly solved. The most common way of trying to
solve a task was via AR, which was used in 92% (98/106) of the subtask solution attempts.
Memorized reasoning was only used once (1/106). No further analysis was carried out
concerning memorized reasoning because it rarely occurred. The most frequent reasoning
type was familiar AR, which was observed in 44% (47/106) of all the task-solving attempts.
Peer-guided AR was the second most common type of reasoning and accounted for around
a quarter (27/106) of the solution attempts. Teacher-guided AR was the third most common
reasoning type and emerged in 11% of the task-solving attempts (12/106). Table 4 shows
the proportion of correct solutions for each of the reasoning types used.
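The percentages quoted in this section follow from the counts given in the running text; a quick tally (counts as quoted there — the tables use slightly different breakdowns):

```python
# Proportions of the 106 analysed subtasks, from the counts in the text.
total = 106
counts = {
    "correctly solved": 76,
    "AR attempts": 98,
    "familiar AR": 47,
    "peer-guided AR": 27,
    "teacher-guided AR": 12,
}
percent = {k: round(100 * v / total) for k, v in counts.items()}
# -> 72, 92, 44, 25 ("around a quarter"), 11
```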
One possible explanation why tasks were correctly solved when using CMR (100%,
7/106) could be that the limited depth of the task made the CMR part very minor. Another
possible explanation why tasks were correctly solved when using CMR was that the student
engaged himself or herself in some sort of struggle.[35] By ‘struggle’, Hiebert and Grouws
[35] mean ‘that students expend effort to make sense of mathematics, to figure something
out that is not immediately apparent’ (p.387). This struggle might be exemplified in Lars’s
task solution presented above (see Section 5.2.1.6.). Lars could not use peer-guided AR
since his peer, John, was seeking help from Lars. Lars also did not use the answer section,
the textbook or the teacher’s assistance to reach a solution. Rather, it was by solving a task that was conceptually within reach, with mathematical ideas that were understandable but not yet well formed,[36] that Lars came to use CMR and reach a correct solution. Since
the students in the study rarely used CMR, especially global CMR, it was difficult to
hypothesize further reasons for using CMR.
The use of familiar AR that led to correct solutions was connected to whether the student could identify what algorithm to use by searching for familiar clues or leads in the task that in turn triggered a strategy choice of which algorithm to choose.

Table 3. For each reasoning type used, the proportion of correctly solved tasks.
GCMR 100% (1/1); LCMR 100% (7/7); familiar AR 81% (30/37); delimited AR 0% (0/2); peer-guided AR 92% (23/25); teacher-guided AR 63% (5/8); text-guided AR 100% (2/2); undefined AR* 50% (2/4).
*Categorized as AR strategy but analysis has not been able to clarify what type of AR.

Table 4. For each reasoning type the entries display the proportion of a correct solution for a subtask.
Global CMR 100% (1/1); local CMR 100% (6/6); memorized reasoning 100% (1/1); familiar AR 68% (32/47); delimited AR 0% (0/3); familiar or delimited AR* 0% (0/3); peer-guided AR 90% (26/29); teacher-guided AR 46% (6/13); text-guided AR 67% (2/3). CMR in total: 100% (7/7); imitative reasoning in total: 70% (69/99); all subtasks: 72% (76/106).
*In three cases it was not possible to distinguish if the reasoning used was familiar AR or delimited AR.

When Adam solved
a task by using the inscribed angle theorem (Section 5.2.1.1.), he identified and used a
familiar algorithm, in a straightforward manner and without verificative arguments. Besides
checking the answer with the answer section, explicit verifications were rare in the cases
where the students believed that they had identified and used the correct algorithm (i.e.
using familiar AR). When the strategy choice was based on surface properties it sometimes
led to an incorrect algorithm choice, which naturally resulted in an incorrect answer. This
superficial way of solving tasks was the main reason for incorrect solutions when using
familiar AR.
The reason why almost all subtasks that were approached using peer-guided AR were
correctly solved was that the peers guiding the students had in most cases already correctly
solved the task themselves. Usually whole solutions, or otherwise essential parts of a solution, were presented to the student under guidance. This is illustrated by Simon’s solution of how much more it costs to pay for the TV via instalments (Section 5.2.1.4.). In the majority of
such cases, the student receiving the information did not ask for explanations, nor did the
peer giving the guidance provide any explanations. One may note that this can be paralleled
to the didactical contract, introduced by Brousseau [30], between a teacher and her student.
The didactical contract is a (mainly implicit) agreement between the teacher and the
student determining their respective roles in the mathematics classroom. The students of
this study are acting as if there was an (implicit) agreement that the guide does not have
to provide any justifications for the choices and claims, and that the guided student does
not ask for such justifications. In some peer-guided AR the students carried out a simple
but not mathematically complete verification of the answer. The verifications that did take
place often concluded that the answer seemed reasonable in relation to the task context.
Text-guided AR was seldom used (3/106). It was successful in two cases where the
students could connect the task to a presented theorem in the textbook via surface similari-
ties. Text-guided AR might be more common among students who do not work together in
task-solving.[10]
None of the instances where a student requested help from a teacher led to CMR by
the student. Instead teacher assistance often resulted in the use of teacher-guided AR by
the student. Teacher-guided AR was one of the reasoning types that led to the fewest correct solutions. Seven of the 13 subtasks that used teacher-guided AR were incorrectly solved.
One identified reason was that the teacher did not provide adequate assistance for the
students to make progress during their task-solving. For example, the teacher thought that
the student needed help to complete one algorithm, while the student actually needed
help with a different algorithm or probably with understanding a concept. The reason for
generating correct solutions when using teacher-guided AR was that the teacher led the
student through the task.
When a student tried to solve a task by using delimited AR, this, by definition, involved
using the surface properties of the task. To try to solve a task using delimited AR the student
may have used a number of algorithms at hand that are delimited by the specific situation.
However, it would be more or less a matter of chance that the student would have used the
correct algorithm. This was why delimited AR frequently led to incorrect solutions. No
task in the study was correctly solved using delimited AR.
Many incorrect solutions occurred where students did not turn to the textbook to seek assistance. Twenty-nine of 106 subtasks were incorrectly solved. In almost all of these cases (23/29) there was
a corresponding solved example that had an algorithm that could have been applied to solve
the task (i.e. text-guided AR information was readily available, but not used).
Students focused clearly on finding an algorithmic method to produce an answer, rather
than on understanding the task and solution method. This was emphasized in the frequent
use of textbooks’ answer sections, which played a paramount role in students’ task-solving.
The answers in the answer section were used as a hint to progress in solving the task, or
as a means of verifying an answer. The answers to a task often assisted a student to make
task progress using an AR strategy. Ann worked on the following task: ‘A loan of 30,000
crowns will be amortized with three equally large amounts over three years. The interest
rate is 5.20%. Complete the table’. See Figure 5.
Ann first attempted to complete the column ‘Remaining loan’, but could then progress
no further. Her strategy to complete the column was to consult the answer section and view
the completed table. Reading the table in the answer section led her to an algorithm of
subtracting 10,000 crowns for every year. A similar process was used for completing the
remaining columns; not knowing how to proceed, she consulted the answer section which
led to an algorithm and a solution.
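For context (the completed table from Figure 5 is not reproduced here), the arithmetic behind the task is a standard straight-line amortization: equal yearly repayments, with interest charged on the remaining balance. Under the figures given in the task statement, the yearly amounts work out as:

```latex
% Straight-line amortization of 30,000 crowns over three years at 5.20%
\begin{align*}
\text{Yearly amortization:}\quad & \tfrac{30\,000}{3} = 10\,000 \text{ crowns}\\
\text{Interest, year 1:}\quad & 0.052 \times 30\,000 = 1\,560 \text{ crowns}\\
\text{Interest, year 2:}\quad & 0.052 \times 20\,000 = 1\,040 \text{ crowns}\\
\text{Interest, year 3:}\quad & 0.052 \times 10\,000 = 520 \text{ crowns}
\end{align*}
```

Ann read the 10,000-crown step directly off the answer section's table rather than deriving it from the loan amount, which is what made her strategy an AR strategy.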
7. Discussion
This study shows a discrepancy between what students actually do during task-solving
and the mathematical reasoning competence that research and the Swedish national
curricula state as important.[1,2,23] The results suggest that students engage in CMR
to a very limited extent, or not at all, while solving textbook tasks.
thinking, (3) pursuing explanations, (4) selecting tasks that build on prior knowledge, (5)
frequently drawing on conceptual connections, and (6) providing sufficient time to explore
the task and the mathematics connected to it.
Disclosure statement
No potential conflict of interest was reported by the authors.
References
[1] National Council of Teachers of Mathematics. Principles and standards for school mathematics.
Reston (VA): National Council of Teachers of Mathematics; 2000.
[2] Niss M. Mathematical competencies and the learning of mathematics: the Danish KOM project.
Third Mediterranean Conference on Mathematics Education; 2003 January 3–5; Athens,
Greece. p. 115–124.
[3] Ross KA. Doing and proving: the place of algorithms and proofs in school mathematics. Am
Math Mon. 1998;105(3):252–255.
[4] Hiebert J. What research says about the NCTM standards. In: Kilpatrick J, Martin WG, Schifter
D, editors. A research companion to principles and standards for school mathematics. Reston
(VA): National Council of Teachers of Mathematics; 2003. p. 5–23.
[5] Schoenfeld AH. Mathematical problem solving. Orlando (FL): Academic Press; 1985.
[6] Schmidt WH, McKnight CC, Houang RT, Wang H, Wiley DE, Wolfe RG. Why schools matter:
a cross-national comparison of curriculum and learning. San Francisco (CA): Jossey-Bass;
2001.
[7] Bergqvist T, Lithner J, Sumpter L. Upper secondary students’ task reasoning. Int J Math Educ
Sci Technol. 2008;39(1):1–12.
[8] Lithner J. Mathematical reasoning and familiar procedures. Int J Math Educ Sci Technol.
2000;31(1):83–95.
[9] Lithner J. Mathematical reasoning in task solving. Educ Stud Math. 2000;41(2):165–190.
[10] Lithner J. Students’ mathematical reasoning in university textbook exercises. Educ Stud Math.
2003;52(1):29–55.
[11] Doyle W. Academic work. Rev Educ Res. 1983;53(2):159–199.
[12] Swedish National Agency for Education (Skolverket). Lusten att lära–Med fokus på matematik:
Nationella kvalitetsgranskningar 2001–2002 [The desire to learn with a focus on mathemat-
ics: national quality audits 2001–2002] (No. 221). Stockholm: Swedish National Agency for
Education; 2003. Swedish.
[13] Swedish Schools Inspectorate (Skolinspektionen). Undervisningen i matematik i gym-
nasieskolan [Mathematics education in upper secondary school] (No. 2010:13). Stockholm:
Swedish Schools Inspectorate; 2010. Swedish.
[14] Johansson M. Teaching mathematics with textbooks: a classroom and curricular perspective
[doctoral dissertation thesis]. Luleå: Luleå University of Technology; 2006.
[15] Kajander A, Lovric M. Mathematics textbooks and their potential role in supporting miscon-
ceptions. Int J Math Educ Sci Technol. 2009;40(2):173–181.
[16] Love E, Pimm D. ‘This is so’: a text on texts. In: Bishop AJ, editor. International handbook of
mathematics education. Pt. 1. Dordrecht: Kluwer; 1996. p. 371–409.
[17] Törnroos J. Mathematics textbooks, opportunity to learn and student achievement. Stud Educ
Eval. 2005;31(4):315–327.
[18] Valverde GA, Bianchi LJ, Wolfe RG, Schmidt WH, Houang RT. In: Valverde GA, editor.
According to the book: using TIMSS to investigate the translation of policy into practice
through the world of textbooks. Dordrecht: Kluwer Academic Publishers; 2002.
[19] Stein M, Remillard JT, Smith M. How curriculum influences student learning. In: Lester FK,
editor. Second handbook of research on mathematics teaching and learning: a project of the
National Council of Teachers of Mathematics. Charlotte (NC): Information Age Publishing;
2007. p. 319–369.
[20] Jäder J, Lithner J, Sidenvall J. An international comparison of reasoning requirements in
mathematics textbooks. Forthcoming 2014.
[21] Ball D, Bass H. Making mathematics reasonable in school. In: Kilpatrick J, Martin WG,
Schifter D, editors. A research companion to principles and standards for school mathematics.
Reston (VA): National Council of Teachers of Mathematics; 2003. p. 27–44.
[22] Silver E. Fostering creativity through instruction rich in mathematical problem solving and
problem posing. ZDM. 1997;29(3):75–80.
[23] Swedish National Agency for Education (Skolverket). Läroplan, examensmål och gymnasiege-
mensamma ämnen för gymnasieskola 2011 [Curriculum, exam goals and core subjects in upper
secondary school 2011]. Stockholm: Fritzes; 2011. Swedish.
[24] Lithner J. A research framework for creative and imitative reasoning. Educ Stud Math.
2008;67(3):255–276.
[25] Bergqvist T, Lithner J. Mathematical reasoning in teachers’ presentations. J Math Behav.
2012;31(2):252–269.
[26] Boesen J, Lithner J, Palm T. The relation between types of assessment tasks and the mathemat-
ical reasoning students use. Educ Stud Math. 2010;75(1):89–105.
[27] Lithner J. Mathematical reasoning in calculus textbook exercises. J Math Behav. 2004;23(4):
405–427.
[28] Långström P, Lithner J. Svenska gymnasieelevers matematiska resonemang och hjälpprocesser i
klassrumsmiljö [Swedish upper secondary students’ mathematical reasoning and aid processes in
a classroom environment]. Matematikelevers strategier för fel- och hjälpsökning [Mathematics
students’ strategies for error and help seeking] [licentiate thesis]. Umeå: Umeå University;
2008. p. 51–102. Swedish.
[29] Palm T, Boesen J, Lithner J. Mathematical reasoning requirements in Swedish upper secondary
level assessments. Math Thinking Learn. 2011;13(3):221–246.
[30] Brousseau G, Balacheff N. Theory of didactical situations in mathematics [electronic resource],
1970-1990. Dordrecht: Kluwer Academic Publishers; 1997.
[31] Szabo A. Matematik origo. 1b [Mathematics origo. 1b]. 2nd ed. Stockholm: Bonnier Utbild-
ning; 2011. Swedish.
[32] Alfredsson L. Matematik 5000. kurs 2c blå, lärobok [Mathematics 5000. Course 2c blue,
textbook]. 1st ed. Stockholm: Natur & Kultur; 2011. Swedish.
[33] Szabo A. Matematik origo. 2c [Mathematics origo. 2c]. 2nd ed. Stockholm: Sanoma Utbildning;
2012. Swedish.
[34] Alfredsson L, Erixon P, Heikne H. Matematik 5000. kurs 1a gul, lärobok [Mathematics 5000.
Course 1a yellow, textbook]. 1st ed. Stockholm: Natur & Kultur; 2011. Swedish.
[35] Hiebert J, Grouws DA. The effects of classroom mathematics teaching on students’ learning.
In: Lester FK, editor. Second handbook of research on mathematics teaching and learning: a
project of the National Council of Teachers of Mathematics. Charlotte (NC): Information Age
Publishing; 2007. p. 371–404.
[36] Hiebert J, Carpenter TP, Fennema E, Fuson K, Human P, Murray H, Oliver A, Wearne D.
Problem solving as a basis for reform in curriculum and instruction: the case of mathematics.
Educ Res. 1996;25(4):12–21.
[37] Schoenfeld AH. Ideas in the air: speculations on small group learning, environmental and
cultural influences on cognition, and epistemology. Int J Educ Res. 1989;13(1):71–88.
[38] Fuchs LS, Fuchs D, Hamlett CL, Phillips NB, Karns K, Dutka S. Enhancing students’ helping
behavior during peer-mediated instruction with conceptual mathematical explanations. Elem
School J. 1997;97(3):223–249.
[39] Hiebert J, Wearne D. Instructional tasks, classroom discourse, and students’ learning in second-
grade arithmetic. Am Educ Res J. 1993;30(2):393–425.
[40] Webb NM. Peer interaction and learning in small groups. Int J Educ Res. 1989;13(1):21–39.
[41] Webb NM. Task-related verbal interaction and mathematics learning in small groups. J Res
Math Educ. 1991;22(5):366–389.
[42] Webb NM, Mastergeorge A. Chapter 4: promoting effective helping behavior in peer-directed
groups. Int J Educ Res. 2003;39:73–97.
[43] Johansson M. Mathematical meaning making and textbook tasks. Learn Math. 2007;45(1):45–
51.
[44] Haggarty L, Pepin B. An investigation of mathematics textbooks and their use in English,
French, and German classrooms: who gets an opportunity to learn what? Br Educ Res J.
2002;28(4):567–590.
[45] Rezat S. Interactions of teachers’ and students’ use of mathematics textbooks. In: Gueudet G,
Pepin B, Trouche L, editors. From text to ‘lived’ resources: mathematics curriculum materials
and teacher development [electronic resource]. New York (NY): Springer; 2012. p. 231–245.