

J. EDUCATIONAL COMPUTING RESEARCH, Vol. 40(4) 377-404, 2009

GROUP-METACOGNITIVE SUPPORT FOR
ONLINE INQUIRY IN MATHEMATICS WITH
DIFFERENTIAL SELF-QUESTIONING

BRACHA KRAMARSKI
VERED DUDAI
Bar-Ilan University

ABSTRACT

This exploratory study investigated 100 Israeli 9th graders who used two
different group-metacognitive support methods in online mathematical
inquiry—group feedback guidance (GFG) and self-explanation guidance
(SEG)—compared to a control group (CONT). The study evaluated each
method’s effects on students’: (a) mathematical inquiry ability: problem
solving, explanations, mathematical feedback in online forum discussions,
and transfer ability; and (b) self-regulated learning (SRL) measures (self-
report questionnaires and metacognitive feedback in online forum discus-
sions). Metacognitive support methods, based on IMPROVE self-questioning
strategies, appeared in pop-up screens and provided the two experimental
groups with differential cues for problem-solving processes. Mixed quanti-
tative and qualitative analyses showed that GFG students outperformed
SEG students in most mathematical and SRL measures and the CONT
students in all measures. In addition, SEG students outperformed the CONT
students in mathematical problem-solving but not on mathematical transfer
ability or SRL.

In recent years, the role of self-regulated learning (SRL) in education has elicited
much interest. Research has focused on students’ SRL and subject-matter knowl-
edge as means to attain successful learning. Students are considered self-regulated


Ó 2009, Baywood Publishing Co., Inc.


doi: 10.2190/EC.40.4.a
http://baywood.com
to the degree that they are metacognitively, motivationally, and behaviorally
active participants in their own learning process (e.g., Pintrich, 2000; Zimmerman,
2000). Researchers believe that the role of metacognition is especially impor-
tant because it enables students to plan and allocate limited learning resources,
monitor their current knowledge, and evaluate their current learning level,
which in turn influences the whole SRL process (e.g., Pintrich, 2000; Schraw,
Crippen, & Hartley, 2006; Zimmerman, 2000).
Researchers have also begun to direct increasing attention to individual self-
regulation at the social level, where individual group members influence each
other through co-regulation (e.g., Kramarski & Mevarech, 2003; Salonen, Vauras,
& Efklides, 2005). Some researchers have focused on how group members
continuously regulate each other reciprocally, a process known as shared regu-
lation (Palincsar & Brown, 1984). Other researchers emphasized that meta-
cognition should be viewed as an essential part of a group’s work when cognitive
processes are regulated advantageously (Salonen et al., 2005) and should not
be considered only in the context of the individual’s knowledge and regulation
of cognitive processes (Brown, 1987; Flavell, 1979). At the social level, both
socially mediated and socially shared metacognition occur (Goos, Galbraith,
& Renshaw, 2002). In addition, some argue that a shared metacognitive experi-
ence during social discourse emerges (Lin, 2001), where peers or other group
participants act as external regulators (Azevedo, 2005).
Unfortunately, research has demonstrated that students have difficulties
in adopting SRL processes at both the individual and social levels (e.g.,
Kramarski & Gutman, 2006; Kramarski & Mevarech, 2003; Veenman, Elshout,
& Meijer, 1997). Students often do not realize that they should regulate their
ideas and do not know how to regulate productively. They forge ahead without
considering alternatives for their decisions, get bogged down in logistical details
of their work, and focus on superficial measures of progress. Consequently,
students are not open to sharing metacognition with other group members
(Kramarski & Mevarech, 2003).
Although SRL is seldom acquired spontaneously, it may be shaped and
developed through participation in environments that provide students with
opportunities to manage their own learning (e.g., Zimmerman, 2000). However,
research has indicated that mere participation in such a learning environment
is insufficient to develop SRL; metacognitive support is an essential prerequisite
(e.g., Veenman et al., 1997). One promising metacognitive instructional support
seems to be the use of self-questioning (King, 1991; Kramarski & Mevarech,
2003; Schoenfeld, 1992). Self-questioning can be directed to the problem-solving
process itself (e.g., comprehending the problem) or to different metacognitive
strategies such as providing self-explanation or feedback which are essential in
the SRL process.
In line with these claims about shared metacognition, self-questioning, and
differential metacognitive strategies, our study focused on comprehensively
investigating the effects of group-metacognitive support in online mathematical
inquiry for enhancing mathematical and SRL abilities, where students used meta-
cognitive self-questioning (based on the IMPROVE method) directed toward
one major metacognitive strategy (either self-explanation or feedback), and where
students participated in online forum discussions that elicited group members’
sharing of mathematical and metacognitive understanding. Thus, our research
investigated two main questions: How can different group-metacognitive sup-
ports: (a) enhance online mathematical inquiry and transfer ability and (b) develop
students’ self-perceived and online SRL? Such questions have received little
attention in the literature of mathematics education.
Before detailing the current exploratory study's design, we briefly overview
the concepts utilized in our study: group-metacognition in online mathematical
inquiry, self-regulation of learning, and two strategies for metacognitive
support based on IMPROVE self-questioning.

Socially Shared Online Inquiry in Mathematics

In mathematics inquiry, students solve problems, pose questions, construct
solutions, and explain their reasoning (e.g., Schraw et al., 2006). Explanations
(also known as justifications) involve constructing, refuting, and comparing
arguments using various types of reasoning. Explanations have the potential for
engaging students, making students’ thinking visible, and refuting misconceptions
(Nussbaum & Sinatra, 2002).
During online inquiry, students participate in group forum discussions that
require group members to share meanings, both for metacognitive and mathe-
matical understanding. Students must explain their own thinking to other group
members and adapt their own thinking to the solutions proposed by other
members, which, in turn, may facilitate more efficient use of metacognitive skills.
Through critically examining others’ reasoning and participating in disagree-
ment resolution, students learn to monitor their thinking, which in turn fosters
their mathematical reasoning concepts (e.g., Artz & Yaloz-Femia, 1999; McClain
& Cobb, 2001).
In the present study, group-metacognition was identified as “socially shared”
if it met the following two criteria: one group member must post a metacognitive
regulation message, and at least one other group member must acknowledge that
message. In other words, first, a discussion forum message should be posted that
metacognitively regulates the group's ongoing problem solving by interrupting,
changing, or promoting the progression of joint mathematical problem solving.
Second, the other group members should utilize the metacognitive regulation
message, for example by applying feedback to their own problem-solving
processes and reporting the results in joint discussions. Such messages may be
directed to different stages of the solution process such as planning, monitoring,
and evaluation (Schraw et al., 2006; see extended description in the Method
section). However, research indicates that mere engagement in mathematical
forum discussion is insufficient to foster mathematical inquiry ability and
SRL processes; metacognitive support is needed (e.g., King, 1992; Kramarski &
Mizrachi, 2006; Oh & Jonassen, 2007).

Self-Regulation of Learning and Two Strategies for Metacognitive Support

SRL refers to a cyclical and recursive process that utilizes feedback mech-
anisms for students to understand, control, and adjust their learning accordingly
(e.g., Butler & Winne, 1995; Zimmerman, 2000). In terms of cognitive and
metacognitive processes, self-regulated students are good strategy users. They
plan, set goals, select strategies, organize, monitor, and evaluate at various
points during the acquisition process (Pintrich, 2000; Zimmerman, 2000).
Research has indicated that self-explanation and feedback are both important
metacognitive strategies in the SRL process. Several studies have shown that
students learn better when they explain instructional materials to themselves
(e.g., Aleven & Koedinger, 2002; Renkl, 2002), or when students explain their
own problem-solving steps (e.g., Kramarski & Mevarech, 2003). Chi (2000)
believes that explanation helps participants organize knowledge, thus augment-
ing incomplete “mental models.” She suggests that children’s provision of
explanations enables them to combine new information with existing knowl-
edge to gain more complete representations. Also, some evidence indicates that,
depending on their explanations’ quality, students learn more when they provide
explanations than when they receive explanations (Webb, 1989).
According to Butler and Winne (1995), feedback serves a multidimensional
role in aiding knowledge construction. Feedback may encourage students to
engage in reflection on why their solution or explanation is wrong, and thereby
to update their solution strategies. Feedback stimulates students to adopt more
metacognitive strategies in their learning tasks. Research on feedback in
computer-based learning environments has shown differential effects for feedback
strategies on students’ learning. Corrective feedback helps immediate learning,
whereas guided and metacognitive feedback helps ensure deep understanding
and the ability to transfer knowledge (Aleven & Koedinger, 2002; Azevedo
& Bernard, 1995; Kramarski & Zeichner, 2001; Moreno, 2004). However,
most studies examined effects of metacognitive feedback provided by external
agents like computers, whereas little research was conducted on effects of
student exchanges of group-metacognitive feedback in online mathematical
inquiry discussion.
Furthermore, the literature indicates that learning either through self-
explanations or feedback is difficult. Not all students use these strategies spon-
taneously. It remains an open question as to how guidance methods that scaffold
or emphasize self-explanation or feedback compare to guidance methods that do
not emphasize such strategies. None of the aforementioned studies compared
learning through self-explanations with learning through group feedback in
online inquiry learning in mathematics with metacognitive support based on the
IMPROVE method (e.g., Schraw et al., 2006).

IMPROVE Metacognitive Self-Questioning

Metacognitive support aims to increase learning competence by means of
systematic explicit guidance to learners as they think and reflect on their tasks.
An explicit approach incorporates the ability to verbalize thinking patterns as
well as the ability to conceptualize and analyze relational structures that are
employed while thinking (Veenman, Van Hout-Wolters, & Afflerbach, 2006).
Self-questioning has been the most recommended method for explicit guidance.
Many researchers have emphasized the importance of extensive practice fol-
lowed by explicit guidance using the WWWH self-questioning strategy (what,
when, why, and how). This strategy helps students select a specific self-regulatory
strategy, approach, or response within learning (e.g., Azevedo & Cromley,
2004; Kramarski & Mevarech, 2003; Schoenfeld, 1992; Schraw et al., 2006;
Veenman et al., 2006).
To support students’ involvement in regulatory learning, Mevarech and
Kramarski (1997) designed the IMPROVE metacognitive self-questioning
method, whose name is an acronym of its classroom teaching steps: Introducing
new concepts; Metacognitive questioning; Practicing in small groups; Reviewing;
Obtaining mastery; Verification; and Enrichment and remediation. The
metacognitive questioning encourages students to actively engage in self-regulating
their learning by using four kinds of questions: comprehension, connection,
strategy, and reflection. Comprehension questions help students understand the
information of the task/problem to be solved (e.g., “What is the problem/task?”;
“What is the meaning of . . . ?”). Connection questions prompt students to
understand tasks’ deeper-level relational structures by articulating thoughts and
explicit explanations (e.g., “What is the difference/similarity?”; “How do you
justify your conclusion?”). Strategy questions encourage students to plan and to
select the appropriate strategy (e.g., “What is the strategy?”; “Why?”). Reflection
questions help students monitor and evaluate their problem-solving processes,
encouraging students to consider various perspectives and values regarding their
selected solutions (e.g., “Does the solution make sense?”; “Can the solution be
presented otherwise?”).
The IMPROVE method is grounded in the SRL theoretical framework. The
four metacognitive questions empower learners’ self-regulation. The questions
direct learners' thoughts and actions throughout the cyclical SRL phases of the
solution process (planning, monitoring, and evaluation; Zimmerman, 2000). The method
is also grounded in socio-cognitive theories of learning, which extend the view
of metacognition to encompass not only self-directed dialogue but also social
aspects. Such aspects include practice of tasks and group discussion between
peers of comparable expertise, thus making monitoring and regulation processes
overt (Brown & Campione, 1994).
In general, research reported that metacognitive support with IMPROVE self-
questioning demonstrated positive effects on students’ learning outcomes and
SRL processes in different learning environments (e.g., Kramarski & Mevarech,
2003; Kramarski & Mizrachi, 2006). Most studies regarding self-questioning
(e.g., Ge & Land, 2003; Kauffman, Ge, Xie, & Chen, 2008; King, 1991;
Kramarski & Zoltan, 2008; Schoenfeld, 1992) examined this strategy’s effects
while directing self-questions throughout the solution process (e.g., “What is the
problem?”). However, no research on supportive group-metacognition explored
two differential self-questioning guidance strategies—self-explanation and group
feedback in online mathematical inquiry—compared to a control group.

Current Study Objectives

Based on research findings, we designed our exploratory study to comprehensively
investigate the effects of two differential metacognitive self-questioning
methods (based on IMPROVE metacognitive support) for ninth grade students’
online mathematical inquiry and transfer ability and their SRL processes: self-
explanation guidance (SEG) and group feedback guidance (GFG). The SEG
self-questions encouraged students to provide an elaborated explanation (why)
for their thinking, to clearly refer to the data in the problem and to the problem-
solving process, and to suggest a conclusion when working online to solve
mathematical problems in forum discussions. The GFG self-questions encouraged
students to take a social perspective, and to provide elaborated feedback when
working online to solve mathematical problems in a forum discussion, which
demanded that the student reflect on the entire solution process of all participants.
The forums comprised small groups of four students.
Based on the emphasis on social-cognitive aspects in the SRL process (e.g.,
Zimmerman, 2000) and prior studies regarding feedback effects on learning (e.g.,
Moreno, 2004), we assumed that the GFG method would be more effective in
enhancing online inquiry ability, transfer ability, and SRL than would the SEG
method, for the following reasons. First, the SEG method provides opportunities
to understand and self-regulate the solution process from the individual student’s
perspective, whereas providing online metacognitive feedback in the GFG group
enables the students to act as external regulators at a social level and to share
multidimensional perspectives regarding solution processes. The GFG method
may help learners monitor, evaluate, and modify their solutions (if necessary)
and may challenge them to try new ways to solve problems. Consequently, we
suggested that students exposed to a GFG social approach would more easily
augment their mathematical inquiry processes, transfer ability, and SRL than
would students exposed to SEG guidance.
The present preliminary study in this field therefore uniquely investigated
the effects of GFG and SEG methods on students’: (a) online mathematical
inquiry abilities: problem solving, providing qualitative explanations, furnish-
ing mathematical feedback in online forum discussions, and transferring learned
skills; and (b) SRL assessed by a self-report questionnaire and by metacognitive
feedback in online forum discussions. The GFG and SEG groups were compared
to a control group (CONT) that practiced mathematical problem solving in a
socio-cognitive computer setting but was not exposed to metacognitive support
or to a planned forum discussion that demanded SRL.

METHOD

Participants
Participants were 100 ninth-grade students (47 boys, 53 girls) attending three
classes within one junior high school in central Israel. We assigned each class
to one of the three instructional methods: SEG, GFG, or CONT. The three classes
were heterogeneous in terms of math ability. No statistical differences in
mathematical knowledge were found between the three groups at pretest (see
Results section).

Instruction Methods

Metacognitively Supported Groups: SEG and GFG

The training procedure for both metacognitive strategies was applied over 5
weeks with small groups of four students each within each class, for a total of six
lessons per group. The procedure comprised two stages, both implemented in the
computer lab: pre-study preparation and study guidance.
Stage 1: Pre-study preparation—A week before the study, each of the two
groups (SEG and GFG) was separately exposed to a 1-day, 90-minute preparation
for online inquiry in mathematics (comprising two class lessons) that addressed
problem solving, explanations, and forum discussions.
In the first lesson (45 min.), focusing on problem solving, each group was
trained with the IMPROVE metacognitive self-questioning method to enhance
the problem-solving process within small groups (Kramarski & Gutman, 2006;
Kramarski & Mevarech, 2003; Kramarski & Mizrachi, 2006). The teacher
verbally explained the importance of IMPROVE metacognitive self-questioning
by using pop-up comprehension, connection, strategy, and reflection questions
(see Figure 1). She modeled the use of such questions during problem solving
of authentic tasks. Students in both groups practiced the use of these questions
with computerized materials in an online discussion within small four-member
groups. Students practiced these questions during the inquiry process.

Figure 1. The IMPROVE metacognitive self-questioning method.

In the second lesson (45 min.), which focused either on constructing explan-
ations or on providing/receiving online feedback, the two groups’ training
differed. In the SEG group, the teacher verbally explained the importance of
constructing mathematical knowledge with explanations. The SEG students
practiced self-questioning techniques related to explanations and justifications
by using pop-ups (see Figure 2), for example: (1) "Is my use of mathematical
expressions correct?"; (2) "Would another argument be appropriate?"; (3) "What
is my conclusion?"; and (4) "Is my explanation clear?"
Similarly, the GFG group was exposed verbally to the importance of group
feedback for constructing mathematical knowledge, and the teacher explicitly
modeled how to provide such feedback. Using pop-ups (see Figure 3), students
practiced self-questioning techniques related to the provision/receipt of feedback,
for example:
1. “Did I read my friend’s solution?”;
2. “Did I check if my friend’s explanation is correct and clear?”;
3. “How can I respond to my friend regarding the correctness of his/her
solution/explanation?”; and
4. “How can I modify my friend’s solution and explanations?”
In addition, both groups received computerized examples. The SEG students
received examples of explanations, and the GFG students received feedback
examples.

Figure 2. Self-questioning that provides self-explanation guidance (SEG).

Figure 3. Self-questioning that provides group feedback guidance (GFG).

Students were asked to analyze these examples according to their
mathematical terms, representations, solving strategy, and clarity; to decide if the
examples were correct; and if not, to modify them.
Stage 2: Study guidance—Students from both groups (GFG and SEG) prac-
ticed online mathematical inquiry problem solving in forums of small four-
member groups, once weekly in the computer lab (45 min.) for 4 weeks. Students
downloaded mathematical tasks from the school website. Tasks were based on
the Programme for International Student Assessment’s (PISA, 2003) con-
ceptual framework for solving authentic tasks that required different levels of
algebraic abilities.
Each student in both groups was asked to solve the weekly task, send his/her
solution to the forum, provide feedback for the forum partners’ solutions, receive
feedback for his/her own solution, and adjust his/her solution (if needed) accord-
ing to the peers’ suggestions. During the solution process, students received
the metacognitive self-questions (focusing either on the self-explanation or the
feedback strategy) as pop-up screen messages at certain times. Students were
asked to use these prompts during the solution process.
The teacher encouraged each student to provide feedback to the other three
students in the forum about their inquiry process. At the end of each weekly
meeting, each group sent their solutions and feedback exchanges to the teacher,
who then gave online feedback individually within 24 hours regarding the
whole inquiry process: accuracy of student’s solution, problem-solving process,
explanations, and feedback provision in the forum.

Control Group

The CONT group was also exposed to a pre-study preparation that addressed
mathematical problem solving from a socio-cognitive perspective. Before the
study, the teacher held in-class discussions that referred to the importance of
mathematical inquiry, authentic problem solving, mathematical explanations, and
discussing solutions with peers, but students were not explicitly exposed to any
metacognitive guidance methods. Their study guidance stage began in the first
week of the 5-week study period, with two class lessons on the first day, for a
total of six lessons. The CONT students practiced the same tasks as the two
metacognitive online groups in the computer lab. Students downloaded their tasks
from the school website, and then each task was presented to the whole class
and solved using paper and pencil.
Students were encouraged to solve the tasks cooperatively, and they were
allowed to ask for peers’ help online (e-mail, forum). They submitted each
solution individually to the teacher electronically by e-mail or online forum.
The teacher then gave online feedback individually within 24 hours regarding
accuracy of students’ solution and problem-solving process.

Teacher Background and Training

Three female teachers taught the participants. Each held a university degree in
mathematics education, had 10+ years of math teaching experience, and was
considered by the principal to be an expert teacher in the school. For purposes
of this study, each teacher underwent a separate 1-day, 5-hour inservice training
seminar at the school. Training focused on mathematical and pedagogical issues
related to teaching mathematical inquiry. The training instructor (one of the
authors) informed teachers that they were participating in an experiment using
new materials. In the first 1.5 hours of training, the GFG and SEG teachers were
each introduced to the rationale for their specific metacognitive method and
received student materials and a teacher’s guide. The instructor discussed the
importance of metacognitive self-questioning in fostering mathematical inquiry,
and she modeled ways for using pop-ups on the computer screen to introduce
the specific metacognitive method in class.
In the remaining 3.5 hours of training, the instructor discussed each authentic
task’s goals with the teacher. The teacher was asked to solve tasks and consider
possible difficulties she might encounter in class. Particular attention was paid to
practicing and discussing different conceptual mathematical explanations in the
SEG group, and different types of feedback provision in the GFG group. Specific
consideration was given to the teacher’s role during students’ work on computer.
Teachers were guided in how to clarify, but not directly answer, students' questions.
Finally, teachers were guided in how to provide online feedback regarding the
final problem-solving version that students would send them.
Unlike the two teachers of metacognitive methods, the CONT teacher was not
introduced to any metacognitive method. However, she received the same amount
and structure of training on teaching mathematical inquiry (e.g., problem solving,
providing explanations, and coping with student difficulties).
During the 4-week study period, the second author observed each teacher
once a week (in the lab) to help ensure adherence to the instructional
methods. In addition, the authors met each teacher after these observations
and discussed any deviations from the method, to monitor the three groups’
treatment fidelity.

Measures
Five kinds of measures were administered. All three groups completed
a paper-and-pencil mathematical pre-knowledge test (pretest), an authentic
mathematical problem-solving task (posttest), a paper-and-pencil transfer test
(at posttest), and a pre/post SRL self-report measure. For GFG and SEG students
only, data were collected on two types of student feedback (mathematical and
metacognitive) given during the posttest of the online inquiry discussion of the
authentic mathematical problem-solving task.

Mathematical Measures

Pre-knowledge mathematical test—To control for possible differences prior
to the study, all students completed a 10-item pretest adapted from Kramarski
and Mizrachi (2006). The test covered algebraic procedural and conceptual
understanding (e.g., using symbols, manipulations of algebraic expressions, and
mathematical representations). For each item, participants received a score of
either 1 (correct answer) or 0 (incorrect answer), and a total score ranging from
0 to 10. We translated the scores into percentages. Cronbach alpha reliability
coefficient was .81.
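The pretest scoring just described (1 point per correct item, totals converted to percentages, with internal consistency summarized by Cronbach's alpha) can be sketched as follows. This is an illustrative sketch only, not the authors' analysis code, and the response data are invented for demonstration.

```python
# Illustrative sketch: percentage scoring of a 10-item dichotomous pretest,
# plus the standard Cronbach's alpha formula applied to a 0/1 score matrix.
# All response data below are invented for demonstration.

def percentage_score(item_scores):
    """Convert a list of 0/1 item scores into a 0-100 percentage."""
    return 100.0 * sum(item_scores) / len(item_scores)

def variance(values):
    """Sample variance (n - 1 denominator)."""
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values) / (len(values) - 1)

def cronbach_alpha(score_matrix):
    """score_matrix: one row of 0/1 item scores per student."""
    n_items = len(score_matrix[0])
    item_variances = [
        variance([row[i] for row in score_matrix]) for i in range(n_items)
    ]
    total_variance = variance([sum(row) for row in score_matrix])
    return (n_items / (n_items - 1)) * (1 - sum(item_variances) / total_variance)

# A student answering 7 of the 10 items correctly scores 70%.
print(percentage_score([1, 1, 0, 1, 0, 1, 1, 1, 0, 1]))  # 70.0
```

The alpha formula is the usual item-variance ratio; with real data one would feed it the full 100-student by 10-item score matrix.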

Authentic mathematical inquiry task: Problem solving and explanations—An
authentic mathematical inquiry task was adapted from PISA (2003) and
administered at posttest to all three groups. Each group solved the task according to the
mode they practiced in class: in an online discussion environment for GFG and
SEG students, and as paper-and-pencil for CONT students. The six-item task
assessed students’ procedural and conceptual understanding of patterns of change
and relationships. Students compared growth of apple trees planted in a square
pattern and conifer trees planted around the orchard and explained their reasoning.
We assessed the CONT group’s problem solving with a paper-and-pencil test
and not in a forum because reflecting on the learning process in the forum is
a metacognitive activity. Therefore, we were concerned about the possible con-
founding effect of using implicit metacognitive support. For each of the six items,
students received 1 point for submitting a correct solution; thus, total scores
ranged from 0 to 6 for solutions’ accuracy. Regarding their explanations, students
also received a mathematical grade of either 1 (entirely correct answer or argu-
ment) or 0 (incorrect answer or argument) for each item; thus, total scores ranged
from 0 to 6 for explanations’ mathematical accuracy. We converted both scores
into percentages. Cronbach alpha reliability coefficient was .79.
In addition, for correct mathematical explanations only, the quality of each
explanation was coded as either a conceptual argument (e.g., logical-formal,
scoring 2) or a procedural argument (e.g., calculation example, scoring 1). We
converted the explanation quality scores into percentages. Two experts in teaching
mathematics scored students’ explanations. The inter-judge reliability for Cohen
Kappa was 0.92.
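The inter-judge agreement statistic reported here (Cohen's kappa) is computed from the two raters' parallel codings. A minimal sketch, using the standard formula with invented codes (not the authors' scoring software), where 2 might label a conceptual argument and 1 a procedural one:

```python
# Minimal sketch of Cohen's kappa for two raters over the same items.
# Codes are invented for demonstration (2 = conceptual, 1 = procedural).

def cohen_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two parallel lists of codes."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    categories = set(rater_a) | set(rater_b)
    # Expected agreement under independent rating, from marginal proportions.
    expected = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
    )
    return (observed - expected) / (1 - expected)

print(cohen_kappa([2, 2, 1, 1], [2, 1, 1, 1]))  # 0.5
```

A kappa of 0.92, as reported, indicates near-perfect agreement after correcting for chance.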

Mathematical feedback during online inquiry discussion of the authentic task—
For students in the GFG and SEG groups only, mathematical feedback was coded
according to the following five categories (NCTM, 2000): mathematical terms
(e.g., bigger than), mathematical representations (e.g., tables, graphs), mathe-
matical explanations, final solution accuracy, and non-mathematical statements
(e.g., social communication like “we enjoyed working with you”). For each
mathematical feedback category, each student’s score was calculated by the
total number of references he/she provided to that category during online
discussion, divided by the total statements in the forum of the posttest task (i.e.,
60). Scores were converted into percentages. The Appendix (middle column)
exemplifies scoring for mathematical feedback categories based on excerpts
from online discussion.
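The per-category score described above (a student's references to a category, divided by all statements in the posttest forum and converted to a percentage) can be sketched as follows. The category labels are our shorthand, not the original coding scheme's exact wording.

```python
# Sketch of the per-category feedback score: one student's coded references
# to a category, divided by all statements in the posttest forum (60), as a
# percentage. Labels are our shorthand for the NCTM-based categories.
CATEGORIES = ("terms", "representations", "explanations",
              "solution_accuracy", "non_mathematical")

def feedback_scores(coded_references, forum_total=60):
    """coded_references: category labels coded from one student's messages."""
    return {c: 100.0 * coded_references.count(c) / forum_total
            for c in CATEGORIES}

# Two references to mathematical terms out of 60 forum statements -> ~3.33%.
scores = feedback_scores(["terms", "terms", "explanations"])
```

Because the denominator is the whole forum's statement count rather than the student's own, the scores reflect each student's share of the group discussion, not just the mix within his or her own messages.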

Mathematical transfer test—This 18-item paper-and-pencil test examined all
student groups' (GFG, SEG, CONT) posttest ability to transfer the mathematical
knowledge and skills learned in the instructional unit to the more formal, less
authentic tasks typifying their usual math tasks in school. In addition, the test
examined higher-order skills (problem posing) that students did not practice. Thus,
the test can be considered as far-transfer for the context (formal tasks) and skills
(posing a problem) for all groups. The test addressed three kinds of skills:
procedural (9 items), problem-solving (8 items), and constructing a mathematical
problem using a given formula (1 item). In addition, students’ ability to provide
mathematical explanations was assessed (9 items). Cronbach alpha reliability
for the entire test was .86.
For each item, students received 1 point for providing the correct solution (or
0 for an incorrect solution). Thus, total scores ranged from 0 to 9 for solution
accuracy of procedural skills, 0 to 8 for problem-solving skills, and 0 to 1 for
posing a mathematical problem. Students also received an explanation grade
of either 1 (entirely correct argument) or 0 (incorrect argument) for each item.
Thus, the total score for explanation accuracy ranged from 0 to 9. We converted
the scores into percentages.

SRL Measures

Self-report questionnaire—To compare students’ self-perceived SRL at the
beginning and end of the study, we used a 25-item self-report questionnaire
adapted from Montague and Bos (1990) and Kramarski and Mizrachi (2006).
The questionnaire assessed students’ self-perceived strategy use along the
mathematical problem-solving process: (a) cognitive strategies like memorizing
(e.g., “Prior to attempting to solve the problem, I try to remember a similar
task that I already solved”); (b) metacognitive strategies like monitoring (e.g.,
“During the solution process I ask myself if the solution makes sense”); and
(c) evaluation strategies (e.g., “After I solved the problem, I thought about
alternative ways of solving the problem”). Participants rated each item on a
5-point Likert-type scale ranging from 1 (never) to 5 (always). Cronbach alpha
reliability coefficient was .85.
Metacognitive feedback during online inquiry discussion of the authentic task—
For students in the GFG and SEG groups only, metacognitive feedback was coded
according to the following four categories of cognition regulation (Schraw &
Dennison, 1994): planning (e.g., “We suggest finding the pattern and not counting
the trees”); monitoring the solution process (e.g., “A drawing is missing from your
answer”); debugging errors (e.g., “You have a mistake in question 2; the multipli-
cation is incorrect”); and evaluation (e.g., “Your explanation lacks depth”).
For each metacognitive feedback category, each student’s score was calculated
as the total number of references he/she provided to that category during the
online discussion, divided by the total number of statements in the forum of the
posttest (i.e., 60). Scores were converted into percentages. Two experts in SRL
scored students’ metacognitive online feedback. Inter-judge reliability (Cohen’s
kappa) was 0.87. The Appendix (right-hand column) exemplifies scoring for
metacognitive feedback categories based on excerpts from online discussion.

Procedure

Instruction began in classrooms in the second academic semester after obtaining
consent from teachers and the school principal, and continued for 5 weeks. At
pretest, all students completed the paper-and-pencil pre-knowledge mathematics
test. The SRL self-report was administered to all students at pretest and posttest by
the teachers in their classrooms. GFG and SEG students’ feedback (mathematical
and metacognitive) during the online forum discussion of the authentic task was
analyzed. At posttest, a near-transfer authentic mathematical inquiry task (PISA,
2003) was administered to all students: online for GFG and SEG students and
as a paper-and-pencil task for CONT students. Also at posttest, all students
completed a paper-and-pencil test of more formal mathematical problem-solving
(far-transfer) tasks. Students were told that the questionnaire’s purpose was to
determine the effectiveness of learning materials in mathematics.

RESULTS

Mathematical Measures

Pre-Knowledge Mathematical Test

One-way analysis of variance (ANOVA) indicated that, at pretest, no significant
differences emerged between the three groups on prior mathematical knowledge,
F(2, 97) = 2.01, p > .05, η² = 0.08. Means were 83.30, 81.23, and 82.04,
respectively, for GFG, SEG, and CONT (SD = 16.80, 15.70, and 14.70).
Authentic Mathematical Inquiry Task Performance

To examine the GFG, SEG, and CONT students’ mathematical problem-solving
and mathematical explanation skills, we performed one-way ANOVAs on
students’ solution accuracy, explanation accuracy, and explanation quality
scores for the authentic mathematical inquiry task. Table 1 presents
means and standard deviations of task scores by guidance method.
Results indicated significant differences between the groups (GFG, SEG, and
CONT) in accuracy of mathematical problem solving, F(2, 97) = 7.96, p < .001,
η² = 0.25. Effect-size analysis indicated that the GFG students significantly
outperformed both the SEG students (Cohen’s d = 0.58) and the CONT students
(Cohen’s d = 0.92) in their accuracy of mathematical problem-solving. The SEG
students also outperformed the CONT students (Cohen’s d = 0.35). Significant
differences also emerged between the groups in providing accurate mathematical
explanations, F(2, 97) = 9.73, p < .001, η² = 0.28. The GFG students significantly
outperformed both the SEG students (Cohen’s d = 0.63) and the CONT students
(Cohen’s d = 1.33). Also, the SEG students outperformed the CONT students
(Cohen’s d = 0.70).
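Readers who wish to check the reported effect sizes against the descriptive statistics in Table 1 can compute Cohen’s d as the standardized mean difference between two groups. A minimal sketch using the pooled-standard-deviation variant follows; the paper does not state which variant the authors used, so values computed this way may differ slightly from those reported.

```python
# Cohen's d with a pooled standard deviation. The study does not state which
# variant of the formula it used, so this is an illustrative sketch; values
# computed this way approximate, but need not exactly match, the reported
# effect sizes.
from math import sqrt


def cohens_d(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference between two independent groups."""
    pooled_sd = sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd


# Solution accuracy, GFG (M = 86.71, SD = 19.90, n = 32) vs.
# SEG (M = 74.48, SD = 23.87, n = 32), from Table 1:
print(round(cohens_d(86.71, 19.90, 32, 74.48, 23.87, 32), 2))  # -> 0.56 (reported: 0.58)
```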
Table 1. Means and Standard Deviations of Percentages for Solving
Authentic Mathematical Inquiry Task by Guidance Method

                                    Group feedback   Self-explanations   Control
Task                                (n = 32)         (n = 32)            (n = 36)

Solution accuracy
  M                                 86.71            74.48               67.43
  SD                                19.90            23.87               19.30

Explanation accuracy
(mathematical > non-mathematical)
  M                                 49.88            37.79               24.33
  SD                                21.09            20.17               16.54

Explanation quality
(conceptual > procedural)
  M                                 68.30            60.40               50.20
  SD                                14.30            13.80               15.20

Note. Range 0-100.

Further analysis of only those explanations that were correct indicated significant
differences between groups on argument quality, F(2, 97) = 3.97, p < .05,
η² = 0.16. The GFG students significantly outperformed both the SEG students
(Cohen’s d = 0.53) and the CONT students (Cohen’s d = 1.26), providing
more conceptual and fewer procedural arguments. Also, SEG students outper-
formed CONT students (Cohen’s d = 0.71).

Mathematical Feedback

To compare the GFG and SEG groups’ ability to provide effective mathematical
feedback during online inquiry discussion, we performed a multivariate analysis
of variance (MANOVA), followed by an ANOVA for each feedback category.
Table 2 presents means and standard deviations of online mathematical feedback
by guidance method.
The MANOVA on providing mathematical feedback yielded significant differences
between groups, F(4, 59) = 7.68, p < .01, η² = 0.24. Further ANOVA
analysis indicated significant differences between groups in four of the five
feedback categories: mathematical terms, F(1, 62) = 5.03, p < .05, η² = 0.16;
mathematical representations, F(1, 62) = 11.15, p < .001, η² = 0.32; mathematical
explanations, F(1, 62) = 7.46, p < .001, η² = 0.28; and final solution accuracy,
F(1, 62) = 4.90, p < .05, η² = 0.14. The GFG students significantly outperformed
the SEG students in giving feedback (see Table 2) that referred to mathematical
representations (Cohen’s d = 0.70) and to mathematical explanations (Cohen’s
d = 0.56). However, the SEG group outperformed the GFG group in referring
to mathematical terms (Cohen’s d = 0.40) and in accuracy of the final
solution (Cohen’s d = 0.32). No differences between groups emerged regarding
non-mathematical statements, F(1, 62) = 2.58, p > .05, Cohen’s d = 0.24,
η² = 0.09.

Table 2. Means and Standard Deviations of Percentages for Providing
Mathematical Feedback by Guidance Method

                              Group feedback   Self-explanations
Feedback category             (n = 32)         (n = 32)

Mathematical terms
  M                           7.08             14.09
  SD                          15.02            19.20

Mathematical representations
  M                           26.04            18.93
  SD                          12.71            7.60

Mathematical explanations
  M                           25.06            16.15
  SD                          18.19            13.48

Final solution accuracy
  M                           25.21            32.03
  SD                          19.72            22.31

Non-mathematical statements
  M                           16.61            19.80
  SD                          10.90            15.47

Note. Scores are percentages, calculated as the total references provided for each
category divided by the total statements in the forum, multiplied by 100.

Transfer of Learned Mathematical Skills

To examine the GFG, SEG, and CONT students’ ability to transfer mathematical
learning to a formal (far) transfer test, we performed a MANOVA, followed
by an ANOVA on each skill: procedural, problem solving, problem posing,
and providing mathematical explanations. Table 3 presents means and standard
deviations of the paper-and-pencil transfer test scores by guidance method.
The MANOVA yielded significant differences between groups, F(4, 95) = 9.78,
p < .001, η² = 0.28. Further ANOVA analyses indicated significant differences
between groups in the transfer test regarding three skills: problem solving,
F(2, 97) = 7.98, p < .01, η² = 0.25; posing a problem, F(2, 97) = 4.20, p < .05,
η² = 0.17; and providing mathematical explanations, F(2, 97) = 16.18, p < .001,
η² = 0.36. No significant differences between groups emerged in procedural
skills, F(2, 97) = 1.03, p > .05, η² = 0.12.
In problem-solving skills, the GFG students significantly outperformed both
the SEG students (Cohen’s d = 0.66) and the CONT students (Cohen’s d = 0.78).
No significant differences emerged between the SEG and CONT students
(Cohen’s d = 0.13). The same pattern appeared for mathematical explanations:
the GFG students significantly outperformed both the SEG students (Cohen’s
d = 0.94) and the CONT students (Cohen’s d = 1.52), and the SEG students
outperformed the CONT students (Cohen’s d = 0.57). Likewise, for problem-posing
skills, the GFG students significantly outperformed both the SEG students
(Cohen’s d = 1.29) and the CONT students (Cohen’s d = 1.53). However, no
significant differences emerged between the SEG and CONT students (Cohen’s
d = 0.24).

Table 3. Means and Standard Deviations of Percentages for Transfer of
Problem-Solving Tasks by Guidance Method

                              Group feedback   Self-explanations   Control
Skill                         (n = 32)         (n = 32)            (n = 36)

Procedural
  M                           89.17            88.78               86.72
  SD                          14.30            15.52               14.38

Problem solving
  M                           86.43            71.39               68.53
  SD                          19.90            26.20               22.45

Problem posing
  M                           72.03            52.03               48.34
  SD                          14.62            15.13               16.71

Mathematical explanations
  M                           56.10            41.53               32.72
  SD                          14.70            15.00               16.32

Note. Range 0-100.

SRL Measures

Self-Report Questionnaire

To compare the three groups’ self-perceived SRL at pretest, we performed a
MANOVA that revealed no significant pretest inter-group differences regarding
self-regulation measures, F(3, 96) = 2.79, p > .05, η² = 0.08. Table 4 presents
means and standard deviations of SRL measures by guidance method and time.
Further analysis using a two-way repeated-measures analysis of variance
(3 methods × 2 time points) indicated significant main effects of time for
students’ perceptions of each SRL component: cognitive strategies and the two
metacognitive strategies (monitoring and evaluation), F(2, 97) = 14.64, 7.56,
and 5.65, respectively, p < .01; η² = 0.23, 0.21, and 0.16, respectively. In
addition, significant interactions emerged between method and time: for
monitoring, F(2, 97) = 4.78, p < .05, η² = 0.19, and for evaluation, F(2, 97) = 3.97,
p < .05, η² = 0.16. Findings indicated that at posttest, GFG students reported
significantly more improvement in their use of self-monitoring and evaluation
strategies (Cohen’s d = 0.86 and 0.40, respectively) compared to SEG students
(Cohen’s d = 0.45 and 0.29, respectively) and CONT students (Cohen’s d = 0.34
and 0.11, respectively). However, no significant differences emerged between
the SEG and CONT students in their self-perceived monitoring and evaluation
(Cohen’s d = 0.09 and 0.26, respectively). No significant interactions emerged
between method and time in improving the use of self-regulated cognitive
strategies while problem solving, F(2, 97) = 0.35, p > .05.

Table 4. Means and Standard Deviations of Self-Regulated Learning
Measures by Guidance Method and Time

             Group feedback      Self-explanations    Control
             (n = 32)            (n = 32)             (n = 36)

Skill        Pre      Post       Pre      Post        Pre      Post

Cognitive
  M          3.71     3.84       3.69     3.81        3.70     3.78
  SD         0.41     0.34       0.33     0.32        0.37     0.40

Metacognitive

Monitoring
  M          3.81     4.13       3.90     4.03        3.87     4.00
  SD         0.28     0.32       0.31     0.30        0.38     0.35

Evaluation
  M          4.06     4.21       4.01     4.11        3.98     4.02
  SD         0.36     0.33       0.34     0.35        0.37     0.38

Total
  M          3.87     4.05       3.86     3.98        3.85     3.93
  SD         0.34     0.35       0.37     0.32        0.37     0.38

Note. Range 1-5.

Metacognitive Feedback

To compare GFG and SEG groups’ ability to provide effective metacognitive
feedback during online inquiry discussion, we performed a MANOVA, followed
by an ANOVA for each feedback category. Table 5 presents means and standard
deviations of online metacognitive feedback by guidance method.
The MANOVA on providing metacognitive feedback yielded significant differences
between groups, F(4, 59) = 6.78, p < .01, η² = 0.19. Further ANOVA
analysis indicated significant differences between groups in all four feedback
categories: planning, F(1, 62) = 3.90, p < .05, η² = 0.12; monitoring,
F(1, 62) = 4.30, p < .05, η² = 0.16; debugging, F(1, 62) = 4.40, p < .05, η² = 0.13;
and evaluation, F(1, 62) = 5.63, p < .05, η² = 0.17. The GFG students significantly
outperformed the SEG students in three categories: monitoring (Cohen’s
d = 0.31), debugging (Cohen’s d = 0.33), and evaluation (Cohen’s d = 0.40).
However, the SEG outperformed the GFG on planning strategies (Cohen’s d = 0.35).

DISCUSSION

The purpose of this exploratory study was to comprehensively investigate
online mathematical inquiry (problem solving, explanations, and feedback using
forum discussions) and transfer ability and SRL processes among students
exposed to one of two metacognitive support methods (SEG or GFG), as com-
pared to a control group. We found that students exposed to online guidance
in providing and receiving group feedback (GFG), based on the IMPROVE
self-questioning strategy, significantly outperformed students of the other two
groups: SEG students who also studied online but focused on metacognitive
guidance regarding explanation skills, and CONT students who received no
special metacognitive guidance. Better performance was notable in the GFG
group for mathematical problem solving and for the ability to provide higher-order
conceptual explanations based on logical-formal conclusions. Furthermore,
the SEG students outperformed the CONT students on the same measures.

Table 5. Means and Standard Deviations of Percentages for Providing
Metacognitive Feedback by Guidance Method

                              Group feedback   Self-explanations
Feedback category             (n = 32)         (n = 32)

Planning
  M                           32.84            39.48
  SD                          19.02            18.97

Monitoring
  M                           25.72            22.96
  SD                          9.12             8.61

Debugging
  M                           16.36            14.83
  SD                          4.11             5.14

Evaluation
  M                           25.08            22.73
  SD                          5.17             6.68

Note. Scores are percentages, calculated as the total references provided for each
category divided by the total statements in the forum, multiplied by 100.
Students’ online mutual mathematical feedback while attempting to solve an
authentic mathematical task indicated a significant difference between the GFG
and SEG groups’ ability to articulate their thoughts in mathematical terms. The
GFG students displayed a higher tendency to refer to mathematical representations
and explanations. The SEG students referred more often to mathematical terms,
and to the solution’s accuracy.
Moreover, GFG students outperformed SEG students and CONT students
in their ability to (far) transfer their learning to various tasks that required
high-order skills in a paper-and-pencil test, such as the ability to pose a mathe-
matical problem. No significant differences between the SEG and CONT students
emerged in transferring their problem-solving skills to new tasks. However,
the SEG students did outperform the CONT students in one measure, the quality
of their explanations for the new tasks, providing conceptual rather than merely
procedural justifications for their solutions to transfer tasks.
Regarding students’ outcomes on SRL processes, the GFG students outperformed
both the SEG and the CONT students on self-perceived SRL (monitoring
and evaluation). No differences between the SEG and CONT groups emerged on
these SRL measures. Students’ online metacognitive feedback indicated a signifi-
cant difference between the GFG and SEG groups’ ability to metacognitively
articulate their thoughts. GFG students displayed a higher tendency to consider the
solution process by monitoring, debugging, and evaluation. The SEG students
referred more often to strategy planning. The current findings raise some issues
and implications for further deliberation regarding the role of metacognitive
support in online mathematical inquiry, the ability to transfer problem-solving
skills, and the different aspects of SRL assessment.

Metacognitive Support in Online Mathematical Inquiry

Several possible reasons may explain the beneficial effect of online meta-
cognitive support (GFG and SEG groups versus CONT group) on mathematical
inquiry. First, making problem-solving strategies explicit online, using the
metacognitive IMPROVE tools, can help students think about the steps they
need to take to solve the problem, and can help them articulate their
mathematical thoughts. When students explain and justify their thinking, and
challenge the explanations of their peers, they also engage in clarifying their own
thinking and recognizing potential conflict points for further discussion (e.g.,
Kramarski & Mizrachi, 2006; Lampert, 1990).
We confirmed the assumption of SRL models that metacognitive regulation
needs feedback about strategy use (Butler & Winne, 1995), and our findings
substantiate previous research conclusions regarding the differential effects of
feedback types on computer-based learning (Kramarski & Zeichner, 2001;
Moreno, 2004). For example, Moreno found that exposing novice students to
multimedia that were supported by directed feedback helped them attribute
meaning to the process and promoted deeper learning than identical materials
using corrective feedback alone. Directed feedback, which produced higher
transfer scores, was attributed to reductions in cognitive load. However, as
discussed earlier, these studies applied external feedback provided by the
computer, unlike the current study that implemented students’ own feedback in
online discussion. We suggest that further research investigate students’ per-
formance more comprehensively under different conditions of online embedded
feedback, and examine how feedback relates to student variables such as reduc-
tions in mental load.

Ability to Transfer Problem-Solving Skills


Our study substantiates previous findings which demonstrated that metacognitive
support strengthens students’ transfer ability, even in situations where such
support is removed (e.g., Biswas, Schwartz, Leelawong, & Vye, 2005; Davis &
Linn, 2000; King, 1991; Kramarski & Gutman, 2006; Kramarski & Mizrachi,
2006; Moreno, 2004). Our outcomes indicated positive results for both meta-
cognitive support methods (GFG and SEG) on the paper-and-pencil transfer
measures. The GFG students succeeded more in problem-solving processes and
conceptual mathematical explanations compared to the other groups; however,
the SEG students succeeded more in conceptual mathematical explanations
compared to the CONT students. The SEG group’s mathematical explanations
advantage can be explained by the fact that providing explanations can be con-
sidered near transfer as it resembles the type of training SEG students received.
In contrast, the CONT students exhibited the lowest gain in transfer ability
despite the fact that the transfer test was provided in the same paper-and-pencil
format in which they trained (near-transfer).
As conceptualized by Cooper and Sweller (1987), three variables contribute
to transfer ability: learners must master strategies for problem solving, develop
categories for sorting tasks that lead to similar solutions, and know how to
relate to previous knowledge for solving novel tasks. It seems that the SRL and
mathematical conceptual understandings that were empowered by the GFG meta-
cognitive support were a springboard for transferring students’ learning ability
to various tasks that required high-order skills (posing a problem) in a new
situation (paper-and-pencil) and in another context (formal task).

SRL Measures: Self-Perceived SRL and Metacognitive Feedback Discussion

Our study is grounded in socio-cognitive theories of learning, which extend
the view of metacognition to encompass not only self-directed dialogue but
social aspects as well. Such aspects include group discussion between peers of
comparable expertise, thus making the processes of monitoring and regulation
overt (Salonen et al., 2005). The current study examined the process of SRL in
two complementary ways within one comprehensive experimental framework:
as analyzed from the self-perceived SRL and metacognitive feedback. Both
measures supported the conclusion that the GFG method was more effective in
developing students’ SRL. Our findings support previous conclusions that self-
questioning offers metacognitive tools that may help learners shift their attention
from procedural thinking to a metacognitive processing level, whereby they
consider strategies, establish sub-goals, and evaluate moves (Ge, Chen, & Davis,
2005; Kramarski & Gutman, 2006).
Several findings need further consideration. Why did the GFG students exhibit
higher SRL levels on both measures than the SEG group? Two possible reasons
may be suggested. First, group discussion requires sharing of understandings
in such a way as to not only activate the relevant prior knowledge but also
to implement social skills such as clarifications, feedback, and help seeking.
Perhaps the feedback self-questioning method was more powerful in meeting
such demands, unlike the SEG learning method that requires more skills-based
knowledge and thus stimulates social skills less. Indeed, the GFG students more
often used high-order discussion by referring to mathematical representations
and explanations, and they addressed more metacognitive aspects like monitoring,
debugging, and evaluation, whereas the SEG students more often referred to
basic elements of discussion like mathematical terms, accuracy of solutions, and
strategy planning.
Second, although self-explanation is a key metacognitive strategy that sup-
ports learning with understanding, and the ability to apply learned knowledge to
problem-solving tasks (Chi, 2000), self-explanation is itself a complex cognitive
task. In the context of problem solving, it requires two simultaneous, coordinated
“processes”: one that develops a sequence of steps to solve the problem, and a
second that monitors and evaluates the accuracy and efficiency of that problem-
solving process. Analyzing discrepancies and making corrections adds further
complexity to the self-monitoring task. In contrast, while providing feedback,
learners operate their own process to generate a solution, and then compare it
against a partner’s solution. The two processes need not be performed simul-
taneously. This might reduce cognitive load, develop the awareness and capacity
to compare solutions, and, with time, make it easier to turn this capacity inward
(Biswas et al., 2005).
Why didn’t the SEG group exhibit higher self-perceived SRL levels than
the CONT group? Perhaps the self-explanation strategy’s complexity prevented
SEG students from utilizing the online facilities and metacognitive support to
develop the social-cognitive skills essential for SRL (Zimmerman, 2000) any
better than the CONT group did. Further research should examine more deeply
and explicitly the
development of social-cognitive skills in online mathematical inquiry environ-
ments under different group-metacognition supports.

Practical Implications, Future Research, and Limitations

The current study makes an important theoretical and practical contribution
concerning the fostering of group-metacognition as an essential part of SRL for
online mathematical inquiry, specifically suggesting the merit of self-questioning
support using group feedback. SRL and mathematical inquiry in online environ-
ments using differential self-questioning supports comprise a relatively new topic
that has not yet been investigated in mathematics education. Our study supports
theoretical socio-cognitive SRL models (e.g., Zimmerman, 2000), and we suggest
that attention to self-questioning as a springboard for fostering mathematical
inquiry and SRL dimensions should be a continuing goal. Further studies should
devise and apply other metacognitive models for students studying in different
technology environments.
Although this study potentially offers contributions to theoretical research
and practical implications, we nevertheless recognize several inherent
limitations. First, assigning students to conditions as entire classes, with different
teachers, creates possible confounding variables. We attempted to reduce this
confounding effect by ensuring that the three participating classes were similar
in their pretest math achievements, heterogeneity, teacher background, training
methods for the different metacognitive programs, and supervision to help ensure
adherence to implementation of the instructional methods. Yet, it is possible
that implementation of each support by only one teacher and one classroom
could have confounded the classroom with the instructional support and that
the observed differences stemmed from teacher characteristics like different
levels of success or willingness to implement a metacognitive program. We
propose that further research should examine the effects of different metacognitive
supports on a larger scale, among many schools, classes, and teachers. We also
suggest that researchers investigate teachers’ willingness and self-efficacy to
teach mathematics while using metacognitive supports.
Second, our research design comprised two experimental groups exposed
to online environments and a control group that practiced paper-and-pencil tasks.
Although the direct posttest was implemented in the same format in which each
group trained (computer or paper-and-pencil, i.e., near transfer), these different
formats might have confounded findings. Future research should attempt to
verify the current findings by exposing control groups to the same technology
environment but without metacognitive support and by asking control participants
to complete a far-transfer (computer-based) task as well.
Third, the study investigated the effects of different group-metacognitive
supports on ninth graders’ online mathematical inquiry, immediately at the end
of the study. Further research would do well to conduct a long-term follow-up
for this claim (e.g., at 6 months and 12 months after intervention). We sug-
gest that effects be examined on other mathematical topics, transfer tasks
in different formats (near and far; computerized and paper-and-pencil), and
other ages.
Fourth, this study investigated group-metacognition by implementing online
mathematical inquiry in small groups (of four students each). We do not know
whether or not these results are generalizable to other kinds of online forums
such as pairs or large groups, calling for future research.
Finally, we recognize the need to deepen the use of mixed methods to assess
mathematical inquiry and SRL processes under different instructional supports.
For example, as typifying self-reports, the current self-perceived SRL measure
may tap conceptual awareness of those skills, ability to verbalize them due to
explicit training (GFG), or willingness to report engagement in them, rather
than actual skills. Methods that analyze students’ verbalizations such as thinking
aloud, observations, log files, and forum discourse may shed further light on
the differential benefits of various group-metacognitive supports.
Although we acknowledge some limitations in this exploratory study, its results
support recommendations to capitalize on group-metacognitive support in online
mathematical inquiry discussion to enhance learning opportunities in mathematics
instruction (e.g., NCTM, 2000). We recognize the need to continue unraveling
how online mathematical inquiry and SRL emerge in different metacognitive
support environments using individual regulation and/or co-regulation. Future
researchers would do well to apply the current empirical directions to other
subject matters, age groups, perspectives, and populations; for example, com-
paring different kinds of online communities of learners such as large versus small
groups, older versus younger students, or children with versus without learning
disabilities. In particular, similar future outcomes among younger students will
support our recommendation to build a mathematical and SRL culture in schools
(NCTM, 2000). This work is a step in that direction.

APPENDIX
Example of Online Discussion Feedback on Mathematical
and Metacognitive Aspects

Excerpt 1: “You wrote that according to the patterns of change of the conifer (8n)
and the apple (n²) trees, you will substitute 8 and then find that the amounts of
trees are equal (64). We suggest another way to reach the solution: 8n = n²;
n² – 8n = 0; n(n – 8) = 0. By solving the equation we found two solutions: n = 0
or n = 8, but the answer is n = 8; the solution n = 0 doesn’t fit the situation.”
Mathematical feedback types: mathematical representation; mathematical
explanations; final solution accuracy.
Metacognitive feedback types: monitoring the solution process; planning.

Excerpt 2: “. . . with regard to the pattern, you wrote that each line of apple trees
increases by 8 trees (i.e., n + 8). However, the pattern should be n*8. Please
correct the mistake.”
Mathematical feedback types: mathematical terms; mathematical representation.
Metacognitive feedback type: debugging.

Excerpt 3: “You are great. You modified your solution. Now this is a good piece
of work. The table you added helped to understand the algebraic pattern; the
wording is excellent. I really enjoyed your work.”
Mathematical feedback types: mathematical representation; mathematical
explanation; non-mathematical statement.
Metacognitive feedback type: evaluation.

Note. Feedback refers to the question: Is it possible that the apple trees’ number
equals the conifer trees’ number, for any number (n) of rows of apple trees?
Explain how you found your answer.

REFERENCES
Aleven, V., & Koedinger, K. R. (2002). An effective metacognitive strategy: Learning by
doing and explaining with a computer-based cognitive tutor. Cognitive Science, 26,
147-179.
Artz, A., & Yaloz-Femia, S. (1999). Mathematical reasoning during small-group problem
solving. In L. Stiff & F. Curcio (Eds.), Developing mathematical reasoning in grades
K-12: 1999 yearbook of National Council of Teachers of Mathematics (pp. 115-126).
Reston, VA: National Council of Teachers of Mathematics.
Azevedo, R. (2005). Computer environments as metacognitive tools for enhancing learn-
ing. Educational Psychologist, 40(4), 193-197.
Azevedo, R., & Bernard, R. M. (1995). A meta-analysis of the effects of feedback
in computer-based instruction. Journal of Educational Computing Research, 13(2),
111-127.
Azevedo, R., & Cromley, J. G. (2004). Does training on self-regulated learning facilitate
students’ learning with hypermedia? Journal of Educational Psychology, 96(3),
523-535.
Biswas, G., Schwartz, D., Leelawong, K., & Vye, N. (2005). Learning by teaching: A
new agent paradigm for educational software. Applied Artificial Intelligence, 19(3),
363-392.
Brown, A. L. (1987). Metacognition, executive control, self-regulation and other mysteri-
ous mechanisms. In F. Weinert & R. Kluwe (Eds.), Metacognition, motivation and
understanding (pp. 65-115). Hillsdale, NJ: Erlbaum.
Brown, A. L., & Campione, J. C. (1994). Guided discovery in a community of learners.
In K. McGilly (Ed.), Classroom lessons: Integrating cognitive theory with classroom
practice (pp. 229-270). Cambridge, MA: MIT Press.
Butler, D. L., & Winne, P. H. (1995). Feedback and self-regulated learning: A theoretical
synthesis. Review of Educational Research, 65(3), 245-281.
Chi, M. T. H. (2000). Self-explaining expository texts: The dual processes of generating
inferences and repairing mental models. In R. Glaser (Ed.), Advances in instructional
psychology (pp. 161-237). Mahwah, NJ: Erlbaum.
Cooper, G., & Sweller, J. (1987). Effects of schema acquisition and rule automation
on mathematical problem-solving transfer. Journal of Educational Psychology, 79,
347-362.
Davis, E. A., & Linn, M. C. (2000). Scaffolding students' knowledge integration: Prompts
for reflection in KIE. International Journal of Science Education, 22(8), 819-837.
Flavell, J. (1979). Metacognition and cognitive monitoring: A new area of cognitive-
developmental inquiry. American Psychologist, 34, 906-911.
Ge, X., Chen, C-H., & Davis, K. A. (2005). Scaffolding novice instructional designers'
problem-solving processes using question prompts in a web-based learning environ-
ment. Journal of Educational Computing Research, 33(2), 219-248.
Ge, X., & Land, S. M. (2003). Scaffolding students’ problem-solving processes in
an ill-structured task using question prompts and peer interactions. Educational
Technology Research and Development, 51(1), 21-38.
Goos, M., Galbraith, P., & Renshaw, P. (2002). Socially mediated metacognition:
Creating collaborative zones of proximal development in small group problem solving.
Educational Studies in Mathematics, 49, 193-223.
Kauffman, D. F., Ge, X., Xie, K., & Chen, C. H. (2008). Prompting in web-based environ-
ments: Supporting self-monitoring and problem solving in college students. Journal
of Educational Computing Research, 38(2), 115-137.
King, A. (1991). Effects of training in strategic questioning on children’s problem-solving
performance. Journal of Educational Psychology, 83(3), 307-317.
King, A. (1992). Facilitating elaborative learning through guided student-generated ques-
tioning. Educational Psychologist, 27(1), 111-126.
Kramarski, B., & Gutman, M. (2006). How can self-regulated learning be supported in
mathematical e-learning environments? Journal of Computer Assisted Learning, 22,
24-33.
Kramarski, B., & Mevarech, Z. R. (2003). Enhancing mathematical reasoning in the
classroom: Effects of cooperative learning and metacognitive training. American
Educational Research Journal, 40(1), 281-310.
Kramarski, B., & Mizrachi, N. (2006). Online discussion and self-regulated learning:
Effects of instructional methods on mathematical literacy. Journal of Educational
Research, 99(4), 218-230.
Kramarski, B., & Zeichner, O. (2001). Using technology to enhance mathematical reason-
ing: Effects of feedback and self-regulation learning. Educational Media International,
38(2/3), 77-82.
Kramarski, B., & Zoltan, S. (2008). Using errors as springboards for enhancing mathe-
matical reasoning with three metacognitive approaches. The Journal of Educational
Research, 102(2), 137-151.
Lampert, M. (1990). When the problem is not the question and the solution is not the
answer: Mathematical knowing and teaching. American Educational Research
Journal, 27(1), 29-63.
Lin, X. (2001). Designing metacognitive activities. Educational Technology Research
and Development, 49, 23-40.
McClain, K., & Cobb, P. (2001). An analysis of development of sociomathematical norms
in one first-grade classroom. Journal for Research in Mathematics Education, 32,
236-266.
Mevarech, Z. R., & Kramarski, B. (1997). IMPROVE: A multidimensional method for
teaching mathematics in heterogeneous classrooms. American Educational Research
Journal, 34, 365-394.
Montague, M., & Bos, C. S. (1990). Cognitive and metacognitive characteristics of
eighth-grade students’ mathematical problem solving. Learning and Individual
Differences, 2, 371-388.
Moreno, R. (2004). Decreasing cognitive load for novice students: Effects of explanatory
versus corrective feedback in discovery-based multimedia. Instructional Science, 32,
99-113.
National Council of Teachers of Mathematics (NCTM). (2000). Principles and standards
for school mathematics. Reston, VA: Author.
Nussbaum, E. M., & Sinatra, G. M. (2002). On the opposite side: Argument and conceptual
engagement in physics. Paper presented at the meeting of the American Educational
Research Association, New Orleans, LA.
Oh, S., & Jonassen, D. H. (2007). Scaffolding online argumentation during problem
solving. Journal of Computer Assisted Learning, 2, 95-110.
Palincsar, A. S., & Brown, A. L. (1984). Reciprocal teaching of comprehension-fostering
and comprehension-monitoring activities. Cognition and Instruction, 1(2), 117-175.
Pintrich, P. R. (2000). Multiple goals, multiple pathways: The role of goal orientation in
learning and achievement. Journal of Educational Psychology, 92, 544-555.
Programme for International Student Assessment (PISA). (2003). Literacy skills for the
world of tomorrow: Further results from PISA 2000. Paris: Author.
Renkl, A. (2002). Learning from worked-out examples: Instructional explanations supple-
ment self-explanations. Learning and Instruction, 12, 529-556.
Salonen, P., Vauras, M., & Efklides, A. (2005). Social interaction: What can it tell
us about metacognition and coregulation of learning? European Psychologist, 10,
199-208.
Schoenfeld, A. H. (1992). Learning to think mathematically: Problem solving, meta-
cognition, and sense making in mathematics. In D. A. Grouws (Ed.), Handbook of
research on mathematics teaching and learning (pp. 165-197). New York: Macmillan.
Schraw, G., Crippen, K. J., & Hartley, K. (2006). Promoting self-regulation in science
education: Metacognition as part of a broader perspective on learning. Research in
Science Education, 36, 111-139.
Schraw, G., & Dennison, R. S. (1994). Assessing metacognitive awareness. Contempo-
rary Educational Psychology, 19, 460-475.
Veenman, M. V. J., Elshout, J. J., & Meijer, J. (1997). The generality vs. domain-specificity
of metacognitive skills in novice learning across domains. Learning and Instruction,
7(2), 187-209.
Veenman, M. V. J., Van Hout-Wolters, B., & Afflerbach, P. (2006). Metacognition
and learning: Conceptual and methodological considerations. Metacognition and
Learning, 1(1), 3-14.
Webb, N. M. (1989). Peer interaction and learning in small groups. International Journal
of Educational Research, 13, 21-39.
Zimmerman, B. J. (2000). Self-efficacy: An essential motive to learn. Contemporary
Educational Psychology, 25, 82-91.

Direct reprint requests to:


Dr. Bracha Kramarski
School of Education
Bar-Ilan University
Ramat-Gan 52900, Israel
e-mail: kramab@mail.biu.ac.il