RESEARCH AND TEACHING

Transforming the Lowest-Performing Students: An Intervention That Worked

By Louis Deslauriers, Sara E. Harris, Erin Lane, and Carl E. Wieman

Journal of College Science Teaching, Vol. 41, No. 6, 2012

We conducted a small-scale study to investigate if a brief, timely intervention focusing on specific study strategies would improve student performance in university science courses. We targeted low-performing students after the first midterm exam in two courses (enrollments of 67 and 185) with different student populations, one with students in a very selective physics program and the other with a broad range of students in a general science elective course. In this intervention, instructors either met personally with students or sent them a personalized e-mail. Students who met with an instructor and discussed specific study strategies improved their performance from one exam to the next by up to 32 ± 7%, without increasing their study time. These students also reported changing their study strategies during the term more than other students. Students who received an e-mail also improved their performance, but not more than would be expected without an intervention. These results show that a focused discussion advising a small number of specific study strategies can have a large impact on the lowest-performing students in contexts in which the new study strategies are aligned with course structure, expectations, and assessments. This is in sharp contrast to results obtained from most general study-skills interventions.

Most postsecondary science teachers are familiar with a depressing pattern: scores on early exams and assignments seem to establish how well nearly all the students will do in a course. Nothing the professor does in the many subsequent weeks of instruction seems to help students who start out poorly (Jensen & Moore, 2008). Several published studies (Freeman et al., 2007; Jensen & Moore, 2008, 2009; Moore, 2005) and countless unpublished examples illustrate optional educational interventions that were intended to improve the performance of low-performing postsecondary science students but were unsuccessful. These interventions include providing extra review sessions, practice problems, additional office hours, help rooms, and more. The high-performing students typically avail themselves of these additional resources and improve, whereas the lower-performing students do not (Jensen & Moore, 2008, 2009; Moore, 2005), though low performers self-motivated to seek help can show substantial gains with course-specific coaching (e.g., Chaplin, 2007). Hattie, Biggs, and Purdie's (1996) comprehensive meta-analysis of general learning-skills interventions with university students found that "the effects on study skills are minimal" (p. 126). A likely reason for this failure is that general study-skills interventions are too broad in their scope and too distant from where they will be applied.

General studies in psychology show that goals that are specific and reasonably achievable in the short run are effective at improving performance, whereas very general and/or long-term goals are not (Latham & Yukl, 1975; Steers & Porter, 1974). These relationships have been replicated in university classroom situations. For example, Morgan (1985) showed that students setting specific learning goals articulating what they would be able to do as a result of their near-term studying showed improved learning, whereas those who set distant goals or goals based on amount of time for studying did not. These researchers attribute much of the benefit of specific goals to the fact that these goals allow more effective self-monitoring of learning. Researchers on memory and meta-memory (Bjork, 1994) have shown that retention is improved by doing anything that involves more extensive processing of the material, and that self-testing is one of the most effective ways of carrying out such deeper processing. Targeted self-testing improves metacognition by providing students with a more accurate picture of what they have and have not mastered. Other strategies related to self-testing that improve self-monitoring include generating summaries (Thiede & Anderson, 2003) or even keywords (Thiede, Anderson, & Therriault, 2003). However, simple rereading after a delay, a standard review practice used by many students, increases neither self-monitoring accuracy nor performance (Dunlosky & Rawson, 2005), and worse, familiarity can instill a false sense of mastery (Reder & Ritter, 1992) under some circumstances.
Even upper-level science students do not automatically self-test, self-monitor, or make deliberate efforts to learn from mistakes (Mason & Singh, 2010). The study-skills intervention described here applies research on goal-setting and self-testing to course-specific contexts, with a deliberate focus on low-performing students not expected to seek help on their own.

A targeted intervention to improve study habits

We conducted a small-scale experiment in which the academic success of students who began poorly was greatly improved through a timely personal intervention by course instructors that concentrated on changing student study habits. Our intervention encouraged students to take a few specific actions dealing with deeper processing of the course material through self-testing, particularly as defined by the specific course learning goals. Although our choice of intervention advice was somewhat ad hoc, our approach aligns with tested theories about learning and goal setting. The criteria Morgan (1985) used for the goals students were trained to set are very similar to the criteria for the learning goals we articulated in the two courses studied here. Students were strongly encouraged in the interventions to use these specific learning goals and past assessments to self-test and to consult notes and supplementary reading material in a targeted manner.

Our setting

The intervention described here was conducted in two science courses with different student populations at a large, selective, public research university. Students at this Canadian university are comparable to those at the most elite U.S. public institutions (Wieman & Sudmant, 2008).

The first course in which we conducted an intervention was an introduction to modern physics (Course A), which covers standard topics in quantum mechanics and special relativity. Course A was taken by 67 students at the end of their second year in the engineering physics program, a selective and demanding program. The second was an introductory oceanography survey course (Course B). Course B attracted 185 students from a wide range of arts and science majors. The teaching in both courses was highly interactive, with clicker questions and peer discussion (Mazur, 1997) and many student questions. Course A also had regular small-group in-class activities, two midterm exams, a final exam, graded weekly homework, and preclass reading assignments. Course B had three midterm exams, a final exam, and five graded homework assignments. Both courses provided learning goals to all students, and exams were aligned with these goals.

The interventions conducted in the two courses share fundamental approaches but differ in details. We collected somewhat different data in each course. Although this means we do not have replication of all data sets, given the combination of information collected in both courses, we are able to discuss a broader range of issues regarding effects on students.

Who gets an intervention?

We identified students to target on the basis of low scores on the first midterm. In Course A, we targeted the 18 students who scored in the bottom quartile. In Course B, we targeted the 35 students who had failed Midterm 1. In Course B, we further divided the failing students on the basis of their self-reported study time (reported directly on the exam), under the assumption that those who studied more than the class median (six hours), yet failed, had inefficient study strategies and likely could benefit most (n = 16).

We timed each intervention to take place immediately after the first midterm, when students' interest in improving their study habits was likely to be high. Students received a personalized e-mail communicating that the instructors were concerned about their performance. In Course A, all 18 students were asked to meet with an instructor, and 13 did (2 of these 13 were out with extended illness after the intervention and so are not considered in the subsequent analysis and discussion). In Course B, the 16 students who both failed and reported studying more than six hours were asked to meet, and 7 did. These groups are considered the "meeting" intervention students. This approach is imperfect, because our meeting intervention students self-selected to respond to our invitations; however, we have no reason to expect that these particular students would have sought help on their own. In addition, in Course B, the remaining 19 students received a different e-mail, with no invitation to meet but with specific study advice. About 20% of the students in each course received some kind of intervention.

Face-to-face meeting interventions

Face-to-face intervention meetings typically lasted 15–25 minutes. In Course A, students met individually with an instructor; in Course B, up to three students met with an instructor simultaneously. Following a brief introduction, students were asked how they spent their study time. Commonly, they answered that they tried to memorize everything, without any indication that they had prioritized on the basis of relative importance. We then discussed how students could use the course learning goals more effectively to target their studying. We showed specific examples of learning goals (e.g., from Course B: "Compare the amount and type of energy emitted by objects at different temperatures"), practice questions and/or homework problems, and exam questions from the first midterm (e.g., aligned with the learning goal above: "This diagram shows the wavelengths of maximum emission for the sun and the earth. Where would the wavelength of maximum emission occur for a human?"), and pointed out how they were all related. We showed students that the exam was based on the learning goals and that each exam question addressed a learning goal. We also mentioned study tips that were based on Bjork (2001) and that recommend testing one's learning while studying. We emphasized that to improve exam performance, students should actively test themselves on the stated learning goals as they studied. Specific recommended actions included the following:

• Attempt to "do" each learning goal by generating your own explanations.
• Consult course resources (notes, reading, homework, sample problems and solutions) in a targeted manner, to improve your ability with a specific learning goal. Do not simply reread.
• Match available assessments (e.g., clicker questions, practice questions, homework problems) to specific learning goals, and test yourself on all of those items by creating your own responses before looking at answers. Imagine alternate ways to test the goal, and test yourself with your own (or other students') questions.
• Attend weekly (optional) problem-solving sessions (Course A only).

These recommended practices had been provided to all students in both classes before the first midterm. Nevertheless, in the intervention meetings, the study practices listed previously were seldom, if ever, mentioned by students.

E-mail interventions

Although our original intention was to meet with all the targeted students, five students in Course A and nine students in Course B who were invited to a face-to-face meeting with the instructor did not come. These students did not receive any additional study advice beyond what was provided to all students in these courses. In Course B, we compared these 9 students with the 19 students who received the e-mail with specific advice (those who failed Midterm 1 but studied less than six hours) and were not invited to a meeting. These two groups have very similar average Midterm 1 scores (p = .95) and similar gains on Midterm 2 compared with Midterm 1 (p = .85). On the basis of this comparison, we have combined all students who received any type of e-mail, with or without study advice, but did not meet with instructors, as the "e-mail" intervention students (n = 5 for Course A; n = 28 for Course B).

TABLE 1. Midterm scores and comparisons among groups.

Course A (three-group comparison: p < .001*)
  Group (n)     M1 mean (SD)  M2 mean (SD)  M2−M1 (SD)   p-value        Effect size
  Meeting (11)  65.5 (8.8)    84.0 (8.1)    19 ± 4 (13)  < .001#, .8§   1.6#, 0.14§
  E-mail (5)    58.6 (11.6)   75.3 (8.7)    17 ± 5 (12)  .003#          1.5#
  Others (48)   84.2 (9.6)    84.0 (9.6)    0 ± 2 (12)

Course B (three-group comparison: p < .001*)
  Group (n)     M1 mean (SD)  M2 mean (SD)  M2−M1 (SD)   p-value        Effect size
  Meeting (7)   37.1 (11.2)   69.1 (11.2)   32 ± 7 (17)  < .001#, .08§  2.2#, 0.76§
  E-mail (28)   38.7 (8.1)    58.7 (15.6)   20 ± 3 (15)  < .001#        1.3#
  Others (134)  70.9 (11.3)   71.9 (15.4)   1 ± 1 (14)

Note: All p-values are for the interaction between midterm score and treatment, from a repeated measures analysis of variance with midterm score as a within-subject factor and treatment as a between-subject factor. M1 = Midterm 1; M2 = Midterm 2.
* Comparison among three treatment groups (meeting, e-mail, and others)
# Comparison between others and each intervention group separately
§ Comparison between meeting and e-mail treatment groups

Surveys and interviews

To gather information about any changes students made to their study practices, the students in the bottom and next-to-bottom quartiles in Course A (n = 23) completed a formal verbal survey about their study practices shortly after the end of the term, about six weeks after the intervention. This survey was verbally administered and audio-recorded by a person uninvolved with the course.
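The between-group effect sizes reported in Table 1 can be reproduced from the table's own summary statistics. The sketch below assumes Cohen's d computed on the M2−M1 gains with a pooled standard deviation; the article does not state its effect-size formula, so this is a plausibility check rather than the authors' exact calculation.

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Cohen's d for two independent groups, using the pooled SD."""
    pooled_var = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
    return (mean1 - mean2) / math.sqrt(pooled_var)

# M2 - M1 gains (mean gain, SD of gains, n) taken from Table 1:
# meeting vs. others in each course.
course_a = cohens_d(19, 13, 11, 0, 12, 48)
course_b = cohens_d(32, 17, 7, 1, 14, 134)

print(round(course_a, 1), round(course_b, 1))  # 1.6 2.2
```

Both values match the "#" effect sizes in Table 1 (1.6 for Course A, 2.2 for Course B), which supports reading those entries as standardized mean differences on the exam-to-exam gains.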

Questions were asked about students' typical study practices, their study practices in Course A, whether and how they had changed their study practices during Course A, and what advice they would give other students, both specifically for Course A and for science courses in general. To categorize student self-reported study practices, the codes "notes," "homework," "learning goals," "book," "prereading," "review sessions," "studied more," and "other" emerged from evaluation of students' recorded open-ended responses to the verbal survey questions. Once these codes were established, responses were independently coded by two of the authors, with interrater reliability over 90%. In addition, 25 students in Course A, including most of the verbally surveyed students, participated in more informal, less-structured interviews with one of the course instructors. In Course B, at the end of term, students completed an online survey question about whether they changed their study habits.

Results and discussion

Intervention students improve performance

Both meeting and e-mail intervention groups significantly increased their average scores on Midterm 2 over Midterm 1 compared with those not in the intervention groups ("others" in Table 1 and Figure 1). A two-way repeated measures analysis of variance (ANOVA) with midterm scores as a within-subject factor and treatment (meeting, e-mail, and others) as a between-subject factor shows a significant interaction between midterm scores and treatment in both courses: F(2, 61) = 14.2, p < .001 for Course A; F(2, 166) = 32.1, p < .001 for Course B. The extremes in the meeting intervention group are remarkable, with a student improving from 49% to 91% in Course A and from 16% to 80% in Course B. Nearly all individuals in the intervention groups increased their scores from Midterm 1 to Midterm 2, whereas only about half of those not in the intervention groups increased their scores. Although two of our intervention groups are quite small (Course A e-mail and Course B meeting), effect sizes for the intervention groups range up to 2.2 for gains made by the meeting intervention students in Course B (Table 1).

FIGURE 1. Comparison of exam performance and gains among the meeting intervention groups, e-mail intervention groups, and all other students for Course A and Course B.

Improvement beyond "regression to the mean"

To test whether the lowest-performing students likely would have increased their Midterm 2 exam scores anyway, in Course B we compared (a) meeting intervention students (n = 7), (b) e-mail intervention students (n = 28), and (c) "usual low" students who failed Midterm 1 in this course during four other terms, when no intervention was offered (n = 164). The usual low group represents the expected regression to the mean for low-performing students in the absence of an intervention. A fourth group, "usual other" students who passed Midterm 1 during any of five offerings of the course (including the term in which interventions were offered; n = 574), represents regression to the mean for higher-performing students (Figure 2).

To account for variability in exam averages and variance in different terms, all Midterm 1 and Midterm 2 exam scores were transformed to z-scores within each exam. We ran a two-way repeated measures ANOVA with midterm z-scores as a within-subject factor and treatment (meeting, e-mail, and usual low) as a between-subject factor. There is a marginally significant interaction between midterm z-scores and treatment, F(2, 196) = 2.8, p = .06. Repeating this analysis with two treatments at a time shows that the meeting intervention students significantly outperformed usual low students, F(2, 169) = 5.5, p = .02, effect size = 0.91, and likely outperformed the e-mail intervention group, F(2, 33) = 3.3, p = .08, effect size = 0.77 (Figure 2).
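The within-exam z-score transformation used above can be sketched as follows. The article does not specify the variance convention, so the population-SD form below is an assumption, and the scores are illustrative.

```python
import math

def to_z_scores(scores):
    """Standardize one exam's scores to z-scores: (x - mean) / SD."""
    n = len(scores)
    mean = sum(scores) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in scores) / n)  # population SD
    return [(x - mean) / sd for x in scores]

# Each exam is standardized separately, so scores from terms with different
# exam averages and spreads sit on a common scale.
midterm1 = to_z_scores([37, 52, 64, 71, 78, 85, 93])
midterm2 = to_z_scores([55, 60, 68, 70, 74, 82, 88])

# A student's gain in z-units: how far they moved relative to the class.
gains = [m2 - m1 for m1, m2 in zip(midterm1, midterm2)]
```

After this transformation, each exam's z-scores have mean 0 and SD 1, which is what allows pooling students from different terms into the "usual low" and "usual other" comparison groups.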


However, the e-mail intervention group showed the same gain as the usual low group, F(2, 190) = 0.34, p = .56, effect size = 0.12. Receiving an e-mail from an instructor, in this case, produced the same increase in exam performance as simply knowing one failed the first exam, a typical "wake-up call" for students (Bonner & Holliday, 2006). The usual other group showed the expected regression toward the mean, with negative average gains (Figure 2). The only treatment in this large introductory class that produced a greater gain than might be expected simply because of regression toward the mean was a meeting intervention. For Course A, we lack data from previous terms for a parallel analysis. Possible interpretations of our existing Course A data include either (a) the improvement shown by intervention students is not beyond regression to the mean, or (b) in this higher-level class, e-mail and meeting interventions have approximately equivalent impacts. Additional data from future courses will help differentiate these options.

FIGURE 2. For Course B, comparison of M2−M1 z-scores among the meeting intervention group, e-mail intervention group, usual low performers who failed Midterm 1 during four other offerings of the course, and usual other performers who passed Midterm 1 in any of five terms, including the intervention term. Error bars represent standard error. M1 = Midterm 1; M2 = Midterm 2.

What did intervention students do differently?

Although these results establish that the meeting intervention had an effect, they do not determine why. Did students improve because they changed how they studied, changed how much time they studied, or were simply motivated to study harder because the instructor showed personal concern with their success?

Changing study habits in general

In Course B, in an online survey at the end of the term, students were asked, "Did you modify your study habits during the second half of the term?" Answer choices were (a) Yes, a lot; (b) Yes, a little; and (c) No. Of 102 students who responded, 80% of the meeting intervention students reported changing their study habits "a lot" during the second half of the term (n = 5), compared with only 25% of the e-mail intervention students (n = 16) and 15% of the nonintervention students (n = 81). Results from Fisher's Exact Test indicate that the distribution of answers from meeting intervention students is different from the responses from the other two groups (p = .006). These survey responses from Course B imply that the intervention (the meeting in particular) may have encouraged students to change their study tactics, but for Course B, we lack information regarding the specific changes they made.

Adopting recommended study strategies

For Course A, the formal verbal surveys of 23 students show that meeting intervention students adopted recommended study practices at much higher rates than nonintervention students. The two groups reported nearly identical rates of using recommended study practices to prepare for Midterm 1 (Figure 3, top). The meeting intervention students' verbal survey responses regarding Midterm 1 were consistent with what these students had said previously during the intervention meetings.
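Fisher's Exact Test, used for the survey comparison above, computes the probability of a contingency table directly from the hypergeometric distribution rather than from a large-sample approximation, which matters when a group is as small as the five meeting-intervention respondents. The article does not give the exact table it tested, so the sketch below shows a minimal two-sided 2×2 version with hypothetical counts; the two-sided convention (summing all tables no more likely than the observed one) is also an assumption.

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of every table with the same
    margins whose probability does not exceed that of the observed table.
    """
    row1, row2 = a + b, c + d
    col1 = a + c
    n = row1 + row2

    def prob(k):
        # P(k of row 1's items fall in column 1), given fixed margins
        return comb(col1, k) * comb(n - col1, row1 - k) / comb(n, row1)

    p_obs = prob(a)
    lo, hi = max(0, col1 - row2), min(row1, col1)
    return sum(prob(k) for k in range(lo, hi + 1)
               if prob(k) <= p_obs + 1e-12)

# Hypothetical counts, not the article's data: "changed a lot" vs. not,
# in a small group (4 of 5) and a large group (12 of 81).
p = fisher_exact_2x2(4, 1, 12, 69)
```

The 2×2 case is also available as `scipy.stats.fisher_exact`; for the full table implied by the three answer choices, R's `fisher.test`, which handles r×c tables via the Freeman-Halton extension, would be needed.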

However, 7 of 10 meeting intervention students reported newly adopting recommended practices to study for Midterm 2 (e.g., testing themselves, targeting the learning goals, consulting the book, attending review sessions), whereas only 1 of 13 higher-performing nonintervention students reported a change in these categories (Figure 3, bottom). One additional nonintervention student reported a change in studying for Midterm 2, but that change was to "study more" rather than to change tactics. This contrast between the two groups suggests that the meeting intervention influenced how these students studied later in the course.

FIGURE 3. For Course A, comparison of self-reported study tactics between the meeting intervention group (n = 10 surveyed) and nonintervention students in the second-lowest quartile (based on Midterm 1 scores) of the class (n = 13 surveyed). The four categories shown here are the four codes that correspond to tactics discussed in the intervention meetings. Original codes "book" and "prereading" are combined in this figure into "reading." Top: Study tactics reported for Midterm 1. Bottom: New study tactics (not done for Midterm 1) reported for Midterm 2.

No increase in study time

The meeting intervention students in Course B reported studying slightly less for Midterm 2 compared with Midterm 1 (a decrease from a median of 11 hours to 10 hours), though they still remained above the class median (6 hours per exam). These results imply that "studying harder" was not generally responsible for the improvement shown by meeting intervention students. Both subsets of the e-mail intervention groups (those who studied more than 6 hours and those who studied less than 6 hours) moved toward the class median study time for Midterm 2. The first group's median study time decreased from 10 to 6 hours, whereas the latter group's median study time increased from 3 to 8 hours. The increase in study time by the latter group might be expected simply from learning that one has failed the first exam.

Why did intervention students change?

Did intervention students improve solely because an instructor expressed concern about their performance and willingness to help? Research on communication behaviors has indicated that students are more motivated when they receive high levels of out-of-class support (Jones, 2008). Student perceptions regarding whether instructors care about their learning positively correlate with student self-reports of learning (e.g., McCroskey, Valencic, & Richmond, 2004; Teven & McCroskey, 1997; these studies did not measure performance). In informal interviews with instructors in Course A, some students did mention instructor concern as a motivating factor. Figure 2 indicates that receiving an e-mail from an instructor may not motivate students more than merely receiving the information that they failed the first exam. Thus, an e-mail appears to be below the threshold for an effect of instructor concern on performance, at least in Course B. It is possible that the meeting intervention students were responding to instructor concern, but (a) they also reported changing their study habits (Figure 3), and (b) they slightly decreased their study time (Course B). If a student's primary motivation was to try harder simply because he or she was informed that the instructor was paying attention, we might expect increased study time with little change in study tactics to be the most common response.

Although we cannot discount instructor concern as a motivating factor, our other data indicate that nearly all students who participated in intervention meetings made specific changes in study practices in response to the guidance given, allowing them to prepare more efficiently and effectively for assessments in these course contexts than other students.

Will recommended study strategies work in other courses?

Inventory surveys of student approaches to learning (Biggs, 1978, 1987; Entwistle, 2000; Entwistle & Ramsden, 1983; Pintrich, Smith, Garcia, & McKeachie, 1993) would not predict a large effect for the intervention we used. These studies consistently identify a division between learning approaches primarily based on simple memorization ("surface," "rehearsal") and approaches involving deeper, more extensive cognitive processing (Entwistle & McCune, 2004). Correlations between deeper learning approaches and course marks range from modest (r values of 0.1–0.2) to nonexistent, and surface approaches show small anticorrelations with marks. In a setting similar to ours, a science course in a large North American public university, Zusho, Pintrich, and Coppola (2003) found a correlation of 0.13 between course mark and "rehearsal" (memorization) learning strategies, and an insignificant but negative correlation with the deeper cognitive strategies of "organization," "elaboration," and "metacognition."

These approaches-to-learning results, particularly those of Zusho et al. (2003), would appear to contradict cognitive psychology studies of memory showing how deeper, more extensive processing consistently leads to substantially better learning, and would imply much smaller, or even opposite, effects on student exam performance from what we observed. This apparent contradiction between approaches-to-learning inventory results and our study can be understood by contrasting our students' perceptions of how to succeed in the courses studied here with their perceptions of how to succeed in their other science courses.

Students verbally surveyed in Course A were specifically asked to articulate advice they would give to other students, both to succeed in Course A and to succeed in other science courses. The contrast in their responses to these two questions indicates that most courses do not have course components aligned. In advice regarding success in Course A, the meeting intervention students nearly always listed the items discussed during the intervention. However, their advice for succeeding in other science courses commonly focused on "figuring out the instructor" and what sort of exam questions he or she might ask, rather than on mastering the course content. For example, we were initially surprised to learn during our intervention meetings that students didn't automatically review the homework, but in the formal verbal surveys, these students indicated that in many of their science courses, they did not see a connection between the exams and homework problems.

Our intervention advice was designed to help students pursue deeper, more metacognitive study strategies, guided by articulated learning goals that are clear to both students and instructors (Simon & Taylor, 2009). Many students have difficulty figuring out what's important to learn, particularly in an unfamiliar domain, and their interpretation of what's important differs from instructors' views (Bonner & Holliday, 2006; Hrepic, Zollman, & Rebello, 2007). We argue that the intervention described here worked largely because, in both these courses, serious effort (as described in Chasteen, Perkins, Beale, Pollock, & Wieman, 2011) was made to establish clear and explicit learning goals and to ensure that all course elements (class time, homework, exams) were well aligned with these goals. Specific study guidance "embedded within the teaching context itself" (Hattie et al., 1996, p. 131) does appear most effective in our cases. The large gains we observed may not be evident in other university course settings.

Recommended conditions for an intervention

Clearly, an intervention must offer advice that will actually help students study more effectively. An intervention in which an instructor recommends targeting the learning goals for study is likely to be effective only if the exam is also aligned with those learning goals and students perceive that alignment. Much of the variation observed in studies examining correlations between student approaches to learning and course performance may be due to variations in the degree of alignment in the different courses studied.

Implications

The most important result from this study is that it demonstrates specific ways that instructors can improve the performance of students at the bottom of the class distribution. From one exam to the next, this intervention moved the lowest-performing students to near or slightly above the class median, without increasing their reported study time. This result is all the more striking because of the quite basic nature of the study advice offered.
This study-skills intervention has features that psychologists note as important for behavioral change, namely a small number of quite specific actions embedded in the context in which they are to be used. Future application of this intervention in other well-aligned courses will test whether our approach is broadly applicable.

Acknowledgments

This work is supported by the University of British Columbia through the Carl Wieman Science Education Initiative. We are pleased to acknowledge the assistance of Grace Wood in carrying out surveys and of the students in our courses who volunteered to be surveyed. Thanks to Ido Roll for statistics advice and to Roger Francois and May Ver for providing comparative data. Comments from Wendy Adams and Sarah Gilbert greatly improved this work. This work does not necessarily represent the views of the Office of Science and Technology Policy or the U.S. Government.

References

Biggs, J. B. (1978). Individual and group differences in study processes. British Journal of Educational Psychology, 48, 266–279.
Biggs, J. B. (1987). Student approaches to learning and studying. Melbourne, Australia: Australian Council for Educational Research.
Bjork, R. A. (1994). Memory and metamemory considerations in the training of human beings. In J. Metcalfe & A. P. Shimamura (Eds.), Metacognition: Knowing about knowing (pp. 185–205). Cambridge, MA: MIT Press.
Bjork, R. A. (2001). How to succeed in college: Learn how to learn. American Psychological Society Observer, 14, 13–25.
Bonner, J., & Holliday, W. (2006). How college science students engage in note-taking strategies. Journal of
Chaplin, S. (2007). A model of student success: Coaching students to develop critical thinking skills in introductory biology courses. International Journal for the Scholarship of Teaching and Learning, 1(2), 1–7.
Chasteen, S. V., Perkins, K. K., Beale, P. D., Pollock, S. J., & Wieman, C. E. (2011). A thoughtful approach to instruction: Course transformation for the rest of us. Journal of College Science Teaching, 40(4), 24–30.
Dunlosky, J., & Rawson, K. A. (2005). Why does rereading improve metacomprehension accuracy? Evaluating the levels-of-disruption hypothesis for the rereading effect. Discourse Processes, 40, 37–55.
Entwistle, N. J. (2000). Approaches to studying and levels of understanding: The influences of teaching and assessment. In J. C. Smart (Ed.), Higher education: Handbook of theory and research (Vol. 15, pp. 156–218). New York, NY: Agathon Press.
Entwistle, N. J., & McCune, V. (2004). The conceptual bases of study strategy inventories. Educational Psychology Review, 16, 325–345.
Entwistle, N. J., & Ramsden, P. (1983). Understanding student learning. London, England: Croom Helm.
Freeman, S., O'Connor, E., Parks, J. W., Cunningham, M., Hurley, D., Haak, D., . . . Wenderoth, M. P. (2007). Prescribed active learning increases
Hrepic, Z., Zollman, D. A., & Rebello, N. S. (2007). Comparing students' and experts' understanding of the content of a lecture. Journal of Science Education and Technology, 16, 213–224.
Jensen, P. A., & Moore, R. (2008). Students' behaviors, grades and perceptions in an introductory biology course. American Biology Teacher, 70, 483–487.
Jensen, P. A., & Moore, R. (2009). What do help sessions accomplish in introductory science courses? Journal of College Science Teaching, 38(5), 60–64.
Jones, A. C. (2008). The effects of out-of-class support on student satisfaction and motivation to learn. Communication Education, 57, 373–388.
Latham, G. P., & Yukl, G. A. (1975). A review of research on the application of goal-setting in organizations. Academy of Management Journal, 18, 824–845.
Mason, A., & Singh, C. (2010). Do advanced physics students learn from their mistakes without explicit intervention? American Journal of Physics, 78, 760–767.
Mazur, E. (1997). Peer instruction: A user's manual. Upper Saddle River, NJ: Prentice Hall.
McCroskey, J. C., Valencic, K. M., & Richmond, V. P. (2004). Toward a general model of instructional communication. Communication Quarterly, 52, 197–210.
Moore, R. (2005). Who does extra-credit work in introductory science courses?
Research in Science Teaching, 43, performance in introductory biology. Journal of College Science Teaching,
786–818. CBE—Life Sciences Education, 6(2), 34(7), 12–15.
Bjork, R. A. (1994). Memory and 132–139. Morgan, M. (1985). Self-monitoring of
metamemory considerations in Hattie, J., Biggs, J., & Purdie, N. (1996). attained subgoals in private study.
the training of human beings. In J. Effects of learning skills interventions Journal of Educational Psychology,
Metcalfe & A. Shimamura (Eds.), on student learning: A meta-analysis. 77, 623–630.
Metacognition: Knowing about know- Review of Educational Research, 66, Pintrich, P. R., Smith, D. A. F., Garcia,
ing (pp. 185–205). Cambridge, MA: 99–136. T., & McKeachie, W. J. (1993).

Vol. 41, No. 6, 2012 87


research and teaching

Reliability and predictive validity of caring with student learning and Zusho, A., Pintrich, P., & Coppola, P.
the Motivated Strategies for Learning teacher evaluation. Communication (2003). Skill and will: The role of
Questionnaire (MSLQ). Educational Education, 46, 1–9. motivation and cognition in the learn-
and Psychological Measurement, 53, Thiede, K. W., & Anderson, M. C. ing of college chemistry. International
801–813. M. (2003). Summarizing can im- Journal of Science Education, 25,
Reder, L. M., & Ritter, F. E. (1992). prove meta-comprehension accu- 1081–1094.
What determines initial feeling of racy. Contemporary Educational
knowing? Familiarity with question Psychology, 28, 129–160. Louis Deslauriers is a science teaching
terms, not with the answer. Journal of Thiede, K. W., Anderson, M. C. M., and learning fellow in the Physics Depart-
Experimental Psychology: Learning, & Therriault, D. (2003). Accuracy ment, Sara E. Harris (sharris@eos.ubc.
Memory, and Cognition, 18, 435–451. of metacognitive monitoring af- ca) is an instructor in the Earth & Ocean
Simon, B., & Taylor, J. (2009). What is fects learning of text. Journal of Sciences Department, Erin Lane is a for-
the value of course-specific learning Educational Psychology, 95, 66–73. mer science teaching and learning fellow
goals? Journal of College Science Wieman, C., & Sudmant, W. (2008). in the Earth & Ocean Sciences Depart-
Teaching, 39(2), 52–57. Quality of UBC students relative to ment, and Carl E. Wieman is director, all
Steers, R. M., & Porter, L. W. (1974). The other BC and major US universities with the Carl Wieman Science Education
role of test-goal attributes in employee (update January 2008 using PISA Initiative at the University of British Co-
performance. Psychological Bulletin, comparisons). Retrieved from http:// lumbia in Vancouver, British Columbia,
81, 434–452. pdf-book-free-download.com/Quality- Canada. Carl E. Wieman is currently on
Teven, J. J., & McCroskey, J. C. (1997). of-UBC-students-relative-to-other-BC- leave and with the Office of Science and
The relationship of perceived teacher and-major-US-Universities.html# Technology Policy.

88
Journal of College Science Teaching

You might also like