BOE Packet
How will students learn physics to obtain a deep understanding of the essentials?
We currently teach physics using Modeling Instruction whenever possible. Frank was trained
in Modeling Instruction at a 2.5-week summer institute at Buffalo State in 2004. Jim
attended 2.5 weeks of summer training at Arizona State in 2005. Modeling Instruction
embraced Tony Wagner's seven essential skills long before he wrote the book! Instead of relying
on lectures and textbooks, Modeling Instruction has students design experiments to
determine physics concepts and relationships themselves. Students must reconcile the
results of their experiments with their own (often naïve) views of how the world works. They
then present their results to the class so we can come to a consensus. This entire process
takes much more time than a traditional lecture-based approach, but the depth of student
understanding and retention is much greater.
[GREEN Attached: Modeling Instruction article]
How will you know students are performing at the same level without the exam?
I am implementing a criterion-based grading system. Learning goals for each topic are
clearly laid out, and students are graded using a 1-4 rubric for each goal. Every homework
assignment, class discussion, quiz, project, etc. can be used as evidence of how well students
are reaching these learning goals. Students are evaluated on eventual mastery; any initial
missteps while exploring the content are not counted against them. This system ensures that
I know (and students know) whether students have the essential skills and knowledge needed
for success in college science.
[WHITE Attached: Sample criterion-based grading sheets and rubrics used by students]
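A minimal sketch of how such a criterion-based gradebook could be computed. All names, scores, and the "average of the latest two scores" mastery rule here are hypothetical illustrations, not the actual rubric policy:

```python
# Hypothetical sketch of a criterion-based gradebook. Each learning goal
# accumulates dated 1-4 rubric scores from homework, quizzes, projects, etc.
# The goal grade reflects eventual mastery, so only the latest evidence
# counts and early missteps do not lower the grade. The "mean of the
# latest two scores" rule is an assumption made for this illustration.

from collections import defaultdict

def goal_grades(evidence):
    """evidence: list of (goal, score) tuples in chronological order.
    Returns the mastery grade per goal: the mean of the latest two
    rubric scores (or the single latest score if only one exists)."""
    by_goal = defaultdict(list)
    for goal, score in evidence:
        by_goal[goal].append(score)
    return {goal: sum(scores[-2:]) / len(scores[-2:])
            for goal, scores in by_goal.items()}

evidence = [
    ("constant velocity", 2),  # early misstep, superseded by later work
    ("constant velocity", 3),
    ("constant velocity", 4),
    ("forces", 3),
]
print(goal_grades(evidence))  # {'constant velocity': 3.5, 'forces': 3.0}
```

The point of the design is that later evidence supersedes earlier evidence, mirroring the memo's claim that initial missteps are not considered.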
Depth Versus Breadth: How Content Coverage in High School Science Courses Relates to Later Success in College Science Coursework
MARC S. SCHWARTZ
University of Texas, Arlington, TX 76019, USA
ROBERT H. TAI
Curriculum, Instruction, and Special Education Department, Curry School of Education,
University of Virginia, Charlottesville, VA 22904, USA
DOI 10.1002/sce.20328
Published online in Wiley InterScience (www.interscience.wiley.com).
ABSTRACT: This study relates the performance of college students in introductory science
courses to the amount of content covered in their high school science courses. The sample
includes 8310 students in introductory biology, chemistry, or physics courses in 55 randomly
chosen U.S. colleges and universities. Students who reported covering at least 1 major topic
in depth, for a month or longer, in high school were found to earn higher grades in college
science than did students who reported no coverage in depth. Students reporting breadth in
their high school course, covering all major topics, did not appear to have any advantage in
chemistry or physics and a significant disadvantage in biology. Care was taken to account
for significant covariates: socioeconomic variables, English and mathematics proficiency,
and rigor of their preparatory high school science course. Alternative operationalizations of depth
and breadth variables result in very similar findings. We conclude that teachers should
use their judgment to reduce coverage in high school science courses and aim for mastery
by extending at least 1 topic in depth over an extended period of time. © 2008 Wiley
DISCUSSION
The baseline model (Table 2) shows the association between breadth and depth of high
school science curricula and first-course introductory-level college science performance.
The most striking analytical characteristic is the consistency of the results across the three
baseline models, revealing the same trends across the disciplines of biology, chemistry,
and physics. In addition, the contrast of positive associations between depth and perfor-
mance and negative associations between breadth and performance is striking, though not
significant in all cases. This replication of results across differing disciplines suggests a
consistency to the findings. However, we caution that the baseline model offers only out-
comes apart from the consideration of additional factors that may have some influence on
these associations. The robustness of these associations was put to the test, as additional
factors were included in an extended model with additional analyses.
The extended model, shown in Table 3, included variables that control for variations
in student achievement as measured by tests scores, high school course level (regular,
honors, AP), and high school grades as well as background variables such as parent’s
educational level, community socioeconomic level, public high school attendance, and
year in college. With the addition of these variables in the extended model, we find that
the magnitude of the coefficients decreased in the majority of instances, but the trends
and their significance remained. Keeping in mind the generalized nature of the analyses
we have undertaken, the most notable feature of these results stems not simply from the
magnitude of the coefficients, but rather from the trends and their significance. Here, the
positive associations between depth of study and performance remain significant whereas
the negative association between breadth of study and performance retains its significance
in the biology analysis. Although this association is not significant in the chemistry and
physics analyses, the negative trend is consistent with our result for biology.
4. A significant interaction (p = .03) was found between the math SAT score and depth. In addition, a
second significant interaction (p = .02) was found between breadth and most advanced mathematics grade.
Apart from these two significant interactions, all others were nonsignificant.
Science Education
22 SCHWARTZ ET AL.
What these outcomes reveal is that although the choice to pursue depth of study has
a significant and positive association with performance, the choice to pursue breadth of
study appears to have implications as well. These appear to be that students whose teachers
choose broad coverage of content, on the average, experience no benefit. In the extended
model, we arrive at these results while accounting for important differences in students’
backgrounds and academic performance, which attests to the robustness of these findings.
The findings run counter to philosophical positions that favor breadth or those that advocate
a balance between depth and breadth (Murtagh, 2001; Wright, 2000).
Finally, a particularly surprising and intriguing finding from our analysis indicated that
the depth and breadth variables as defined in this analysis appear to be uncorrelated;
additionally, the interactions between both variables are not significant in either model.
Figure 6 offers an analysis of the outcomes treating breadth and depth as independent
characteristics of high school science curriculum. The first panel reveals that over 40%
of the students in our survey fall within the “depth present–breadth absent” grouping,
whereas the other three varied between 14% and 23%. The distribution across these four
groupings shows that the teachers of students in our survey are making choices about which
topics to leave out and which to emphasize. These choices have consequences. The second
panel indicates that those students reporting high school science experiences associated
with the group “depth present–breadth absent” have an advantage equal to two thirds of a
year of instruction over their peers who had the opposite high school experience (“Depth
Absent–Breadth Present”). This outcome is particularly important given that high school
science teachers are often faced with choices that require balancing available class time
and course content. It appears that even teachers who attempt to follow the “best of both
worlds” approach by mixing depth and breadth of content coverage do not, on average,
provide an advantage to their students who continue to study science in college over their
peers whose teachers focused on depth of content coverage.
Rather than relying solely on one definition of depth and breadth, we examined alternative
operationalizations of the breadth and depth variables and replicated the analysis. The
findings remained robust when these alternative operationalizations were applied.
However, a limitation in our definition of depth and breadth stems from the smallest unit
of time we used in the study to define depth of study, which was on the order of weeks.
Our analysis cannot discern a “grain size” smaller than weeks. Teachers may opt to study
a single concept in a class period or many within that period. Likewise, teachers might
decide that a single laboratory experiment should go through several iterations in as many
days, while others will “cover” three different, unrelated laboratories in the same time span.
These differences are beyond the capacity of our analysis to resolve. In the end, our work
is but one approach to the analysis of the concepts of depth and breadth generated from
students’ self-reports of their high school science classroom experiences.
students can revisit and rethink their ideas. This change can often be painstakingly slow,
taking much longer than many teachers allow in the rapid conveyance of content that is
the hallmark of a curriculum that focuses on broad coverage. In the view of Eylon and
Linn (1988), “in-depth coverage can elaborate incomplete ideas, provide enough cues to
encourage selection of different views of a phenomenon, or establish a well-understood
alternative” (p. 263). Focused class time is required for teachers to probe for the levels of
understanding attained by students and for students to elucidate their preconceptions. It
takes focused time for students to test their ideas and find them wanting, motivating them to
reconstruct their knowledge. Eylon and Linn note, “Furthermore, students may hold onto
these ideas because they are well established and reasonably effective, not because they are
concrete. For example, naı̈ve notions of mechanics are generally appropriate for dealing
with a friction-filled world. In addition, students appear quite able to think abstractly as
long as they have appropriate science topic knowledge, even if they are young, and even
if their ideas are incomplete. Thus, deep coverage of a topic may elicit abstract reasoning”
(p. 290).
The difficulty of promoting conceptual growth is well documented in neo-Piagetian
models (Case, 1998; Fischer & Bidell, 2006; Parziale & Fischer, 1998; Schwartz & Sadler,
2007; Siegler, 1998). These researchers argue that more complex representations, such as
those prized in the sciences, require multiple opportunities for construction in addition to
multiple experiences in which those ideas and observations may be coordinated into richer,
more complex understandings of their world. This process is necessarily intensive and time
consuming because the growth of understanding is a nonlinear process that is sensitive to
changes in context, emotions, and degree of practice. The experience of developing richer
understandings is similar to learning to juggle (Schwartz & Fischer, 2004). Learning to
juggle three balls at home does not guarantee you can juggle the same three balls in front of
an audience. Alternatively, learning to juggle three balls does not mean that you can juggle
three raw eggs. Changes in context and variables lead to differences in the relationships
that students have with ideas (both new and old) and their ability to maintain them over
time (Fischer & Pipp, 1984; Fischer, Bullock, Rotenberg, & Raya, 1993). The process
of integrating many variables that scientists consider in scientific models is a foreign and
difficult assignment for anyone outside the immediate field of research.
As just noted, the realization in the cognitive sciences that the learning process is not
linear is an important pedagogical insight. Because learning is a dynamic operation that
depends on numerous variables, leaving a topic too soon deprives students and teachers
the time to experience situations that allow students to confront personal understandings
and connections and to evaluate the usefulness of scientific models within their personal
models.
their pedagogy on ensuring that students can recite these “facts.” Clearly, high-stakes ex-
aminations that require recall of unrelated bits of scientific knowledge in the form of facts
and isolated constructs will increase the likelihood that teachers will adjust their teaching
methodologies to address these objectives. The more often discrete facts appear on high-
stakes examinations, the more often we imagine that teachers will feel pressured to focus
on breadth of knowledge. Conversely, we feel that the adoption of a different approach in
high-stakes testing that is less focused on the recall of wide-ranging facts but is, instead,
focused on a few widely accepted key conceptual understandings will result in a shift of
pedagogy. In such situations, we envision teachers’ instructional practice and curricula
designed to offer students greater opportunity to explore the various contexts from which
conceptual understandings emerge. We suspect that the additional focused time will allow
students to recognize (and teachers to challenge) students’ naı̈ve conceptions of nature.
These concerns fall in line with Anderson’s (2004, p. 1) three conclusions about
standards-based reform in the United States:
• The reform agenda is more ambitious than our current resources and infrastructure
will support.
• The standards advocate strategies that may not reduce achievement gaps among
different groups of students.
• There are too many standards, more than students can learn with understanding in
the time we have to teach science.
Anderson’s (2004) final conclusion strongly resonates with that of the NRC (2007)
and more specifically with findings, at the international level, by Schmidt et al. (1997,
2005). In a comparison of 46 countries, Schmidt et al. (2005) noted that in top-achieving
countries, the science frameworks cover far fewer topics than in the United States, and
that students from these countries perform significantly better than students in the United
States. They conclude that U.S. standards are not likely to create a framework that develops
and supports understanding or “coherence,” a strategy that encourages the development of
a deeper understanding of the structure of the discipline. By international standards, the
U.S. science framework is “unfocused, repetitive, and undemanding” (p. 532).
From the perspective of our study, both Schmidt et al. (2005) and Anderson’s (2004)
conclusions are particularly relevant. Clearly, increasing the quantity of information re-
quired for examination preparation will lead to an increased focus on broader coverage
of course content, an approach that our findings indicate is negatively (or certainly not
positively) associated with performance in subsequent science courses. If teachers know
that they and their students are accountable for more material, then the pressure to focus
on breadth would seem like the natural pedagogical choice to make. Hence, teachers must
decide whether they choose to maximize students’ test scores or maximize preparation
for success in future study. Although they have considerable feedback concerning the test
scores of their students (from state tests, SATs, ACTs, and AP examinations), they have
almost no knowledge of how the bulk of their students do in college science, save the few
who keep in touch. Moreover, the rare college student who reports back to their high school
science teacher is often the most successful and simply reinforces a teacher’s confidence in
his or her current methods.
Caveats
There are several concerns we wish to clearly elucidate. A potential criticism of this
study stems from a central element of this analysis, the length of time spent on a topic
and how this variable might be influenced by the ability level of students to grasp the
material. One might imagine classrooms containing many struggling students actually
spending more time on particular topics and classes with generally high achieving students
requiring less time to grasp the concepts and quickly moving on to other topics. In this
scenario, depth of content coverage would be an indicator for remedial assistance by the
teacher at the classroom level. However, we assume that students in our sample—those
who went on to take introductory science courses—are typically well above average in
their science abilities. Thus we would expect our depth measure, if it really signified a
remedial scenario, to have a negative impact, if any, on student performance in introductory
college science courses. As it was, we observed the opposite (i.e., depth was associated
with higher performance, even when controlled for student achievement in high school).
This argument suggests that the remedial depth scenario is unlikely.
Another issue in this analysis is the degree to which we might expect other variables in
the FICSS survey to correlate with depth and breadth as defined. We identified one such
item in our survey: “How would you best describe learning the material required in your
[high school science] course?” The respondents were provided with a 5-point rating scale
ranging from “A lot of memorization of facts” to “A full understanding of topics.” This
question is constructed in a manner that clearly alludes (at one end of the scale) to a type
of memorization that presumes a cursory or superficial level of understanding. This was
certainly our intention when we wrote this question. However, Ramsden (2003) notes that
the act of memorization may help learners deconstruct the structure and connections in the
body of information they are attempting to commit to memory. Thus, is memorization a
feature of both depth and breadth in learning? On the basis of this more carefully considered
characterization of memorization and taking into account the tone and clear intention of
this questionnaire item, we hypothesized that the responses to this item would have a
weak positive correlation, if any, with depth, and a weak negative correlation, if any, with
breadth. In the analysis, we found the correlation with depth was r = .15 (very weak) and
the correlation with breadth was r = −.09, a result commonly considered “not correlated.”
The results in this case suggest that memorization (as a new variable) is orthogonal to
our definition of breadth and depth and is not subsumed by depth or breadth. However, it
remains to be seen how well our definitions hold up as new variables are identified.
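The orthogonality check described above rests on computing a Pearson correlation and finding it near zero. As an illustration of that computation, here is a self-contained sketch; the ratings below are invented for the example and are not the FICSS data (the study's actual results were r = .15 with depth and r = -.09 with breadth):

```python
# Illustrative sketch of a Pearson correlation check between two
# student-report variables. A coefficient near zero suggests the
# variables measure independent (orthogonal) constructs. The data
# here are made up for illustration only.

import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical ratings: memorization item (1-5 scale) vs. a binary
# depth indicator for the same eight students.
memorization = [1, 2, 2, 3, 4, 4, 5, 3]
depth        = [1, 0, 1, 1, 0, 1, 1, 0]
r = pearson_r(memorization, depth)
print(round(r, 2))
```

A correlation with small magnitude (such as the study's r = .15 and r = -.09) is conventionally read as "not correlated," which is what licenses treating memorization as a separate variable from depth and breadth.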
CONCLUSION
The baseline model reveals a direct and compelling outcome: teaching for depth is
associated with improvements in later performance. Of course, there is much to consider
in evaluating the implications of such an analysis. There are a number of questions about
this simple conclusion that naturally emerge. For example, how much depth works best?
What is the optimal manner to operationalize the impact of depth-based learning? Do
specific contexts (such as type of student, teacher, or school) moderate the impact of depth?
The answers to these questions certainly suggest that a more nuanced view should be
sought. Nonetheless, this analysis appears to indicate that a robust positive association
exists between high school science teaching that provides depth in at least one topic and
better performances in introductory postsecondary science courses.
Our results also clearly suggest that breadth-based learning, as commonly applied in
high school classrooms, does not appear to offer students any advantage when they enroll
in introductory college science courses, although it may contribute to higher scores on
standardized tests. However, the intuitive appeal of broadly surveying a discipline in an
introductory high school course cannot be overlooked. There might be benefits to such a
pedagogy that become apparent when using measures that we did not explore. The results
regarding breadth were less compelling because in only one of the three disciplines were
the results significant in our full model. On the other hand, we observed no positive effects
at all. As it stands, our findings at least suggest that aiming for breadth in content coverage
should be avoided, as we found no evidence to support such an approach.
The authors thank the people who made this large research project possible: Janice M. Earle, Finbarr
C. Sloane, and Larry E. Suter of the National Science Foundation for their insight and support; James
H. Wandersee, Joel J. Mintzes, Lillian C. McDermott, Eric Mazur, Dudley R. Herschbach, Brian
Alters, and Jason Wiles of the FICSS Advisory Board for their guidance; and Nancy Cianchetta,
Susan Matthews, Dan Record, and Tim Reed of our High School Advisory Board for their time
and wisdom. This research has resulted from the tireless efforts of many on our research team:
Michael Filisky, Hal Coyle, Cynthia Crockett, Bruce Ward, Judith Peritz, Annette Trenga, Freeman
Deutsch, Nancy Cook, Zahra Hazari, and Jamie Miller. Matthew H. Schneps, Nancy Finkelstein,
Alex Griswold, Tobias McElheny, Yael Bowman, and Alexia Prichard of our Science Media Group
constructed our dissemination website (www.ficss.org). We also appreciate advice and interest
from several colleagues in the field: Michael Neuschatz of the American Institute of Physics, William
Lichten of Yale University, Trevor Packer of the College Board, Saul Geiser of the University of
California, Paul Hickman of Northeastern University, William Fitzsimmons, Marlyn McGrath Lewis,
Georgene Herschbach, and Rory Browne of Harvard University, and Kristen Klopfenstein of Texas
Christian University. We are indebted to the professors at universities and colleges nationwide who
felt this project was worth a piece of their valuable class time to administer our surveys, and to
their students for their willingness to answer our questions. Any opinions, findings, and conclusions or
recommendations expressed in this material are those of the authors and do not necessarily reflect
the views of the National Science Foundation, the U.S. Department of Education, or the National
Institutes of Health.
REFERENCES
American Association for the Advancement of Science. (1989). Project 2061: Science for all Americans.
Washington, DC: Author.
American Association for the Advancement of Science. (1993). Benchmarks for science literacy. New York:
Oxford University Press.
Anaya, G. (1999). Accuracy of self-reported test scores. College and University, 75(2), 13–19.
Anderson, C. W. (2004). Science education research, environmental literacy, and our collective future. National
Association for Research in Science Teaching. NARST News, 47(2), 1–5.
Anderson, R. D. (1995). Curriculum reform. Phi Delta Kappan, 77(1), 33–36.
Baird, L. (1976). Using self-reports to predict student performance. Research Monograph No. 7. New York:
College Entrance Examination Board.
College Board Standards for College Success Page 1 of 2
The College Board Standards for College Success™ (CBSCS):
- provide a model set of comprehensive standards for middle school and high school courses that lead to college and workplace readiness;
- reflect 21st-century skills such as problem solving, critical and creative thinking, collaboration, and media and technological literacy;
- articulate clear standards and objectives with supporting, in-depth performance expectations to guide instruction and curriculum development;
- provide teachers, districts, and states with tools for increasing the rigor and alignment of courses across grades 6-12 to college and workplace readiness; and
- assist teachers in designing lessons and classroom assessments.
The standards advisory committees relied on college-readiness evidence gathered from a wide array of sources to
design and develop the CBSCS. These sources include national and international frameworks such as National
Assessment of Educational Progress (NAEP), Programme for International Student Assessment (PISA), and Trends
in International Mathematics and Science Study (TIMSS); results of surveys and course content analyses from
college faculty regarding what is most important for college readiness; assessment frameworks from relevant AP
exams, the SAT®, the PSAT/NMSQT®, and College Level Examination Program® (CLEP®) exams, and selected
university placement programs.
Beginning with the end goal in mind, the committees first defined the academic demands students will face in AP or
first-year college courses in English, mathematics and statistics, and science. After identifying these demands, the
committees then backmapped to the start of middle school to outline a vertical progression, or road map, of critical
thinking skills and knowledge students need to be prepared for college-level work.
http://professionals.collegeboard.com/portal/site/Professionals/menuitem.b6b1a9bc0c561... 10/21/2009
The College Board also uses the CBSCS to align our own curriculum and assessment programs, including
SpringBoard®, to college readiness.
Learn More
Please contact Standards and Curriculum Alignment Services at Standards_Requests@collegeboard.org or your
regional College Board representative for more information on the CBSCS and our alignment services.
Science (.pdf/4.4M)
English Language Arts (.pdf/1.9M)
Mathematics & Statistics (.pdf/535K)
Mathematics & Statistics Adapted for Integrated Curricula (.pdf/368K)
Mathematics & Statistics Three-Year Alternative for Middle School (.pdf/121K)
Jane Jackson, Larry Dukerich, David Hestenes
Introduction

Modeling Instruction is an evolving, research-based program for high school science education reform that was supported by the National Science Foundation (NSF) from 1989 to 2005. The name Modeling Instruction expresses an emphasis on the construction and application of conceptual models of physical phenomena as a central aspect of learning and doing science (Hestenes, 1987; Wells et al., 1995; Hestenes, 1997). Both the National Science Education Standards (NRC, 1996) and the National Council of Teachers of Mathematics Standards (NCTM), as well as Benchmarks for Science Literacy (AAAS, 1993), recommend "models and modeling" as a unifying theme for science and mathematics education. To our knowledge, no other program has implemented this theme so thoroughly.

From 1995 to 1999, 200 high school physics teachers participated in two four-week Leadership Modeling Workshops with NSF support. Since that time, 2500 additional teachers from 48 states and a few other nations have taken summer Modeling Workshops at universities in many states, supported largely by state funds. Participants include teachers from public and private schools in urban, suburban, and rural areas. Modeling Workshops at Arizona State University (ASU) are the cornerstone of a graduate program for teachers of the physical sciences. Recently, Modeling has expanded to embrace the entire middle/high school physical science curriculum. The program has an extensive web site at http://modeling.asu.edu.

Product: Students Who Can Think

Modeling Instruction meets or exceeds NSES teaching standards, professional development standards, assessment standards, and content and inquiry standards.

Modeling Instruction produces students who engage intelligently in public discourse and debate about matters of scientific and technical concern. Betsy Barnard, a modeler in Madison, Wisconsin, noticed a significant change in this area after Modeling was implemented in her school: "I teach a course in biotechnology, mostly to seniors, nearly all of whom had physics the previous year. When asked to formally present information to the class about controversial topics such as cloning or genetically modified organisms, it is delightfully clear how much more articulate and confident they are."

Students in modeling classrooms experience first-hand the richness and excitement of learning about the natural world. One example comes from Phoenix modeler Robert McDowell. He wrote that, under traditional instruction, "when asked a question about some science application in a movie, I might get a few students who would cite 1-2 errors, but usually with uncertainty. Since I started Modeling, the students now bring up their own topics … not just from movies, but their everyday experiences." One of his students wrote, "Mr. McDowell, I was at a Diamondback baseball game recently, and all I could think of was all the physics problems involved." A former student of another modeler, Gail Seemueller of Rhode Island, described it as follows: "She wanted us to truly LEARN and more importantly UNDERSTAND the material. I was engaged. We did many hands-on experiments of which I can still vividly remember, three years later."

10 Science Educator
Kelli Gamez Warble of rural Arizona, who has taught physics and calculus for a decade using Modeling Instruction, has had numerous students whose career choices were influenced by Modeling Instruction. She wrote about discovering several former students when visiting an ASU Engineering Day, and all but one were females. She wrote, "As a former female engineering student myself, I was gratified but not surprised. Modeling encourages cooperation and discourse about complicated ideas in a non-threatening, supportive environment. Females who view science, engineering, and technology as fields encouraging cooperation and supportiveness will, I believe, become much more attracted to these non-traditional areas."

Excellence

Many modeling teachers have been recognized nationally. For example, three users of Modeling Instruction have received the National Science Teachers Association (NSTA) Shell Science Teaching Award. Numerous modelers have received the Presidential Award for Excellence in Math and Science Teaching (PAEMST). At Modeling Workshops and related graduate courses for science teachers at ASU, as well as at Modeling Workshops across the United States, teachers learn to impact students of various backgrounds and learning styles. Modeling Workshops and classrooms are thriving centers of interactive engagement.

The Essence of Modeling Instruction

The Modeling method of instruction corrects many weaknesses of the traditional lecture-demonstration method, including the fragmentation of knowledge, student passivity, and the persistence of naïve beliefs about the physical world. From its inception, the Modeling Instruction program has been concerned with reforming high school physics teaching to make it more coherent and student-centered, and to incorporate the computer as an essential scientific tool.

In a series of intensive workshops over two years, high school teachers learn to be leaders in science teaching reform and technology infusion in their schools. They are equipped with a robust teaching methodology for developing student abilities to make sense of physical experience, to understand scientific claims, to articulate coherent opinions of their own and defend them with cogent arguments, and to evaluate evidence in support of justified belief.

More specifically, teachers learn to ground their teaching in a well-defined pedagogical framework (modeling theory; Hestenes, 1987), rather than following rules of thumb; to organize course content around scientific models as coherent units of structured knowledge; to engage students collaboratively in making and using models to describe, explain, predict, design, and control physical phenomena; to involve students in using computers as scientific tools for collecting, organizing, analyzing, visualizing, and modeling real data; to assess student understanding in more meaningful ways and experiment with more authentic means of assessment; to continuously improve and update instruction with new software, curriculum materials, and insights from educational research; and to work collaboratively in action research teams to mutually improve their teaching practice. Altogether, Modeling Workshops provide detailed implementation of the National Science Education Standards.

The modeling cycle, student conceptions, discourse

Instruction is organized into modeling cycles rather than traditional content units. This promotes an integrated understanding of modeling processes and the acquisition of coordinated modeling skills. The two main stages of this process are model development and model deployment.

The first stage, model development, typically begins with a demonstration and class discussion. This establishes a common understanding of a question to be asked of nature. Then, in small groups, students collaborate in planning and conducting experiments to answer or clarify the question. Students present and justify their conclusions in oral and written form, including the formulation of a model for the phenomena in question and an evaluation of the model by comparison with data. Technical terms and representational tools are introduced by the teacher as they are needed to sharpen models, facilitate modeling activities, and improve the quality of discourse. The teacher is prepared with a definite agenda for student progress and guides student inquiry and discussion in that direction with "Socratic" questioning
12 Science Educator
That models and modeling should be central to an inquiry-based approach to the study of physics is no surprise. The NSES state, “Student inquiries should culminate in formulating an explanation or model… In the process of answering the questions, the students should engage in discussions and arguments that result in the revision of their explanations.”

Traditional instruction often overlooks the crucial influence of students’ personal beliefs on what they learn. Force Concept Inventory (FCI) data (Hestenes et al., 1992) show that students are not easily induced to discard their misconceptions in favor of Newtonian concepts. Some educational researchers have expended considerable effort in designing and testing teaching methods to deal with specific misconceptions. Although their outcomes have been decidedly better than the traditional ones, success has been limited. Many have concluded that student beliefs are so “deep-seated” that heavy instructional costs to unseat them are unavoidable. However, documented success with the Modeling method suggests that an indirect treatment of misconceptions is likely to be most efficient and effective (Wells et al., 1995).

Cognitive scientists have identified metaphors as a fundamental tool of human thought; we use metaphors so frequently and automatically that we seldom notice them unless they are called to our attention. Metaphors are used to structure our experience and thereby make it meaningful. A major objective of teaching should therefore be to help students “straighten out” their metaphors. In Modeling, instead of designing the course to address specific “naïve conceptions,” the instructor focuses on helping students construct appropriate models to account for the phenomena they study. When students learn to correctly identify a physical system, represent it diagrammatically, and then apply the model to the situation they are studying, their misconceptions tend to fall away (Wells et al., 1995).

A key component of this approach is that it moves the teacher from the role of authority figure who provides the knowledge to that of a coach/facilitator who helps the students construct their own understanding. Since students systematically misunderstand most of what we tell them (because what they hear is filtered through their existing mental structures), the emphasis is placed on student articulation of the concepts.

The Modeling method stresses developing a sound conceptual understanding through graphical and diagrammatic representations before moving on to an algebraic treatment of problem solving.

Laboratory experiences are centered on experiments that isolate one concept, with equipment that enables students to generate good data reliably. Students are given no pre-printed list of instructions for doing these experiments. Rather, the instructor introduces the class to the physical system to be investigated and engages students in describing the system until a consensus is achieved. The instructor elicits from students the appropriate dependent and independent variables to characterize the system. After obtaining reasoned defenses from the students for selection of these variables, the instructor asks the students to help design the experiment. A general procedure is negotiated so that students have a sense of why they are doing a particular procedure. Frequently, multiple procedures arise as students have access to equipment that allows them to explore the relationship in different ways. Lab teams are then allowed to begin collecting data.

Students have to make sense of the experiment themselves. The instructor must be prepared to allow them to fail. The apparatus should be available for several days, should they need it. Students use spreadsheet and graphing software to help them organize and analyze their data. After allowing time to prepare whiteboards to summarize their findings, the instructor selects certain groups to present an oral account of the group’s experimental procedure and interpretation. Students use multiple representations to present their findings, including concise English sentences, graphs, diagrams, and algebraic expressions.

After some initial discomfort with making these oral presentations, students generally become more at ease and gradually develop the skills of making an effective presentation. They learn how to defend their views clearly and concisely. These …
The National Science Education Standards (NSES) emphasize that “coherent and integrated programs” supporting “lifelong professional development” of science teachers are essential for significant reform. They state that “The conventional view of professional development for teachers needs to shift from technical training for specific skills to opportunities for intellectual professional growth.” The MNS program at ASU is designed to meet that need.

Evidence of Effectiveness

Evaluation of Physics Instruction. The Force Concept Inventory (FCI) was developed to compare the effectiveness of alternative methods of physics instruction (Halloun et al., 1985a; Hestenes et al., 1992). It has become the most widely used and influential instrument for assessing the effectiveness of introductory physics instruction, and has been cited as producing the most convincing hard evidence of the need to reform traditional physics instruction (Hake, 1998).

The FCI assesses students’ conceptual understanding of the force concept, the key concept in mechanics. It consists of 30 multiple-choice questions, but there is one crucial difference between the FCI questions and traditional multiple-choice items: distracters are designed to elicit misconceptions known from the research base. A student must have a clear understanding of one of six fundamental aspects of the Newtonian force concept in order to select the correct response. The FCI reveals misconceptions that students bring as prior knowledge to a class, and it measures the conceptual gains of a class as a whole. The FCI is research-grounded, normed with thousands of students at diverse institutions. It is the product of many hours of interviews that validated distracters, and it has been subjected to intense peer review.

Including the survey by Hake (1998), we have FCI data on some 30,000 students of 1000 physics teachers in high schools, colleges and universities throughout the world. This large data base presents a highly consistent picture, showing that the FCI provides statistically reliable measures of student concept understanding in mechanics. Results strongly support the following general conclusions:

• Before physics instruction, students hold naive beliefs about motion and force that are incompatible with Newtonian concepts.
• Such beliefs are a major determinant of student performance in introductory physics.
• Traditional (lecture-demonstration) physics instruction induces only a small change in student beliefs. This result is largely independent of the instructor’s knowledge, experience and teaching style.
• Much greater changes in student beliefs can be induced with instructional methods derived from educational research in, for example, cognition, alternative conceptions, classroom discourse, and cooperative learning.

These conclusions can be quantified. The FCI is best used as a pre/post diagnostic. One way to quantify pre/post gains is to calculate the normalized gain (Hake, 1998). This is the actual gain (in percentage) divided by the total possible gain (also in percentage). Thus, normalized gain = (%post - %pre)/(100 - %pre). Hence, the normalized gain can range from zero (no gain) to 1 (greatest possible gain). This method of calculating the gains normalizes the index, so that gains of courses at different levels can be compared, even if their pretest scores differ widely.

From pre/post course FCI scores of 14 traditional courses in high schools and colleges, Hake found a mean normalized gain of 23%. In contrast, for 48 courses using interactive-engagement teaching methods (minds-on always, and hands-on usually), he found a mean normalized gain of 48%. The difference is much greater than a standard deviation, a highly significant result. This would indicate that traditional instruction fails badly, and moreover, that this failure cannot be attributed to inadequacies of the students, because some alternative methods of instruction can do much better. A challenge in physics education research for more than a decade has been to identify essential conditions for learning Newtonian physics and thereby devise more effective teaching methods. Modeling Instruction resulted from pursuing this challenge. Results from using the FCI to assess Modeling Instruction are given below.
Figure 1. Average FCI scores, pre-test vs. post-test: Traditional instruction, 26% pre / 42% post; Novice Modelers, 26% pre / 52% post; Expert Modelers, 29% pre / 69% post.

The average FCI pretest score was about 26%, slightly above the random-guessing level of 20%. Figure 1 shows that traditional high school instruction (lecture, demonstration, and standard laboratory activities) has little impact on student beliefs, with an average FCI posttest score of 42%, well below the 60% score which, for empirical reasons, can be regarded as a threshold in understanding Newtonian mechanics. This corresponds to a normalized gain of 22%, in agreement with Hake’s results.

After their first year of teaching with the Modeling method, posttest scores for 3394 students of 66 novice modelers were about 10 percentage points higher, as shown in Fig. 1. Students of expert modelers do much better. For 11 teachers identified as expert modelers after two years in the program, posttest scores for 647 students averaged 69%. This corresponds to a normalized gain of 56%, considerably more than double the gain under traditional instruction. After two years in the program, gains for under-prepared teachers were comparable to gains in one year for well-prepared teachers. Subsequent data have confirmed all these results for 20,000 students (Hestenes, 2000). Thus, student gains in understanding … participating Arizona teachers do not have a degree in physics. Teachers who implement the Modeling method most fully have the highest student posttest FCI mean scores and gains.

External evaluation of Modeling Instruction. The Modeling Instruction Program was assessed by Prof. Frances Lawrenz, an independent external evaluator for the National Science Foundation. We quote at length from the four-page report on her July 22, 1998 site visit, because it describes the character of the workshops very well.

“This site visit was one of several over the past four years to sites in the modeling project … I had the opportunity to observe the participants working in their concept groups and presenting to the full group. I also interviewed most of the participants, either as part of a small group or individually, and interviewed two coordinators of the workshop … The workshop appears … discussions were very rich.”

“The interviews confirmed my observations about the nature of the group. All the participants were articulate about physics instruction and the modeling approach. The participants reported being pleased with everything that was happening at the workshop and spoke in glowing terms about the facilitators. The participants felt that the workshop was well run and that the facilitators were extremely knowledgeable about how best to teach physics. They also commented that the group itself was an excellent resource. They all could help each other.”

“I have almost never seen such overwhelming and consistent support for a teaching approach. It is especially surprising within the physics community, which is known for its critical analysis and slow acceptance of innovation. In short, the modeling approach presented by the project is sound and deserves to be spread nationally.”

Recognition by U.S. Department of Education. In September 2000, the U.S. Department of Education announced that the Modeling Instruction Program at Arizona State University is one of seven K-12 educational technology programs designated as exemplary or promising, out of 134 programs submitted to the
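As a quick arithmetic check (my illustration, not part of the article), the gains quoted above follow directly from the pre/post class averages via Hake's formula, normalized gain = (%post - %pre)/(100 - %pre):

```python
def normalized_gain(pre_pct: float, post_pct: float) -> float:
    """Hake's normalized gain: actual gain over maximum possible gain (0 to 1)."""
    return (post_pct - pre_pct) / (100 - pre_pct)

# Class-average FCI scores (pretest %, posttest %) as reported for Figure 1.
groups = {
    "traditional": (26, 42),      # article reports a normalized gain of 22%
    "novice modelers": (26, 52),
    "expert modelers": (29, 69),  # article reports a normalized gain of 56%
}

for name, (pre, post) in groups.items():
    print(f"{name}: {normalized_gain(pre, post):.0%}")
```

The traditional and expert-modeler rows reproduce the 22% and 56% values in the text; the novice-modeler gain (about 35%) is not stated in the article but is consistent with the reported 10-point posttest improvement. Because the index divides by the room left for improvement, courses with very different pretest averages can be compared on a common scale.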
agency’s Expert Panel. Selections were based on the following criteria: (1) Quality of Program, (2) Educational Significance, (3) Evidence of Effectiveness, and (4) Usefulness to Others. In January 2001, a U.S. Department of Education Expert Panel in Science recognized the Modeling Instruction Program as one of only two exemplary K-12 science programs out of 27 programs evaluated (U.S. Department of Education, 2001).

Long-term implementation. In a follow-up survey of Leadership Modeling Workshop graduates, between one and three years after they had completed the program, 75% of them responded immediately and enthusiastically. More than 90% reported that the Workshops had a highly significant influence on the way they teach. 45% reported that their use of Modeling Instruction has continued at the same level, while another 50% reported an increase (Hestenes, 2000).

Final Thoughts

Instead of relying on lectures and textbooks, the Modeling Instruction program emphasizes active student construction of conceptual and mathematical models in an interactive learning community. Students are engaged with simple scenarios to learn to model the physical world. Modeling cultivates physics teachers as school experts on the use of technology in science teaching, and encourages teacher-to-teacher training in science teaching methods, thereby providing schools and school districts with a valuable resource for broader reform.

The Modeling method has a proven track record of improving student learning. Data on some 20,000 students show that those who have been through the Modeling program typically achieve twice the gains on a standard test of conceptual understanding as students who are taught conventionally. Further, the Modeling method is successful with students who have not traditionally done well in physics. Experienced modelers report increased enrollments in physics classes, parental satisfaction, and enhanced achievement in college courses across the curriculum.

Carmela Minaya of Honolulu, NSTA Shell Science Teaching Awardee, wrote: “The beauty of Modeling Instruction is that it creates an effective framework for teachers to incorporate many of the national standards into their teaching without having to consciously do so, because the method innately addresses many of the various standards. In a certain course, different students may learn the same material, except they never learn it in exactly the same way. This method appeals to all learning styles. Students cling to whatever works for them. Although the content is identical yearly, no two students ever are. The classroom becomes a dynamic center for student owned learning.”

References

[All references by David Hestenes are online in pdf format at http://modeling.asu.edu/R&E/Research.html]

Hake, R. (1998). Interactive-engagement vs. traditional methods: A six thousand-student survey of mechanics test data for introductory physics courses. American Journal of Physics 66, 64-74.

Halloun, I., & Hestenes, D. (1985a). Initial knowledge state of college physics students. American Journal of Physics 53, 1043-1055.

Halloun, I., & Hestenes, D. (1985b). Common sense concepts about motion. American Journal of Physics 53, 1056-1065.

Hestenes, D. (1987). Toward a modeling theory of physics instruction. American Journal of Physics 55, 440-454.

Hestenes, D., Wells, M., & Swackhamer, G. (1992). Force concept inventory. The Physics Teacher 30, 141-158.

Hestenes, D. (1997). Modeling methodology for physics teachers. In E. Redish & J. Rigden (Eds.), The changing role of the physics department in modern universities. American Institute of Physics. Part II, 935-957.

Hestenes, D. (2000). Findings of the modeling workshop project, 1994-2000. One section of an NSF final report. Online in pdf format at http://modeling.asu.edu/R&E/Research.html.

Wells, M., Hestenes, D., & Swackhamer, G. (1995). A modeling method for high school physics instruction. American Journal of Physics 63, 606-619.

Jane Jackson is co-director, Modeling Instruction Program, Department of Physics, Arizona State University, PO Box 871504, Tempe, AZ 85287. Correspondence pertaining to this article may be sent to Jane.Jackson@asu.edu.

Larry Dukerich is director of Professional Development in Science, Center for Research in Education in Science, Mathematics, Engineering, and Technology (CRESMET), Arizona State University, Tempe, AZ.

David Hestenes is distinguished research professor, Department of Physics, Arizona State University, Tempe, AZ.
Students become sports broadcasters for PBS. They must compose a voice-over dub
for a sports video of their choice. Their video must convey the excitement of the
sport and the physics principles observed. To gain knowledge and understanding of
physics principles necessary to meet this challenge, students work collaboratively on
activities in which they apply concepts of Newton's laws, forces, friction, and
momentum to sporting events.
Students are challenged to design or build a safety device, or system, for protecting
automobile, airplane, bicycle, motorcycle, or train passengers. New laws, increased
awareness, and improved safety systems are explored as students work on this
challenge. They are also encouraged to design improvements to existing systems
and to find ways to minimize harm caused by accidents. To meet this challenge,
students engage in collaborative activities that explore motions and forces and the
principles of design technology.
Students are presented with a “myth” or story about some physical situation (for
example, “The winner of a tug-of-war is the strongest team.”). They will work in
teams to design, build, run, and analyze experiments that will test the physical ideas
central to the myth. Students will then extend their experiments, exploring the
physical limits of the experiment and applying it to real-world situations. Teams will
then present their findings to the class (and to the world!) in the form of a
“Mythbusters” video and make conclusions as to whether the situation described by
the myth is physically plausible or even possible.
Students will design and build their own musical instrument from available household
items. Students must analyze the tonal quality for the notes played and make a
comparison to the tonal quality of several musical instruments. To gain
understanding of science principles necessary to meet this challenge, students work
collaboratively on activities to learn about wave motion, standing waves, sound
waves, resonance, timbre, and hearing. They also learn to use the iterative process
of engineering design, refining designs based on the physics they learn.
Dear Physics Students and Parents,
During this year, you will be learning information related to many topics in physics. In this course,
a non-traditional criterion-based assessment system is used. Rather than using a point system to
record scores on various assignments, quizzes and tests, you must demonstrate your level of
proficiency on various learning goals. For example, instead of getting one grade for a worksheet
that may cover many topics, you will be scored on individual learning goals such as, “I can
interpret/draw position vs. time graphs for an object moving with constant velocity.” Rubrics that
list the learning goals and define achievement levels of each will be provided to you and used for
assessment.
Grades will be assigned based on descriptors of achievement only. Grades will not be affected by
issues such as lateness of work, effort, attitude, participation, and attendance. Those factors will be
reported separately. Even though there will be many opportunities for cooperative learning, you
will never be assigned group grades.
New information showing additional learning will replace old information. Grades will reflect the
trend in most recent learning. You may re-attempt assessments provided that you have
documented an effort to engage in additional learning (e.g., tutoring, additional practice, test
corrections, etc.). In addition, an alternative assessment may be accepted if the work is
pre-contracted between you (the student) and me (the teacher). There is no deadline for
reassessments. Any grade changes due to reassessments made after the quarter ends will be
reflected in the final year-end grade.
For each learning goal and unit of study, you will be scored using the following scale:
0 = No Basis
I do not provide any responses for which a judgment can be made about my understanding.
Near the end of each quarter, you (the student) will meet with me (the teacher) individually to
discuss your progress and to assign a quarterly grade. Quarterly grades will be given as follows:
Above Standard (98%, 95%, 92%): I am advanced in at least one unit of study and proficient in all others.
Approaching Standard (78%, 75%, 72%): I am developing in most units of study and proficient in at least one.
Year-end course grades will be assigned the same way, looking at the final achievement on the
entire year’s worth of learning goals.
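To make the band descriptors concrete, here is a rough sketch (my illustration only, not the official gradebook; I assume the 1-4 rubric maps 4 = advanced, 3 = proficient, 2 = developing) of how a set of unit scores could be classified:

```python
def quarter_band(unit_scores):
    """Classify a list of 1-4 rubric unit scores into a grade band.

    Assumed rubric mapping: 4 = advanced, 3 = proficient, 2 = developing.
    Only the two bands described in this letter are sketched.
    """
    advanced = sum(1 for s in unit_scores if s == 4)
    proficient_or_better = sum(1 for s in unit_scores if s >= 3)
    developing_or_better = sum(1 for s in unit_scores if s >= 2)

    # "Advanced in at least one unit of study and proficient in all others."
    if advanced >= 1 and proficient_or_better == len(unit_scores):
        return "Above Standard"
    # "Developing in most units of study and proficient in at least one."
    if proficient_or_better >= 1 and developing_or_better > len(unit_scores) / 2:
        return "Approaching Standard"
    return "Other band (not described in this letter)"

print(quarter_band([4, 3, 3]))  # Above Standard
```

The actual quarterly grade is assigned in the one-on-one conference described above; this sketch only shows how the written descriptors discriminate between score profiles.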
Please sign below to indicate you have read and understand this grading system. If you have any
questions, please email me at fnoschese@klschools.org, leave a voicemail at 763-7200 x9512, or
simply write them in the box below. Thank you.
Questions/Comments:
THE BUGGY LAB
I. PROBLEM
Design an experiment to determine the relationship between the
position of a toy buggy and time.
II. DESIGN
___ Describe your experimental design with both a labeled diagram
and a verbal description.
___ State what the independent and dependent variables are.
___ Include how you will vary and measure your chosen parameters.
III. DATA
___ Perform the experiment and record your measurements in an
appropriate table.
IV. ANALYSIS
___ Graph and determine the numerical model for your data set.
V. CONCLUSION
___ What pattern did you find from your observations? Write a verbal
and mathematical description.
___ What is the physical significance of the slope of the graph?
___ What is the physical significance of the y-intercept?
___ Turn your numerical model into a general model.
You will be assessed using the rubric on the back of this page.
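For the analysis and conclusion steps above, students typically use spreadsheet or graphing software; an equivalent sketch in Python (with made-up sample data, assuming a buggy moving at constant velocity) fits the numerical model x = v*t + x0, where the slope is the buggy's velocity and the intercept its starting position:

```python
# Hypothetical position-vs-time measurements for a constant-velocity toy buggy.
times = [0.0, 1.0, 2.0, 3.0, 4.0]           # seconds
positions = [0.10, 0.45, 0.80, 1.15, 1.50]  # meters

n = len(times)
mean_t = sum(times) / n
mean_x = sum(positions) / n

# Least-squares fit of the linear model x = v*t + x0.
numer = sum((t - mean_t) * (x - mean_x) for t, x in zip(times, positions))
denom = sum((t - mean_t) ** 2 for t in times)
v = numer / denom         # slope: buggy's velocity (m/s)
x0 = mean_x - v * mean_t  # intercept: starting position (m)

print(f"x = {v:.2f}*t + {x0:.2f}  (x in m, t in s)")
```

The slope and intercept answer the two “physical significance” questions in the conclusion section; replacing the fitted numbers with symbols (x = v*t + x0) turns the numerical model into the general model.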
BUGGY LAB RUBRIC
INTERIM PROGRESS
QUARTER FINAL
LAB.1 – I can identify the hypothesis to be
tested, phenomenon to be investigated, or
the problem to be solved.
LAB.2 – I can design a reliable experiment
that tests the hypothesis, investigates the
phenomenon, or solves the problem.
LAB.3 – I can communicate the details of an
experimental procedure clearly and
completely.
UNIT SCORE
MOMENTUM CONSERVATION AND TRANSFER
LEARNING GOALS
INTERIM PROGRESS
QUARTER FINAL
MOM.1 – I can determine whether
interactions are present for a given situation
by considering the motion of objects.
MOM.2 – I can calculate the momentum of
an object/system with direction and proper
units.
UNIT SCORE
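The MOM.2 goal above is a direct calculation; a small sketch (illustrative only, with made-up masses and velocities) shows momentum computed with direction (sign along a line) and proper units:

```python
def momentum(mass_kg: float, velocity_m_s: float) -> float:
    """Return momentum p = m * v in kg*m/s; the sign indicates direction."""
    return mass_kg * velocity_m_s

cart = momentum(0.50, 0.35)   # 0.50 kg cart moving at +0.35 m/s
ball = momentum(0.25, -1.20)  # 0.25 kg ball moving the opposite way

print(f"cart: {cart:+.3f} kg*m/s, ball: {ball:+.3f} kg*m/s")
print(f"system total: {cart + ball:+.3f} kg*m/s")
```

Tracking the signed system total is what makes conservation and transfer (the unit title) visible: in an isolated interaction the total stays fixed while momentum moves between the objects.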
1ST 2ND 3RD 4TH INTERIM PROGRESS
LEARNING GOALS
UNIT SCORE
INTERIM GRADE: %
______________________________________________________________________
______________________________________________________________________
______________________________________________________________________
______________________________________________________________________
Questions/Comments:
1ST 2ND 3RD 4TH QUARTER FINAL
LEARNING GOALS
UNIT SCORE
QUARTER GRADE: %
______________________________________________________________________
______________________________________________________________________
______________________________________________________________________
______________________________________________________________________
Questions/Comments: