Science's new social contract with society
Michael Gibbons

Under the prevailing contract between science and society, science has
been expected to produce ‘reliable’ knowledge, provided merely that it
communicates its discoveries to society. A new contract must now ensure
that scientific knowledge is ‘socially robust’, and that its production is seen
by society to be both transparent and participative.

Modern science has until recently flourished partly because of a stable, underlying agreement between its practitioners and the rest of society. In other words, there has been a social contract between science and society, an arrangement built on trust which sets out the expectations of the one held by the other, and which — in principle — includes appropriate sanctions if these expectations are not met.

This social contract has been made up of several individual elements, reflecting broader contracts between government and society, between industry and society, and between higher education and society. The contract between university science and society, for example, has been based traditionally on the understanding that universities will provide research and teaching in return for public funding and a relatively high degree of institutional autonomy; under this contract, the universities, often supported through research-funding agencies, have been expected to generate fundamental knowledge for society, and to train the highly qualified manpower required by an advanced industrial society.

By contrast, the contract with industrial research and development (R&D) has been based on an understanding that industry would provide for the appliance of science through the work of its laboratories, and thus carry the discoveries of basic science into product and process innovations. In turn, government science was meant to use research establishments to fill the gap between university science and industrial R&D. The understanding has been that the state has been directly responsible for carrying out research related to national need; for example, in defence, energy, public health and standards.

For most of the twentieth century, universities, government research establishments and industrial laboratories have therefore operated relatively independently, developing their own research practices and modes of behaviour. Recently, however, this relative institutional impermeability has gradually become more porous. Privatization policies, for example, have moved many government research establishments into the market place. With the relaxation of the Cold War, governments have shifted their priorities from security and military objectives to maintaining international competitiveness and enhancing the quality of life. And many long-established industries have been denationalized, while in many countries companies previously dependent upon government for R&D support through military technology projects have had to find these resources elsewhere, or in partnership with others, to compete in international markets.

Meanwhile the expansion of higher education has been accompanied by a culture of accountability that has impacted on both teaching and research. In research, many academics have had to accept objective-driven research programmes, whereas research funding agencies have been increasingly transformed from primarily responsive institutions, responsible for maintaining basic science in the universities, into instruments for attaining national technological, economic and social priorities through the funding of research projects and programmes.

These trends can be observed internationally, even if their precise form and timing have varied between countries. Cumulatively, they signal the end of the institutional arrangements through which science flourished during and after the Second World War, and thus mark the expiry of the social contract between science and society that has dominated this period. A new social contract is now required. This cannot be achieved merely by patching up the existing framework. A fresh approach — virtually a complete 'rethinking' of science's relationship with the rest of society — is needed.

[Photo (AP): Traditional boundaries between university and industrial science, and between basic and applied research, are disappearing. As a result, science and society are invading each other's domain, requiring a rethinking of previous responsibilities.]

Reflecting complexity and diversity

One aspect of this new contract is that it needs to reflect the increasing complexity of modern society. For example, there are no longer clear demarcation lines between university science and industrial science, between basic research, applied research and product development, or even between careers in the academic world and in industry. There is now greater movement across institutional boundaries, a blurring of professional identities and a greater diversity of career patterns.

But the price of this increased complexity is a pervasive uncertainty. One way of looking at this is in terms of an erosion of society's stable categorizations, namely the state, market, culture and science. Alternatively, it can be seen as the cumulative effect of parallel evolutionary processes. For there has been a co-evolution in both society and science in terms of the range of organizations with which researchers are prepared to work, the colleagues with whom they collaborate, and topics considered interesting. Whatever viewpoint one takes, science is now produced in more open systems of knowledge production.

One consequence is that the norms and practices of research in university and industrial laboratories have converged. There are still differences between universities and industry, but these do not impact on what is considered sound scientific practice¹.

Indeed, science and society more generally have each invaded the other's domain, and the lines demarcating the one from the other have virtually disappeared.

As a result, not only can science speak to society, as it has done so successfully over the past two centuries, but society can now 'speak back' to science. The current contract between science and society was not only premised on a degree of separation between the two, but also assumed that the most important communication was from science to society. Science was seen as the fountainhead of all new knowledge and, as part of the contract, was expected to communicate its discoveries to society. Society in turn did what it could to absorb the message and through other institutions — primarily industry — to transform the results of science into new products and processes. Science was highly successful working in this mode, and for as long as it delivered the goods, its autonomy was seldom contested. Yet this success has ironically itself been instrumental in changing its relationship with society, drawing science into a larger and a more diverse range of problem areas, many lying outside traditional disciplinary boundaries. It is this increasingly intense involvement of science in society over the past half a century that has created the conditions that underpin the growing complexity and the pervasive uncertainty in which we live, and encouraged the social and behavioural experiments described above.

But if it is widely recognized that science is transforming modern society, it is less often appreciated that society, in speaking back, is transforming science. I will use the term 'contextualization' to describe this process, and 'contextualized knowledge' as the outcome of this reverse communication. Contextualization affects modern science in its organization, division of labour and day-to-day practices, and also in its epistemological core.

In relation to the former, for example, research carried out in both industrial and government laboratories, as well as the funding policies of research-funding agencies, have opened up to a wide range of socioeconomic demands, admitting more and more cross-institutional links, and thus altering the balance between the different sources of funding of academic research. Thus in 'speaking back' to science, society is demanding various innovations, for example the pursuit of national objectives, the contribution to new regulatory regimes and acknowledgement of the multiplication of user–producer interfaces.

In relation to the latter, the epistemological dimension, the increasing importance of 'context' is also reflected in a relatively rapid shift within science from the search for 'truth' to the more pragmatic aim of providing a provisional understanding of the empirical world that 'works'². John Ziman, former physicist and long-time contributor to social studies of science, has described science as a form of 'reliable knowledge' that becomes established not in terms of an abstract notion of objectivity but, concretely, in terms of the replicability of research statements and the formation of a consensus within the relevant peer group³. Reliable knowledge is therefore defined as such because it 'works'.

But what 'works' has now acquired a further dimension that can best be described as a shift from 'reliable knowledge' to what Nowotny et al. call 'socially robust' knowledge⁴. The latter characterization is intended to embrace the process of contextualization. For 'socially robust' knowledge has three aspects. First, it is valid not only inside but also outside the laboratory. Second, this validity is achieved through involving an extended group of experts, including lay 'experts'. And third, because 'society' has participated in its genesis, such knowledge is less likely to be contested than that which is merely 'reliable'.

Socially robust knowledge

My argument is that we are currently witnessing a significant shift from 'reliable' to 'socially robust' knowledge. Three observations can immediately be made. The first is that the basic conditions and processes that have underpinned the production of 'reliable knowledge' are not necessarily compromised by the shift to 'socially robust knowledge'. Indeed, if these conditions and processes have been undermined, it may have been as much by the narrow outlook of much scientific practice as by any attempt to widen the range of stakeholders, or more systematically to take into account the context in which science is produced.

The second observation is that reliable knowledge has always only been reliable within boundaries. Science was recognized as inherently incomplete because it is, primarily, a method rather than a final answer. But to achieve a reasonable degree of reliability, the problem terrain had also to be circumscribed, and judgements on what is included there restricted to those of a peer group, rather than opened to the scientific community as a whole.

Both aspects of reliable knowledge are carried forward into socially robust knowledge. But although knowledge remains incomplete, this is no longer only in the conventional sense that it will eventually be superseded by superior science; rather it means that it may be sharply contested, and no longer remains within the controlled environment of scientific peers. This shift involves renegotiating and reinterpreting boundaries that have been dramatically extended, so that science can no longer be validated as reliable by conventional discipline-bound norms; while remaining robust, science must now be sensitive to a much wider range of social implications.

[Photo (Greenpeace): Both pressure groups and ordinary consumers are demanding that the debate surrounding the health implications of GMOs be broadened to include the perspectives of the non-expert community.]

An example is the current debate surrounding genetically modified organisms (GMOs). Here, specialist peer groups have been challenged not only by pressure groups but also by ordinary consumers, for whom the research process is far from transparent, and who are demanding that it be more so. Here, knowledge of the health implications of GMOs may be 'reliable' in the conventional scientific sense; but it is not socially robust, and will not become so until the peer group is broadened to take into account the perspectives and concerns of a much wider section of the community.

[Photo (US Department of Energy): A failure to persuade the broad public of the value of the US Superconducting Super Collider research programme may have contributed to the collapse of funding for the project.]

There was also a degree of contestation in the United States about the value of the Superconducting Super Collider (SSC), plans for which were dropped in 1993. In this case, however, unlike the case with GMOs, there was no spontaneous backlash from society generally about the value of the knowledge. Rather, it has been argued that the collapse of funding for the project was a result of the unwillingness (or inability) of a narrow disciplinary group to extend its boundaries sufficiently to persuade other scientists and politicians that the research would be of wide benefit⁵. Again we see a failure to achieve sufficient social robustness in the research process, however reliable it may be in its own terms.

The third observation is that the epistemological core of science has, over time, become crowded with norms and practices that cannot be reduced easily to a single generic methodology, or, more broadly, to privileged cultures of scientific inquiry. There is no one set of practices that describe, much less lead to, good science. The case for science can still be made in essentially functionalist terms; but many more factors now need to be taken into account before a solution that 'works' can be adopted.

One outcome of all these changes is that the sites at which problems are formulated and negotiated have moved from their previous institutional locations in government, industry and universities into the 'agora' – the public space in which both 'science meets the public', and the public 'speaks back' to science. This is a space in which the media is increasingly active, and in which the new communication technologies play a prominent role. It is also the domain in which contextualization occurs. Neither state nor market, neither exclusively private nor exclusively public, the agora is where today's societal and scientific problems are framed and defined, and their 'solutions' are negotiated.

Narratives of expertise

The factor that has come to the fore in the agora is the role of scientific and technical expertise that is so crucial to decision making in highly industrialized societies. This role is changing as expertise spreads throughout society, resulting in the fragmentation of established links between expertise and institutional structures, whether of government, industry or the professions. Furthermore, the questions asked of experts are neither the same as, nor simple extensions of, the ones that arise in their specialist fields of study. Experts must now extend their knowledge to widely disparate areas, and try to integrate what they 'know' now with what others want to 'do' in the future.

Collective narratives of expertise need to be constructed to deal with the complexity and the uncertainty generated by this fragmentation. Such narratives are challenging to their participants. Experts must respond to issues and questions that are never merely scientific and technical, and must address audiences that never consist only of other experts. The limits of competence of the individual expert call for the involvement of a wide base of expertise that has to be carefully orchestrated if it is to speak in unison.

Since expertise now has to bring together knowledge that is itself distributed, contextualized and heterogeneous, it cannot arise at one specific site, or out of the views of one scientific discipline or group of highly respected researchers. Rather it must emerge from bringing together the many different 'knowledge dimensions' involved. Its authority depends on the way in which such a collective group is linked, often in a self-organized way. Breakdowns in social authority arise when links are inadequately established, as has occurred in European debates over GMOs.

Rethinking science

These four inter-related processes — co-evolution, contextualization, the production of socially robust knowledge and the construction of narratives of expertise — form a framework both for rethinking science and for understanding any new social contract between society and science. Co-evolution denotes an open interaction between science and society which generates variety through experimentation, whether in scientific problems, colleagues or institutional designs, with the selective retention of certain choices, modes or solutions. This is so even while these experimental approaches, in responding to uncertainty and complexity, not only promote permeability but also generate more complexity and uncertainty, thus encouraging further experimentation.

Greater permeability provides the basis for increased contextualization by increasing the routes through which society can 'speak back' to science. Denser communication itself brings an imperative to produce socially robust knowledge that is seen as valid not only inside but also outside the walls of the laboratory, in terms of being accepted as legitimate.

As the walls of laboratories have opened up, more and more scientists have taken their places as actors in the agora, broadening the range of experts whose view might be sought on a particular problem or issue. To cope with this, a further development in the use of scientific and technical experts is needed. For reliable knowledge can only become socially robust if society sees the process of knowledge production as transparent and participative. The old image of science working autonomously will no longer suffice.
Rather, a reciprocity is required in which not only does the public understand how science works but, equally, science understands how its publics work.

The process of rethinking science, now that the line that used to separate science and society has virtually disappeared, has scarcely begun⁵. But several changes in perspective must be initiated before a new social contract can emerge. First, the need for contextualization means that the (unknowable) implications as well as the (planned or predictable) applications of scientific research have to be embraced.

Research activities now transcend the immediate context of application, and begin to reach out, anticipate and engage reflexively with those further entanglements, consequences and impacts that they generate. This 'context of implication' always transcends the immediate 'context of application' in which it occurs. It may embrace neighbouring research fields, and as yet obscurely recognized future uses. Taking the 'context of implication' seriously opens the door to those previously excluded from decisions about research. The individuals now involved may be encountered haphazardly as individuals, perhaps colleagues or rivals. They may come from other scientific disciplines, or from the 'user' community. Whatever their origins, scientific knowledge will increasingly need to be tested not only against nature, but against (and hopefully also with) other people.

Furthermore, while it is important to define problems, and then assemble the intellectual, human and financial resources needed to solve them, this is not, in itself, sufficient to guarantee the reflexivity characteristic of socially distributed knowledge production. In contrast, a process of contextualization that attempts to embrace unpredictable and unintended implications demands reflexivity, as it is intended to incorporate future potential implications into the research process from the very beginning. It thus goes far beyond a conventional 'forward look' or 'technology foresight' exercise.

This has several consequences. One is the need for strategies to 'fix' more accurately the implications of knowledge production. This might be done by identifying areas in which significant implications of particular research projects are likely to arise without being pinpointed exactly, making it necessary to 'prospect' for these (presently unknowable) implications. Such a process might, for example, involve consulting other knowledge producers and users, as well as wider social constituencies, in order to carry out a form of 'triangulation' survey. Perhaps every research proposal and project should include a deliberate strategy for identifying its 'context of implication'. This might best be achieved by including those likely to be implicated — perhaps unknowingly — as well as the conscious carriers of social knowledge.

A second need is for the process of contextualization to be internalized. The 'context of application' can be managed through 'external' mechanisms such as 'forward look' and 'technology foresight' exercises, and through science and technology parks and technology transfer or industrial liaison offices within universities. In contrast, the 'context of implication' needs to be internalized by researchers if it is to be effective. It is expressed through routes, often informal, that cannot easily be incorporated into administrative procedures. These new lines of communication need to be encouraged and recognized institutionally. This cannot be done by communications experts — contextualization is not a public relations exercise — or by asking journalists to develop popular accounts of the significance of research.

A further important point is that the more open and comprehensive the scientific community, the more socially robust will be the knowledge it produces. This is contrary to the traditional assumption that there is a strong relationship between the social and intellectual coherence (and, therefore, the boundedness) of a scientific community, and the reliability of the knowledge it produces. Reliable knowledge may have been best produced by such cohesive (and therefore restricted) scientific communities. But socially robust knowledge can only be produced by much more sprawling socio/scientific constituencies with open frontiers.

At the same time, socially robust knowledge is superior to reliable knowledge both because it has been subject to more intensive testing and retesting in many more contexts — which is why it is 'robust' — and also because of its malleability and connective capability. Its context is not predetermined or fixed, but open to ceaseless renegotiation. Instead of achieving a precarious invariance by establishing strict limits within which its truthfulness can be tested, as reliable knowledge does, socially robust knowledge is the product of an intensive (and continuous) interaction between data and other results, between people and environments, between applications and implications.

It is also clear that science must leave the ivory tower and enter the agora. To increase the effectiveness with which the agora operates, the self-organizing capacity of all participants needs to be enhanced. Here, there is tension between the desire for individual or institutional autonomy and the increasing demands for accountability on both individuals and institutions.

Increasing the capacity for self-organizing means that participants need to act more reflexively. But one cannot enhance responsiveness by simply increasing the demand for public accountability, as this could make participants more, rather than less, defensive. On the contrary, what is needed is to encourage participants voluntarily to internalize accountability. Indeed there is an analogy between the relationships between autonomy and self-organization on the one hand, and between reliable and socially robust knowledge on the other. In the agora, the conditions that promote greater self-organization also promote the generation of socially robust knowledge.

Conclusion

To summarize, I have argued in this paper that the prevailing contract between science and society was set up to sustain the production of 'reliable knowledge'; a new one must ensure the production of 'socially robust knowledge'. The prevailing contract is governed by the rules of bureaucratic rationality, with society linked to 'people' primarily through representative institutions. A new contract will require more open, socially distributed, self-organizing systems of knowledge production that generate their own accountability and audit systems. Under the prevailing contract, science was left to make discoveries and then make them available to society. A new contract will be based upon the joint production of knowledge by society and science.

A new social contract will therefore involve a dynamic process in which the authority of science will need to be legitimated again and again. To maintain this, science must enter the agora and participate fully in the production of socially robust knowledge. According to some observers, we can already see this approach emerging in the management of large technology projects. Thomas P. Hughes, the eminent American historian of technology, has identified a new ethos among engineers who now recognize that the deeper involvement of communities in decision making actually produced better engineering solutions in a number of projects⁶. If the boundaries between science, technology and society are becoming more permeable, why should not a similar approach in science likewise produce more socially robust solutions?

Michael Gibbons, a former director of the Science Policy Research Unit at the University of Sussex, is now secretary-general of the Association of Commonwealth Universities, 36 Gordon Square, London WC1H 0PF, UK.

1. van Duinen, R. J. European research councils and the Triple Helix. Sci. Public Policy 25, 381–386 (1998).
2. Daston, L. & Galison, P. The image of objectivity. Representations 40, 81–128 (1992).
3. Ziman, J. Reliable Knowledge Canto edn (Cambridge Univ. Press, Cambridge, 1991).
4. Nowotny, H., Scott, P. & Gibbons, M. Re-thinking Science: Knowledge Production in an Age of Uncertainty (in the press).
5. Gieryn, T. F. in Handbook of Science and Technology Studies (eds Jasanoff, S. et al.) 393–443 (Sage, London, 1995).
6. Hughes, T. P. Rescuing Prometheus 301–303 (Pantheon Books, New York, 1998).

Acknowledgements. The research described in this essay is taken from a report entitled 'Re-thinking science: knowledge production in a Mode 2 society', prepared by H. Nowotny, P. Scott and M. Gibbons for, and funded by, the Tercentenary Fund of the Royal Swedish Bank and the Swedish Council for Higher Education.

How common are habitable planets?
Jack J. Lissauer

The Earth is teeming with life, which occupies a diverse array of environments; other bodies in our Solar System offer fewer, if any, niches that are habitable by life as we know it. Nonetheless, astronomical studies suggest that many habitable planets may be present within our Galaxy.

One of the most basic questions that has been pondered by Natural Philosophers for many millennia concerns humanity's place in the Universe: are we alone? This question has been approached from many different viewpoints, and similar reasoning has led to widely diverse answers. Copernicus, Kepler, Galileo and Newton each demonstrated convincingly that other planets that were qualitatively similar to Earth orbit the Sun. In the past few years, more than 20 planets have been discovered in orbit about stars other than our Sun; these are 'extrasolar' planets.

'Since one of the most wondrous and noble questions in Nature is whether there is one world or many, a question that the human mind desires to understand, it seems desirable for us to inquire about it.' — Albertus Magnus, 13th century

The intellectual and technological advances of the past century leave us poised at the turn of the millennium to investigate the possibility of extraterrestrial life along numerous paths of experimental, observational and theoretical studies in both the physical and life sciences.

Prerequisites for habitability

Here I assume that extraterrestrial life would be carbon-based and use liquid water (characteristics common to all life found on Earth), and I define a 'habitable planet' as one capable of supporting such life.

Life on Earth has been able to evolve and thrive thanks to billions of years of benign climate. Mars seems to have had a climate sufficiently mild for liquid water to have flowed on its surface when the Solar System was roughly one-tenth its current age, but at present, its low atmospheric pressure means that liquid water is not stable on the martian surface. Venus is too hot, with a massive atmosphere dominated by carbon dioxide; we cannot say whether or not young Venus had a mild Earth-like climate. Indeed, as models of stellar evolution predict that the young Sun was about 25 per cent less luminous than at present, we do not understand why Earth, much less Mars, was warm enough to be covered by liquid oceans 4 billion years ago, when life is thought to have originated.

Carbon dioxide is important for carbon-based life. On Earth, this compound cycles — on a wide range of timescales — between the atmosphere, the oceans, living organisms, fossil fuels and carbonate rocks. The carbonate rocks form the largest reservoir, and are produced by reactions involving water (and in some cases living organisms). Carbon dioxide is recycled from carbonates back into the atmosphere as tectonic plates descend into the Earth's mantle and are heated. Carbonates are not readily recycled on a geologically inactive planet such as Mars, and they are not formed on planets like Venus, which lack surface water. Larger planets of a given composition remain geologically active for longer, as they have smaller ratios of surface area to mass, and thus retain heat from accretion and radioactive decay for longer. The number of variables involved in determining a planet's habitability precludes a complete discussion, but some of the main issues are summarized in Fig. 1 (compare ref. 1).

Stellar properties and habitability

Stars are huge balls of plasma that radiate energy from their surfaces and liberate energy through thermonuclear fusion reactions in their interiors. During a star's long-lived 'main-sequence phase', hydrogen in its core is gradually 'burned up' to maintain sufficient pressure to balance gravity. The star's luminosity (energy output) grows slowly during this phase, as fusion increases the mean particle mass in the core and greater temperature is required to achieve pressure balance. Once the hydrogen in the core is used up, the star's structure and luminosity change much more rapidly. Both Sun-like stars and larger stars expand and become 'red giants'; those stars with an initial mass greater than about eight solar masses can end their lives in spectacular supernova explosions.

What stars are most likely to have habitable planets? To make the astronomical problem more straightforward, I assume that the single factor required for supporting life on a planet is the presence of water on its surface over a long timescale. The main-sequence phase of low-mass stars (such as our Sun) provides 'continuously habitable zones'; when in orbits within these zones, planets may maintain liquid water on their surfaces for billions of years. High-mass stars are much hotter than low-mass stars and use up their fuel far more rapidly. Thus, even if Earth-like planets form around high-mass stars at distances where liquid water is stable, it is unlikely that benign conditions exist for long enough on these planets to enable life to form and evolve. However, the greater flux of ultraviolet radiation may speed up biological evolution enough to compensate for the shorter lifetime of a moderately massive star. At the other end of the size spectrum, the smallest, faintest stars can live for trillions of years, but they emit almost all of their luminosity at infrared wavelengths and their luminosity varies by tens of per cent owing to flares and large 'starspots' (analogous to sunspots). In addition, habitable-zone planets orbit so close to these faint stars that their rotation is tidally synchronized (as the Moon's rotation is relative to Earth); thus no day–night cycle occurs, and if the planet's atmosphere is thin it would freeze on the perpetually dark, cold hemisphere².
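
To see why such planets must orbit so close, note that a planet receives Earth-like stellar flux when L/(4πd²) matches the Sun–Earth value, so the habitable-zone distance scales as the square root of the stellar luminosity. The sketch below is an added illustration; the constant-flux criterion and the sample luminosities are simplifying assumptions, not values from the article:

```python
# Sketch: why habitable-zone planets around faint stars orbit so close.
# Assumption (not from the article): a planet receives Earth-like stellar
# flux when L / (4*pi*d^2) equals the Sun-Earth value, so the habitable-zone
# distance scales as d ~ sqrt(L / L_sun), in astronomical units.

import math

def habitable_zone_au(luminosity_solar):
    """Distance (AU) at which a planet receives the flux Earth receives."""
    return math.sqrt(luminosity_solar)

# Illustrative luminosities: Sun-like star, K dwarf, late M dwarf.
for lum in (1.0, 0.1, 0.001):
    print(f"L = {lum:g} L_sun -> habitable zone near {habitable_zone_au(lum):.3f} AU")
```

For a star of one-thousandth the Sun's luminosity the habitable zone lies only about 0.03 AU out, roughly a tenth of Mercury's distance, close enough for stellar tides to synchronize the planet's rotation.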

Figure 1 Environments on small and large Earth-like planets. Earth is depicted in the central panel of the figure. A smaller planet (left) made of the same material as Earth would be less dense, because the pressure in the interior would be lower. Such a planet would have a larger ratio of surface area to mass, so its interior would cool faster. Its lower surface gravity and more rigid crust would allow for higher mountains and deeper valleys than are seen on Earth. Most important to life is that it would have a much smaller surface pressure as a result of four factors: a larger ratio of surface area to mass, lower surface gravity, more volatiles sequestered in the crust as there is less crustal recycling, and more atmospheric volatiles escaping to space. Among other things, this would imply a lower surface temperature, because there would be fewer greenhouse gases in the atmosphere. Some remedial measures that could improve the habitability of such a mass-deprived planet are: (1) move it closer to the star, so less greenhouse effect would be needed to keep the surface temperatures comfortable; (2) add extra atmospheric volatiles; and (3) include a larger fraction of long-lived radioactive nuclei than on Earth, to maintain crustal recycling. A larger planet (right) made of the same material as Earth would be denser and have a hotter interior. Its higher surface gravity and less rigid crust would lead to muted topography. It would have a much greater atmospheric pressure, and, unless its greenhouse effect was strong enough to boil away its water, much deeper oceans, probably covering its entire surface. Some remedial measures that could improve the habitability of such a mass-gifted planet are: (1) move it farther from the star; and (2) include a smaller fraction of atmospheric volatiles. It is not clear that more active crustal recycling would be a problem, within limits. (Images: NASA/CORBIS/OAR/NURP.)
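
The caption's scaling arguments can be made explicit. The sketch below is an added illustration that deliberately holds bulk density fixed (the caption notes that real density changes with size because of interior pressure): with mass growing as R³, the surface-area-to-mass ratio falls as 1/R, so small planets cool faster, while surface gravity GM/R² grows as R, so large planets hold on to volatiles more tightly:

```python
# Sketch of the size scalings in the Figure 1 caption, under the simplifying
# assumption (added here) of constant Earth-like bulk density.

import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
RHO = 5500.0         # assumed Earth-like mean density, kg/m^3 (illustrative)
R_EARTH = 6.371e6    # metres

for factor in (0.5, 1.0, 2.0):   # half-size, Earth-size, double-size planet
    r = factor * R_EARTH
    mass = (4.0 / 3.0) * math.pi * r**3 * RHO   # mass grows as R^3
    area_per_mass = 4.0 * math.pi * r**2 / mass # falls as 1/R: faster cooling
    gravity = G * mass / r**2                   # grows as R: tighter volatile grip
    print(f"{factor:3.1f} R_Earth: g = {gravity:5.1f} m/s^2, "
          f"area/mass = {area_per_mass:.2e} m^2/kg")
```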
surveys so far accomplished have not detect­
ed low-mass and long-period planets6. Our
Stability of planetary systems

Although isolated single-planet systems are stable essentially indefinitely, mutual gravitational perturbations within multiple-planet systems can lead to collisions and ejections from the system. To a first approximation the star's gravity dominates, but planets exchange orbital energy and angular momentum, so that over millions or billions of orbits, even weak perturbations could move planets out of orbits that were initially in the habitable zone. Resonances among various orbital and precession (that is, rotation of the spatial orientation of the orbit ellipse) frequencies are the main source of chaos in planetary systems. There are no simple criteria for determining the stability of systems with many planets, but in general, larger spacing between orbits, smaller eccentricities and inclinations, and lower-mass planets are more stable³.

One of these criteria — large spacing between orbits — has important implications. There is a minimum separation required for a system of Earth-mass planets to be stable for long periods of time. This separation is comparable to the width of a star's continuously habitable zone. Thus, arguments based on orbital stability support the possibility that most stars could have one or even two planets with liquid water on their surfaces. But unless greenhouse effects conspire to substantially compensate for increasing distance from the star, larger numbers of habitable planets per system are unlikely.

Planet formation

Stars are observed to be forming within cold regions of our Galaxy called molecular clouds. Even a very slowly rotating molecular-cloud core of stellar mass has far too much angular momentum to collapse down to an object of stellar dimensions. This leads to a (proto)star surrounded by a rotating disk of material; a significant fraction of the material in a collapsing core falls onto this disk. Such a disk has the same initial elemental composition as the growing star. At sufficient distances from the central star, it is cool enough for about 1–2 per cent of this material to be in solid form, either remnant interstellar grains or condensates formed within the disk. The growth from micrometre-sized dust to kilometre-sized solid bodies called planetesimals remains poorly understood⁴.

Kilometre-sized and larger planetesimals in protoplanetary disks travel on elliptical orbits that are altered by mutual gravitational interactions and physical collisions. These interactions lead to accretion (and in some cases, erosion and fragmentation) of planetesimals. Gravitational encounters within a 'swarm' of planetesimals can produce velocities that exceed the 'escape velocity' from the largest common planetesimals in the swarm⁵, and sufficiently massive and dense planets far enough from the star can hurl material into interstellar space. Comets in the Oort cloud — the vast comet reservoir ~5,000–50,000 AU from the Sun — are believed to be icy planetesimals that were sent outwards at nearly the Solar System escape velocity, and were perturbed into long-lived orbits around the Sun by close stars, interstellar clouds or the tidal forces of the Galactic disk.

We do not at present have enough data to determine the range of planetary systems that occur in nature. The gravitational pull of known extrasolar planets induces velocity variations in their stars much larger than would a planetary system like our own, and surveys so far accomplished have not detected low-mass and long-period planets⁶. Our own Solar System may represent a biased sample of a different kind, because it contains a planet with conditions suitable for life to evolve to the point of being able to ask questions about other planetary systems⁷. Unfortunately, we lack the capability to develop planet-formation models from first principles, or even from observations of protostellar disks, whose detailed properties are poorly known. But theory suggests that a great diversity of planetary systems is possible⁸.

We do, however, possess planet-formation models that can predict the growth times of giant planets (such as Jupiter). Current models⁹ predict growth times that are similar to estimates of the lifetime of gaseous protoplanetary disks.
Thus, giant planets might not form in most protoplanetary disks. Although giant planets themselves are unlikely abodes for life, they may harbour habitable moons. Moreover, they affect both the stability of the orbits of Earth-like planets and the flux of materials striking these planets⁷. Such impacts can have a devastating effect on life — a fact that no dinosaur is likely to argue with.

Box 1 Interstellar eavesdropping

The Search for ExtraTerrestrial Intelligence (SETI) is an endeavour to detect signals from alien life forms²¹. A clear detection of such a signal would probably change humanity's world view as much as any other scientific discovery in history. As our society is in its technological infancy, another civilization capable of communicating over interstellar distances is likely to be enormously advanced compared with our own — compare our technology to that of a mere millennium ago and then extrapolate millions or billions of years into the future! Thus, a dialogue with extraterrestrials could alter our society in unimaginable ways (and they could probably answer most if not all of the questions raised in this article).

The primary instrument used by SETI is the radiotelescope. Most radio waves propagate with little loss through the interstellar medium, and many wavelengths also easily pass through the Earth's atmosphere. They are easy to generate and to detect. Radio thus appears to be an excellent means of interstellar communication, whether data are being exchanged between a community of civilizations around different stars or broadcast to the Galaxy in order to reach unknown societies in their technological infancy. Signals used for local purposes, such as radar and television on Earth, also escape and could be detected at great distances.

The first deliberate SETI radiotelescope observations were performed by Frank Drake in 1960. Since that time, improvements in receivers, data processing capabilities and radiotelescopes have doubled the capacity of SETI searches roughly every 8 months. Nonetheless, only a minuscule fraction of directions and frequencies have been searched, so one should not be discouraged by the lack of success so far.

Impacts and dinosaurs

Impacts, like earthquakes, come in various sizes, with the large ones much rarer but vastly more hazardous than the small ones (Table 1). Because of the destruction that impacts may produce, impact frequency is an important factor in planetary habitability. The impact rate on the terrestrial planets of our Solar System was orders of magnitude larger four billion years ago than it is at present. In a planetary system like our own, but with smaller planets replacing Jupiter and Saturn, large impact fluxes could continue, making planets with Earth-like compositions and radiation fluxes hostile abodes for living organisms⁷.

The largest mass extinction of the past 200 million years or so occurred 65 million years ago, when roughly half of the genera of multicellular organisms on Earth, including all of the dinosaurs, suddenly died off. The geological record shows a layer of impact-produced minerals and iridium, an element rare in the Earth's crust but more abundant in primitive meteorites, deposited at the time that the dinosaurs vanished (at the Cretaceous/Tertiary or K/T boundary). Additionally, the largest known crater on Earth dated at less than one billion years old was formed at this time. Taken together, these data imply that this K/T mass extinction was caused by the impact of an asteroid or comet, about 10 km in radius, into the Yucatan peninsula¹⁰.

Table 1 Impacts and life

Size | Example(s) | Most recent | Planetary effects | Effects on life
Super colossal, R > 2,000 km | Moon-forming event | 4.45 × 10⁹ yr ago | Melts planet | Drives off volatiles; wipes out life on planet
Colossal, R > 700 km | Pluto; 1 Ceres (borderline) | ~4.3 × 10⁹ yr ago | Melts crust | Wipes out life on planet
Huge, R > 200 km | 4 Vesta (large asteroid) | ~4.0 × 10⁹ yr ago | Vaporizes oceans | Life may survive below surface
Extra large, R > 70 km | Chiron (largest active comet) | 3.8 × 10⁹ yr ago | Vaporizes upper 100 m of oceans | Pressure-cooks photic zone; may wipe out photosynthesis
Large, R > 30 km | Comet Hale–Bopp | ~2 × 10⁹ yr ago | Heats atmosphere and surface to ~1,000 K | Continents cauterized
Medium, R > 10 km | K/T impactor; 433 Eros (largest NEA) | 65 × 10⁶ yr ago | Fires, dust, darkness; atmosphere/ocean chemical changes; large temperature swings | Half of species extinct
Small, R > 1 km | ~500 NEAs | ~300,000 yr ago | Global dusty atmosphere for months | Photosynthesis interrupted; individuals die but few species extinct; civilization threatened
Very small, R > 100 m | Tunguska event | 91 yr ago | Major local effects; minor hemispheric effects; dusty atmosphere | Newspaper headlines; romantic sunsets increase birth rate

The effects of an impact on life depend in a qualitative way on the impact energy. The smallest space debris to hit Earth's atmosphere is slowed to benign speeds by gas drag or vaporized before it hits the ground. The largest impactors can melt a planet's crust and eliminate life entirely. Strong iron impactors ranging in size from that of a car to that of a house may hit the ground at high velocity, killing living beings in their path. The rocky bolide that exploded over Tunguska, Siberia, in 1908 was about the size of a football field; it produced a blast wave that knocked over trees tens of kilometres away. An impactor a few kilometres in size would throw enough dust into the upper atmosphere to substantially darken the sky for much of a growing season¹⁹; the threat of such an impactor to human food supplies has led NASA to initiate a programme to detect all near-Earth asteroids (NEAs) larger than about 1 km. Mass extinctions (such as that at the K/T boundary; see text) result from even larger impacts, which load the atmosphere with dust and chemicals (from vapour and pulverized matter originating in both the impactor and the crater ejecta); radiation from high-velocity ejecta re-entering the atmosphere may cause global fires. Even larger impacts fill the atmosphere with enough hot ejecta and greenhouse gases to vaporize part or all of the planet's oceans²⁰. Indeed, phylogenetic evidence implies that the last common ancestor of all life on Earth was a thermophilic prokaryote, which would have been most capable of surviving such a scalding impact. Even larger impacts would destroy all life on the planet, although it is possible that some organisms could survive in meteoroids ejected by the impact, and subsequently re-establish themselves on the planet (or another planet orbiting the same star) by a fortuitously gentle return after conditions on the planet had improved.
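
The steep size-dependence running down Table 1 follows from kinetic energy, which grows as the cube of the impactor radius at fixed speed and density. The sketch below is an added illustration; the density (~2,500 kg m⁻³, rocky) and impact speed (~20 km s⁻¹) are assumed typical values, not figures from the table:

```python
# Sketch: impact kinetic energy versus impactor radius. The density and
# speed below are illustrative assumptions; the article itself gives only
# sizes (e.g. the ~10-km-radius K/T impactor).

import math

MEGATON_TNT = 4.184e15  # joules per megaton of TNT

def impact_energy_j(radius_m, density=2500.0, speed=20e3):
    """Kinetic energy (J) of a spherical impactor: (1/2) * m * v^2."""
    mass = (4.0 / 3.0) * math.pi * radius_m**3 * density
    return 0.5 * mass * speed**2

# Roughly the 'Very small', 'Small' and 'Medium' rows of Table 1.
for r_km in (0.1, 1.0, 10.0):
    e = impact_energy_j(r_km * 1e3)
    print(f"R = {r_km:4.1f} km -> {e:.1e} J (~{e / MEGATON_TNT:.1e} Mt TNT)")
```

Each factor of ten in radius buys a thousandfold increase in energy, which is why adjacent rows of the table describe such different planetary consequences.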

A deluge of data to come

The speculation about advanced civilizations on Mars that was rampant a century ago has been quashed by better observations. However, evidence for a wet Mars billions of years ago suggests that life might have formed on that planet, and microbes may still be present below the surface¹¹. Interplanetary spacecraft will soon attempt to determine whether or not life ever existed on Mars, and if so what were (or are) its characteristics¹². NASA plans to investigate a possible subsurface ocean in Jupiter's moon Europa¹³, and the Cassini mission en route to Saturn will study the properties of Titan's atmosphere¹⁴. Although Titan's surface is too cold for liquid water, this large moon has a methane-rich atmosphere in which photochemical reactions create organic molecules. Analogous processes may have occurred within Earth's early atmosphere.

Interstellar probes for in situ exploration of extrasolar planets require substantial technological advances. Even with gravity assists by the planets and the Sun, payloads sent with current rocket propulsion systems would require thousands of years to reach the nearest stars. Thus at present, interstellar travel remains in the realm of science fiction, but considering the advances of the past millennium, voyages over such vast distances may become practical in the coming centuries.

Another approach to the detection of habitable planets is to search for signals that have been sent, intentionally or otherwise, by inhabitants of those worlds. This is being done by the Search for ExtraTerrestrial Intelligence (SETI) (see Box 1).

Although we are not yet able to reach the stars, we are nonetheless entering a golden age of extrasolar planetary study by means of telescopic observations. All of the known extrasolar planets have been identified indirectly, through the gravitational force that they exert on their star, and all have been found during the 1990s. The first two extrasolar planets to be discovered orbit a rapidly spinning neutron star, which emits a substantial fraction of its luminosity as X-rays. The extrasolar planets known to orbit main-sequence stars are each more massive than Saturn. They were discovered using the Doppler technique, which measures changes in the radial velocity of the star in response to the planet's gravitational tug.
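
How large is that tug? A back-of-envelope sketch (an illustration added here, not from the article; it assumes a circular, edge-on orbit and a planet much lighter than its star) shows why giant planets were found first:

```python
# Sketch: size of the stellar 'wobble' the Doppler technique must measure.
# For a circular orbit with planet mass << star mass, momentum balance gives
# v_star = (m_planet / M_star) * v_planet, with v_planet = sqrt(G*M_star/a).

import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # kg
AU = 1.496e11      # m

def radial_velocity_amplitude(m_planet, m_star, a):
    """Star's reflex speed (m/s) for a circular, edge-on planetary orbit."""
    v_planet = math.sqrt(G * m_star / a)
    return (m_planet / m_star) * v_planet

M_JUP, M_EARTH = 1.898e27, 5.972e24  # kg
print(f"Jupiter at 5.2 AU: {radial_velocity_amplitude(M_JUP, M_SUN, 5.2 * AU):.1f} m/s")
print(f"Earth at 1 AU:     {radial_velocity_amplitude(M_EARTH, M_SUN, 1.0 * AU):.2f} m/s")
```

A Jupiter analogue moves the Sun at roughly 12 m s⁻¹, within reach of 1990s spectrographs, whereas an Earth analogue produces only about 0.1 m s⁻¹, a signal swamped by the stellar atmospheric variability discussed below.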

Various techniques should increase our knowledge of extrasolar planets in the coming decades. Doppler studies should continue to find giant extrasolar planets⁶: the current precision of this technique is sufficient to detect Jupiter-mass planets in Jupiter-like orbits, and Uranus-mass planets orbiting very close to their stars. Better precision may eventually lead to identification of smaller and more distant planets, but turbulence and other variability of stellar atmospheres will make detection of Earth analogues using the Doppler technique impractical if not impossible.

Planets may also be detected through the wobble that they induce in the motion of their stars projected into the plane of the sky. This astrometric technique is most sensitive to massive planets orbiting stars that are relatively close to the Earth. Because the star's motion is detectable in two dimensions, a better estimate of the planet's mass can be obtained than by using radial velocities. No astrometric claim of detecting an extrasolar planet has yet been confirmed, but technological advances (especially the development of optical and infrared interferometry) suggest that Jupiter-mass (and possibly smaller) planets will be detected from the ground using this technique within the next decade. Even higher precision is likely to be achievable from spacecraft observations. The ultimate limit to astrometric precision is likely to be differences between the positions of a star's centre of mass and its centre of light, which result from 'starspots' and other variations in brightness.

Earth-sized extrasolar planets?

If the Earth lies in or near the orbital plane of an extrasolar planet, that planet passes in front of the disk of its star once each orbit as viewed from Earth. Precise photometry can reveal such transits, which can be distinguished from rotationally modulated 'starspots' and intrinsic stellar variability by their periodicity, and would provide the size and orbital period of the detected planet. Scintillation in, and variability of, the Earth's atmosphere limit photometric precision to roughly one-thousandth of a magnitude, allowing detection from the ground of transits by Jupiter-sized planets but not by Earth-sized planets¹⁵. Far greater precision is achievable above the atmosphere, with planets as small as Earth likely to be detectable¹⁶. This technique has the greatest potential for detecting Earth analogues within the next ten years.
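
The quoted photometric limit can be checked with a one-line estimate: a central transit dims the star by roughly the ratio of the projected disk areas, depth ≈ (R_planet/R_star)². The sketch below is an added illustration under that simplification (no limb darkening or grazing geometry):

```python
# Sketch: transit depths versus the ~1-millimagnitude ground-based limit.
# A transit blocks a fraction of starlight roughly equal to the ratio of
# projected disk areas; one-thousandth of a magnitude corresponds to a
# fractional dimming of 1 - 10**(-0.001/2.5) ~ 9e-4.

R_SUN, R_JUP, R_EARTH = 6.957e8, 7.149e7, 6.371e6  # radii in metres

def transit_depth(r_planet, r_star=R_SUN):
    """Fractional loss of starlight during a central transit."""
    return (r_planet / r_star) ** 2

ground_limit = 1 - 10 ** (-0.001 / 2.5)
print(f"Jupiter transit depth: {transit_depth(R_JUP):.4f}")    # ~1e-2
print(f"Earth transit depth:   {transit_depth(R_EARTH):.1e}")  # ~8e-5
print(f"Ground-based limit:    {ground_limit:.1e}")
```

A Jupiter-sized transit (~1 per cent) sits well above the ~0.09 per cent ground-based limit; an Earth-sized transit (~0.008 per cent) falls far below it, which is why Earth analogues require photometry from space.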

Distant planets are very faint objects which are located near much brighter objects (the star or stars that they orbit), and thus they are extremely difficult to image. Efforts are currently underway to image giant planets directly from the ground using adaptive optics (which adjusts telescope mirrors to compensate for the variability of Earth's atmosphere) and coronagraphs (which block the light from the star that the planet orbits). NASA and the European Space Agency, ESA, are both currently designing space missions for the second decade of the twenty-first century that will use interferometry and nulling to search for Earth-like planets around nearby stars¹⁷. Assuming such planets are detected, spectroscopic investigations of their atmospheres, to search for gases such as oxygen, ozone and methane, will follow. Obtaining resolved images of the disks of extrasolar Earth-like planets requires substantially better optics and much greater light-gathering capability than available at present, but unlike interstellar travel, such distant resolved images should be achievable within the twenty-first century.

Observations of planetary formation provide information on properties of planets as a class, and typical fluxes of material (impactors) to which Earth-like planets are exposed. The infrared region of the spectrum is likely to provide the greatest information about star and planet formation in the coming decades, because planetary regions radiate most of their energy in the mid-infrared and dusty interstellar clouds transmit far more scattered starlight in the near-infrared than in the visible range. Advances in infrared detectors, interferometry and new large telescopes (on the ground with adaptive optics, and in space¹⁸) will provide data that are vastly superior to those available today.

Prospects

Predictions have a miserable success rate, and forecasts centuries into the future tend to be particularly conservative. In part, this reflects deficiencies in the human imagination. However, the average rate of change greatly exceeds the median rate. Consider the implication of joining the Galactic club of civilizations a billion years more advanced than our own (assuming such a community exists!). Such a revelation would lead to changes far more fundamental than the invention of movable type, the industrial revolution and the information age have brought to us within the past millennium.

We still do not know whether Earth-like planets on which liquid water flows are rare, are usual for solar-type stars or have intermediate abundances. Nonetheless, I personally believe that, as there are billions of Sun-like stars in our Galaxy, planets with liquid water oceans lasting for long periods of time are common enough to ensure that if we are the only advanced life form in our part of the Galaxy, biological factors are much more likely to be the principal limiting factor than are astronomical causes.

Jack J. Lissauer is at the Space Science Division, MS 245-3, NASA Ames Research Center, Moffett Field, California 94035, USA. e-mail: lissauer@ringside.arc.nasa.gov

1. Lewis, J. S. Worlds Without End: The Exploration of Planets Known and Unknown (Helix, Reading, Massachusetts, 1998).
2. Joshi, M. M., Haberle, R. M. & Reynolds, R. T. Icarus 129, 450–465 (1997).
3. Lissauer, J. J. Rev. Mod. Phys. 71, 835–845 (1999).
4. Weidenschilling, S. J. & Cuzzi, J. N. in Protostars and Planets III (eds Levy, E. H. & Lunine, J. I.) 1031–1060 (Univ. Arizona Press, Tucson, 1993).
5. Safronov, V. S. Evolution of the Protoplanetary Cloud and Formation of the Earth and Planets (Nauka Press, Moscow, 1969) (in Russian); Publication TTF-677, NASA, 1972 (English transl.).
6. http://www.physics.sfsu.edu/~gmarcy/planetsearch/planetsearch.html
7. Wetherill, G. W. Astrophys. Space Sci. 212, 23–32 (1994).
8. Lissauer, J. J. Icarus 114, 217–236 (1995).
9. Pollack, J. B. et al. Icarus 124, 62–85 (1996).
10. Chapman, C. R. & Morrison, D. Nature 367, 33–40 (1994).
11. McKay, C. P. Orig. Life Evol. Biosphere 27, 263–289 (1997).
12. http://mpfwww.jpl.nasa.gov/
13. http://www.jpl.nasa.gov/ice_fire//europao.htm
14. http://www.jpl.nasa.gov/cassini/
15. http://web99.arc.nasa.gov/~mars/vulcan/
16. http://www.kepler.arc.nasa.gov/
17. http://astro1.bnsc.rl.ac.uk:80/darwin/
18. http://ngst.gsfc.nasa.gov/
19. Toon, O. B., Zahnle, K., Morrison, D., Turco, R. P. & Covey, C. Rev. Geophys. 35, 41–78 (1997).
20. Sleep, N. H., Zahnle, K. J., Kasting, J. F. & Morowitz, H. J. Nature 342, 139–142 (1989).
21. http://www.seti.org/

Acknowledgements. I thank J. Cuzzi, F. Drake and G. Marcy for insightful discussions.

Genetics and general cognitive ability
Robert Plomin

General cognitive ability (g), often referred to as 'general intelligence', predicts social outcomes such as educational and occupational levels far better than any other behavioural trait. g is one of the most heritable behavioural traits, and genes that contribute to the heritability of g will certainly be identified. What are the scientific and social implications of finding genes associated with g?

During the past three decades, the behavioural sciences have emerged from an era of strict environmental explanations for differences in behaviour to a more balanced view that recognizes the importance of nature (genetics) as well as nurture (environment). This shift occurred first for behavioural disorders, including rare disorders such as autism (which has an incidence of 1 per 1,000 population), more common disorders such as schizophrenia (1 in 100), and very common disorders such as reading disability (1 in 50). More recently, it has become increasingly accepted that genetic variation makes an important contribution to differences among individuals in the normal range of behaviour as well as for abnormal behaviour. Moreover, many behavioural disorders, especially common ones, may represent variation at the extremes of the same genetic and environmental factors that are responsible for variation within the normal range. For example, disorders such as reading disability may not be due to genetic variants that specifically influence the disorder. Rather, the same genes that contribute to the normal range of individual differences in reading ability may be responsible for reading disability (Fig. 1). This view, known as the quantitative trait locus (QTL) perspective (see Box 1), has important implications for the search for the genes responsible for behaviour because such genes will individually have small effects; this will make them more difficult to find than genes that have major effects¹.

Figure 1 Quantitative trait locus (QTL) perspective on complex traits. Differences among individuals for most quantitative or complex traits such as reading ability are distributed as a normal bell-shaped curve. Multiple genes influence complex traits as probabilistic propensities rather than predetermined programmes. Here the different genetic make-up of individuals with respect to two hypothetical genes involved in reading ability is shown for 100 individuals (each person is represented by an oval), with five of these individuals (those on the extreme left) receiving a diagnosis of reading disability. The green ovals indicate that the individual has the disabling variant of one gene and blue ovals denote the disabling variant of the other gene. Neither gene is necessary or sufficient for low scores, even for individuals who have disabling variants of both genes (red ovals). This QTL perspective suggests that genes associated with common disorders such as reading disability may represent the quantitative extreme of the same genes that are responsible for variation throughout the population.
behaviour because such genes will individu­
ally have small effects; this will make them
more difficult to find than genes that have memory — correlate substantially with each lay people often read in the popular press
major effects1. other, and general cognitive ability (g) is that the assessment of intelligence is
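The probabilistic picture in Fig. 1 is easy to mimic in simulation. The sketch below assumes two hypothetical risk variants, invented effect sizes and frequencies, and an arbitrary 5-per-cent diagnostic cutoff; it is an illustration of the QTL logic, not a model from the article.

```python
# Minimal sketch of the QTL perspective in Fig. 1: two hypothetical genes
# nudge a trait probabilistically, while everything else (other genes plus
# environment) supplies most of the variation. All numbers are invented.
import random

random.seed(1)

def simulate_person():
    gene1 = random.random() < 0.3        # carries disabling variant of gene 1?
    gene2 = random.random() < 0.3        # carries disabling variant of gene 2?
    background = random.gauss(0.0, 1.0)  # everything else: polygenes plus environment
    score = background - 0.4 * gene1 - 0.4 * gene2
    return score, gene1, gene2

population = sorted((simulate_person() for _ in range(100)), key=lambda p: p[0])

diagnosed = population[:5]               # the extreme 5% receive the 'diagnosis'
print("variants carried by the 5 diagnosed:", [(g1, g2) for _, g1, g2 in diagnosed])
print("variants carried by the top 5:      ", [(g1, g2) for _, g1, g2 in population[-5:]])
# Typically some diagnosed individuals carry neither risk variant and some
# high scorers carry both: neither gene is necessary or sufficient, exactly
# the pattern the figure describes.
```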
General cognitive ability

For historical and political reasons, one quantitative trait in particular is highly controversial. This is general cognitive ability, which has a normal distribution in the population from a low end of mild mental handicap to a high end of gifted individuals. Diverse measures of cognitive abilities — such as spatial ability, verbal ability, information processing speed and memory — correlate substantially with each other, and general cognitive ability (g) is what these diverse measures have in common (see Box 2). Clearly there is more to cognition than g — although g explains about 40 per cent of the variance among such tests, most of the variance of a particular test is independent of g.

There is a wide gap between what lay people (including scientists in other fields) believe about intelligence and intelligence testing, and what the professional behavioural scientist believes. Most notably, lay people often read in the popular press that the assessment of intelligence is circular — intelligence is what intelligence tests assess. On the contrary, g is one of the most reliable and valid measures in the behavioural domain; its long-term stability after childhood is greater than for any other behavioural trait, and it predicts important social outcomes such as educational and occupational levels far better than any other trait2. Although a few critics remain3, g is widely accepted by experts4.
But it is less clear what g is: is it due to a single general process, such as high-level strategies called executive function or speed of information processing, or does it represent a concatenation of more specific cognitive processes?5,6

The concept of a genetic contribution to g has provoked much controversy, especially following the publication in 1994 of The Bell Curve by Herrnstein and Murray7. (This book in fact scarcely touches on genetics and does not view genetic evidence as crucial to its arguments.) The first half of the book shows, like many other studies, that g is related to educational and social outcomes, but the second half attempts to argue that certain right-wing policies follow from these findings. However, as discussed later in this article, public policy does not necessarily follow from scientific findings, and it would be possible to argue in just the opposite direction from Herrnstein and Murray. Nonetheless, there is considerable consensus among scientists — even those who are not geneticists — that g is substantially heritable6,8,9. Indeed, there are more studies addressing the genetics of g than of any other human characteristic, including studies of more than 8,000 parent–offspring pairs, 25,000 pairs of siblings, 10,000 twin pairs and hundreds of adoptive families, all of which indicate that genetic factors contribute significantly to g10,11.

Estimates of the size of the genetic effect, which population geneticists call 'heritability', vary from 40 to 80 per cent, but estimates based on the entire body of data make it about 50 per cent, indicating that genes account for about half of the variance in g. When the data are sorted by age, heritability is found to increase from about 20 per cent in infancy, to about 40 per cent in childhood, to 60 per cent or greater later in life12, even for individuals over 80 years old13. This increase in heritability throughout the lifespan is interesting, because it is counterintuitive to the effects of Shakespeare's 'slings and arrows of outrageous fortune' accumulating over time. It may be that heritability increases because individuals seek out and create environments correlated with their genetic propensities.

Most of the genetic variance for g is additive; that is, the effects of the individual genes seem simply to add up rather than there being interactions between the genes. The additivity of most genetic effects on g may be because there is greater assortative mating (non-random mating) for g than for any other behavioural trait. In other words, bright women are likely to mate with bright men, and the outcome of this dual effect is that their offspring are likely to be brighter on average than would be expected if mating were at random, thus spreading out the distribution of g in the population.

The data that provide evidence for a genetic effect on g also provide the best available evidence for the importance of environmental factors that are independent of genetics. Environment clearly is important, as indicated by the steady rise in IQ scores during the past several generations, which would seem too short a time to be explained by genetics14, and by studies in which children from abusive families show gains in IQ when adopted15.

Recent findings and new directions

Genetic research has moved beyond the rudimentary questions of whether, and to what extent, genetic differences are important in the origins of individual differences in g. These new findings inform the scientific and social implications discussed later.

It is often not appreciated that genetically sensitive designs, such as twin and adoption studies, that recognize the importance of both nature and nurture are uniquely well suited to the investigation of environmental influences. Indeed, one of the most important discoveries about environmental influences on g has come from such genetic research. The 'nurture assumption'16 — that the home is the most important part of the child's environment — implies that children growing up in the same home should be similar to one another because they share these environmental influences. When genetic resemblance is taken into account, such shared environmental influences that contribute to the resemblance of family members for g are important in childhood, accounting for about a quarter of the variance, but they are not important after adolescence. In other words, nonshared environmental factors that make children in the same family different (such as differences in parental treatment, differences in school experiences and different experiences with peers) provide the long-term consequences of environmental influence for g. This finding coincides with similar findings for other quantitative traits and indicates the need to re-examine the nurture assumption16.

Two other examples of recent genetic findings about the environment are that environmental influences may override genetic effects in families of low socioeconomic status17, and that genetic factors contribute to individuals' interactions with their environment18. The former finding highlights that heritability estimates are not absolute but depend on the environment in which they are measured.

Figure 2 Functional genomics includes all levels of analysis from molecular biology to psychology. The higher levels of analysis can be referred to as behavioural genomics in order to emphasize the importance of top-down analyses of pathways between genes and behaviour.

Box 1 The source of genetic variation

Quantitative traits are those characteristics, such as height or 'intelligence', that are found as a continuum of values within a population rather than as the discrete alternative inherited character states familiar to most people from a schoolroom acquaintance with Mendel's peas. They are due to the combined effects of a number of different genes (each of which will, of course, be inherited according to the rules of mendelian genetics), and usually also have a considerable environmental input to the final outcome. This final outcome is known as the phenotype. A quantitative trait locus (QTL) refers to a gene that contributes to a quantitative trait. A locus is the technical name in genetics for the position on the chromosome at which a gene for a particular characteristic is located. In any outbreeding population, such as humans, many genes are present in the population in a number of variant forms (technically known as alleles). By inheritance from their mother and father each person carries two copies (that is, two alleles, which may be different or identical) of the genes corresponding to most loci (those on the sex chromosomes excepted). The genetic variation within the human population is due to the immense number of different combinations of alleles possible, given the tens of thousands of different loci in the human genome. These different combinations give each individual human being (except identical twins) a unique genetic make-up or genotype, even though all humans share the same set of loci.
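The heritability estimates quoted above come from comparing the resemblance of relatives who share different proportions of their genes. A common first-pass decomposition uses Falconer's approximation from twin correlations; the correlations below are illustrative round numbers chosen to reproduce the article's roughly 50 per cent figure, not data from the studies it cites.

```python
# Falconer-style decomposition of variance in g from twin correlations.
# r_mz and r_dz are illustrative round numbers, not data from this article.

r_mz = 0.85  # correlation of identical (monozygotic) twins reared together
r_dz = 0.60  # correlation of fraternal (dizygotic) twins reared together

h2 = 2 * (r_mz - r_dz)  # heritability: DZ twins share half the additive effects
c2 = r_mz - h2          # shared (family) environment
e2 = 1 - r_mz           # nonshared environment plus measurement error

print(f"heritability h2   = {h2:.2f}")  # 0.50: genes account for ~half the variance
print(f"shared env   c2   = {c2:.2f}")  # 0.35
print(f"nonshared    e2   = {e2:.2f}")  # 0.15
# Repeating this within age bands is how the rise in heritability from
# infancy to later life, described above, is estimated.
```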
The latter observation, called 'genotype–environment correlation', indicates that genetic influences on abilities can best be thought of as appetites rather than aptitudes in the sense that genetic propensities propel individuals towards evoking, selecting and constructing experiences that are correlated with their genetic propensities. Bright children select (and are selected by) peers and educational programmes that foster their abilities. They read and think more.

Work on genetic influences on intelligence has up to now focused on g. Much less is known about the genetic and environmental origins of individual differences in specific cognitive abilities such as spatial ability, verbal ability, memory and processing speed. Specific cognitive abilities show substantial genetic influence, although less than for g19. To what extent do different sets of genes affect these different abilities? A technique called multivariate genetic analysis examines covariance among specific cognitive abilities and yields a statistic called the 'genetic correlation', which is the extent to which genetic effects on one trait correlate with genetic effects on another trait independently of the heritability of the two traits.

Box 2 General cognitive ability (g)

General cognitive ability (g) is best captured by a technique called factor analysis, in which a composite score is created that represents what diverse measures of cognitive abilities have in common (the operational definition of g). That is, tests of spatial ability correlate moderately with tests of verbal ability, and memory tests correlate with spatial and verbal tests, although more modestly. Factor analysis (an unrotated first principal component) creates a composite that weights each test by its overall correlation with all other tests. This vector of weights, called factor loadings, correlates with the heritabilities of the tests; that is, high g-loaded tests are the most heritable. In addition, g loadings correlate with other biological and psychological processes, using a technique called the 'method of correlated vectors', which supports the validity of g5. Although g is one definition of intelligence, intelligence has so many other connotations that it has been suggested that the word be avoided in scientific discussion5. Intelligence tests such as the individually administered Wechsler tests, which are widely used for clinical purposes, assess diverse abilities such as spatial ability (for example, making multicoloured blocks match a two-dimensional design), verbal ability (for example, vocabulary), speed of processing (for example, matching digits to symbols), memory (for example, memorizing a sequence of digits) and reasoning (for example, identifying what is missing in a picture). Rather than weighting these tests by their contribution to g, intelligence tests weight each test equally by summing them to create a composite score known as IQ, which is standardized to have a mean of 100 and a standard deviation of 15. Nonetheless, IQ scores correlate highly (r ≈ 0.80) with g scores derived from factor analysis.
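Box 2's operational definition of g, the first unrotated principal component of a battery of correlated tests, takes only a few lines to sketch. The correlation matrix below is invented, chosen so that the general factor explains roughly the 40 per cent of variance mentioned earlier; the IQ-style composite at the end shows the alternative equal-weighting convention.

```python
# Sketch of Box 2: g as the first (unrotated) principal component of a test
# battery, versus an equally weighted IQ-style composite. The correlation
# matrix is invented for illustration.
import numpy as np

tests = ["spatial", "verbal", "speed", "memory"]
R = np.array([[1.00, 0.35, 0.25, 0.20],
              [0.35, 1.00, 0.30, 0.25],
              [0.25, 0.30, 1.00, 0.15],
              [0.20, 0.25, 0.15, 1.00]])

eigvals, eigvecs = np.linalg.eigh(R)     # eigenvalues in ascending order
pc1 = eigvecs[:, -1]                     # first principal component
if pc1.sum() < 0:                        # fix the arbitrary sign
    pc1 = -pc1
loadings = pc1 * np.sqrt(eigvals[-1])    # factor loadings on g

for name, loading in zip(tests, loadings):
    print(f"{name:8s} g loading = {loading:.2f}")
print(f"g explains {eigvals[-1] / len(tests):.0%} of the test variance")

# The IQ convention instead weights standardized tests equally, then rescales
# the composite to mean 100 and standard deviation 15:
rng = np.random.default_rng(0)
scores = rng.multivariate_normal(np.zeros(len(tests)), R, size=5000)
composite = scores.mean(axis=1)
iq = 100 + 15 * (composite - composite.mean()) / composite.std()
print(f"IQ composite: mean {iq.mean():.0f}, s.d. {iq.std():.0f}")
```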
That is, although individual cognitive abilities are moderately heritable, the genetic correlations between them could be anywhere from 0, indicating that a completely different set of genes influences each ability, to 1, indicating that the same genes influence a variety of cognitive abilities. Multivariate genetic analyses have consistently found that genetic correlations among specific cognitive abilities are very high — close to 1 (ref. 20).

Implications for cognition theory

These results have major implications for current theories of how the brain works. According to one theory, the brain works in a modular fashion — the various cognitive processes are specific and independent of each other. Implicit in this perspective is a bottom-up reductionistic view of genetics in which individual modules such as spatial visualization are the targets of gene action. However, the findings from multivariate genetic analyses suggest a top-down view, in which genetic effects operate primarily on g. Given that the brain has evolved to learn from a variety of experiences and to solve a variety of problems, perhaps it makes sense that it would function holistically.

Nevertheless, finding genetic correlations near 1.0 does not prove that genetic effects are limited to a single general cognitive process. Another alternative is that specific cognitive abilities might tap many of the same modular processes that are each affected by different sets of genes. This hypothesis could be tested by multivariate genetic research on measures of modular processes, for example determining which areas of the brain are active in response to a particular task. Such procedures can now be done using neuroimaging techniques, and these tests could investigate whether genetic correlations are also close to 1 between these modular measures and general processes21. As with many unanswered questions about genetics and cognitive abilities, clearer answers will emerge when specific genes are identified. In this case, the question is whether genes associated with g are associated with a single general cognitive process, or with most modular processes, or with specific subsets of modular processes.

Finding the genes for g

The most far-reaching implications for science, and perhaps for society, will come from identifying genes responsible for the heritability of g — not rare single-gene mutations that cause mental retardation, but QTLs that contribute probabilistically to individual differences in the normal variation in g.

Such loci are 'polymorphic'; that is, there are at least two, and often many more, variant forms of the gene in the population. These variants originally arise by mutations that change the actual coding sequence of a gene, thus producing a slightly different form of the encoded protein, or that affect the regulatory parts of a gene, thus affecting when and where the gene is switched on and protein produced. Both types of variation could contribute to the heritability of g. Presumably, genes that are active in the brain ('expressed' in the brain in genetic terminology) are involved in specifying g, but with 30,000 or so genes known to be expressed in the brain this hardly helps to narrow the field. A small handful of genetic associations with behaviour have been found so far. The first definite associations of QTLs with behaviour have emerged in the area of cognitive disabilities, namely, dementia (an association with the apolipoprotein E gene) and reading disability (which has been linked to a region on the short arm of chromosome 6). Associations of g with identified polymorphic segments of DNA on the genetic map (DNA markers) have begun to be reported22. As a result of the progress made in mapping the human genome, it is now becoming feasible to carry out genome scans using association approaches involving several thousand closely spaced DNA markers. These have the power to detect and locate the kind of genes that are likely to contribute to g, that is, multiple genes of small effect. The initial results of a systematic genome scan of thousands of DNA markers reported several replicated QTL associations23. The massive effort needed to genotype thousands of DNA markers for large numbers of subjects is daunting and replication is needed to eliminate false positive results. However, optimism about this approach has been fuelled by the promise of 'SNPs on chips' — single nucleotide polymorphisms (SNPs) formatted as microarrays of oligonucleotide primers on solid substrates that can quickly genotype thousands of DNA markers for an individual. So far, such microarrays have been most useful in studies of gene expression and there remain technical difficulties to be overcome before they can be routinely used to genotype SNPs for large samples.

I have no doubt that genes associated with g will be identified, although how much of the genetic variance will be accounted for by individual genes is uncertain. This is because the magnitude of the effects of genes in multiple gene systems is not yet known for g or for any other trait or disorder controlled by a number of different genes (also known as complex traits and polygenic disorders).
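In its simplest form, the association approach just described amounts to regressing the trait on the allele count (0, 1 or 2) at each marker and correcting for the number of tests. The toy scan below invents everything, from marker count to allele frequencies to effect sizes, but shows why genes explaining well under 1 per cent of the variance demand large samples and replication.

```python
# Toy genome scan by association: regress a quantitative trait on allele
# count at each marker, with a crude Bonferroni threshold. All parameters
# are invented for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_people, n_markers = 2000, 1000
freq = 0.3                                  # allele frequency at every marker
genotypes = rng.binomial(2, freq, size=(n_people, n_markers))

true_qtls = [10, 500, 900]                  # markers with tiny real effects
beta = 0.15                                 # each explains ~1% of trait variance
trait = rng.normal(size=n_people)
for m in true_qtls:
    trait += beta * (genotypes[:, m] - 2 * freq)

alpha = 0.05 / n_markers                    # Bonferroni-corrected threshold
hits = [(m, p) for m in range(n_markers)
        if (p := stats.linregress(genotypes[:, m], trait).pvalue) < alpha]
print("markers passing threshold:", hits)
# With effects this small, true QTLs are sometimes missed and false positives
# occasionally pass; hence the emphasis above on large samples, replication
# and cheap genotyping ('SNPs on chips').
```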
It is likely that the average size of effect of each individual gene is small for complex traits — perhaps individual genes on average will account for less than 1 per cent of the variance, with few effect sizes greater than 5 per cent, and a long tail of small effects extending to genes of such small effect that they may never be detected. If genes interact with each other they will be more difficult to identify because interactive combinations of genes would need to be found rather than individual genes. Fortunately, genetic effects on g seem to be largely additive. Despite the formidable challenges of trying to find genes of small effect, I predict that most of the heritability of g will be accounted for eventually by specific genes, even if hundreds of genes are needed to do it.
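Because the effects are expected to be largely additive, the arithmetic of 'hundreds of genes of small effect' is straightforward: the variance shares of individual QTLs simply sum. The effect-size distribution below is invented to match the orders of magnitude quoted above.

```python
# If effects are additive, variance explained by individual QTLs simply sums.
# The long-tailed distribution of effect sizes below is invented, but matches
# the orders of magnitude in the text (most genes well under 1%).
effects = [0.005] * 40 + [0.002] * 100 + [0.0005] * 200  # variance shares per gene

print(f"{len(effects)} genes together explain {sum(effects):.0%} of the variance")
# 40 x 0.5% + 100 x 0.2% + 200 x 0.05% = 20% + 20% + 10% = 50%

threshold = 0.002                    # say, the smallest effect a study can detect
findable = [e for e in effects if e >= threshold]
print(f"genes above the detection threshold: {len(findable)}, "
      f"accounting for {sum(findable):.0%}")
# The undetectably small remainder is why some heritability may never be
# pinned to named genes, even if 'most' of it eventually is.
```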
So what are the scientific and social implications of finding genes that influence g?

Implications for science

One of the first tasks is to localize the specific DNA differences responsible for associations between DNA markers and g. Finding the locations of the genes involved will be greatly aided by several developments. The first will be the completion of the entire DNA sequence of the human genome, which is expected from the Human Genome Project during the next two years; the second is the recent intensive effort to identify hundreds of thousands of DNA variations among individuals, which will make it possible to pinpoint functional variants; and the third is the effort underway to map patterns of gene expression, which will indicate which genes are expressed in any given brain region.

The ultimate scientific goal is not just finding the genes but understanding how they function, and this is an even more challenging task. Functional genomics, as this aspect of genetics is called, is usually discussed in terms of molecular biological analyses of cell function. For example, studies of coordinated spatial and temporal patterns of gene expression using DNA chips containing detectors for thousands of genes — like functional imaging at a cellular level — will make major contributions to the study of gene function. However, other levels of analysis are also important in understanding how genes work. A behavioural level of analysis will also contribute to functional genomics, for example, by means of psychological theories of cognitive processing6 and by investigating interactions and correlations between individuals and their environment24. For instance, psychological theories suggest how different components of information processing are related and the role of genes in these cognitive systems can be examined. Such top-down strategies can yield just as important information as a bottom-up molecular approach in which the products of these genes are studied at a cellular level of analysis. As an antidote to the tendency to define functional genomics at the cellular level of analysis, the phrase 'behavioural genomics' has been proposed25 (see Fig. 2). Indeed, behavioural genomics may pay off more quickly than other levels of analysis in terms of prediction, diagnosis, therapy and intervention in relation to behavioural disorders and normal behavioural variation.

The brain is clearly where bottom-up molecular levels of analysis will eventually meet top-down behavioural analysis. Studies of brain functioning, as assessed by neuroimaging for example, will foster this integration. For humans, the expense of neuroimaging restricts sample sizes and, for this reason, differences between individuals are rarely considered. Mouse models will be valuable, especially given current large-scale behavioural screens of mice treated with mutagens. Finding genes associated with g and other cognitive abilities and disabilities in humans will provide discrete windows through which brain pathways leading from genes to complex cognitive processes of learning and memory can be observed using animal models. Although learning and memory are the focus of much animal research in cognitive neuroscience, commonalities across cognitive processes that are indicative of g have not yet been explored using animal models. The main focus at present is the study of synaptic connections between brain neurons and how they may be altered as an animal (or a person) learns or lays down a memory. The multivariate genetic results mentioned earlier lead to the hypothesis that, even at the cellular level, for example in connection with synaptic alterability, most genes will have broad effects on cognitive functioning rather than isolated effects on individual modules. That is, the same genes will affect many different brain structures and cellular processes. Although research on synaptic mechanisms of learning and memory is leading the way in genetic research in cognitive neuroscience26, investigators have used gene knock-outs, in which a gene is altered so that it is no longer expressed, rather than studying naturally occurring genetic variation that might underlie individual differences in learning, memory and cognitive abilities. It is possible, but not inevitable, that a gene identified as being involved in learning and memory in gene knock-out experiments may also turn out to contribute to individual differences in g when naturally occurring differences in the gene are identified.

The scientific impact of finding genes associated with g will not be limited to cognitive neuroscience — it will affect all aspects of behavioural research. Perhaps some day behavioural and social scientists will routinely collect DNA using cheek swabs (where no blood is needed) in order to investigate, or at least control for, genes associated with g, as is happening now in research on dementia and its only known risk factor, the gene for apolipoprotein E. Even if hundreds of genes contribute to the heritability of g, finding genes associated with g will make it possible to investigate long-standing scientific issues with much greater precision. For example, in relation to the finding that heritability of g increases during development, are there additional genes associated with g later in life, or do the same genes have greater effects as time progresses? What are the mechanisms by which gene–environment interactions and correlations emerge? Will the same genes affect different cognitive abilities and modular measures of brain function?

In terms of treatment-related research, finding genes associated with g is likely to lead to gene-based diagnoses and treatment programmes for mild mental retardation, and clarification of its overlap with learning disabilities. Gene-based classification of disorders may bear little resemblance to our current symptom-based diagnostic systems. Indeed, from a QTL perspective, common disorders may be the quantitative extreme of normal genetic variation rather than qualitatively different. The most exciting prospect is for secondary prevention — if DNA analysis can be used to predict genetic risk for an individual, this might offer the hope of intervention before disorders create cascades of complications. The exemplar is phenylketonuria (PKU), a disorder that is due to a defect in a single gene. This leads to mental retardation unless it is detected early in life and dietary intervention is used to ameliorate its effects on the developing brain.

Perhaps the greatest implication for science is that the functional genomics of g and other complex traits will serve as an integrating force across diverse disciplines, with DNA as the common denominator, opening up new scientific horizons for understanding behaviour.
Implications for society

At the outset, it should be emphasized that no policies necessarily follow from finding genes associated with g, because policy involves values. For example, finding genes associated with g does not mean that we ought to put all of our resources into educating the brightest children. Depending on our values, we might worry more about children falling off the low end of the normal distribution in an increasingly technological society, and decide to devote more public resources to those in danger of being left behind. For example, all citizens need to be computer literate so that they will not be left on the shore while everyone else is surfing the World-Wide Web. There is much room for values here because these issues involve a complex balancing act among the rights and responsibilities of society, parents and children.
The only thing that seems completely clear is that nothing will be gained by ignoring the issue and pretending that g does not have a significant genetic component. As is always the case, advances in science create new challenges; we should be alert to such possibilities so that we can position ourselves to maximize the gains and minimize the pains of new discoveries. Recommended reading is an analysis of the ethics of the genetics of g written by an ethicist and a molecular geneticist27.

Will a DNA chip for g make the 1997 science fiction film GATTACA, in which individuals are selected for education and employment on the basis of their DNA, come true? After childhood, no DNA chip can predict g as well as an IQ test given to that individual, because IQ tests measure the consequences of environmental influences as well as genetic ones. I think it is more likely that educators and employers interested in g will continue to use IQ tests and achievement tests to select individuals on the basis of what they can do, rather than using a DNA chip to estimate what they could have done. However, such a DNA chip might be used in education to consider how far children are fulfilling their genetic potential or to prescribe different training programmes.

DNA chips for g might be used for prenatal testing, for example, in the selection of embryos for in vitro fertilization. But this seems unlikely, because few embryos are available to choose from and there are many important genetic diseases to screen out. What about parents who want to use DNA chips for g in order to select egg or sperm donors, because such a chip might provide better estimates of genetic potential than phenotypic tests of the donors? Is it possible that there are parents who would use DNA chips for g prenatally for eugenic purposes? Will DNA chips for g be used for postnatal screening to enable interventions that avoid risks or enhance strengths? For decades we have screened newborns for PKU because a relatively simple dietary intervention exists that prevents its damage to the developing brain. If similarly low-tech and inexpensive interventions such as dietary changes could make a difference for some g genotypes, parents might want to take advantage of them even if the QTL accounts for only a small amount of variance. Expensive high-tech genetic engineering in regard to behavioural traits is unlikely to happen for a long time — it is proving very difficult even for the single-gene disorders, and it will be many orders of magnitude more difficult and less effective for complex traits influenced by many genes.

A more general concern involves group differences, such as average differences between classes and ethnic groups. As genes are found that are associated with differences among individuals within groups, the genes will inevitably be used to make comparisons between groups. Although such comparisons will not be straightforward, because the causes of individual differences within groups are not necessarily related to the causes of average differences between groups, the societal implications of such research need to be anticipated. To keep this issue in perspective, it should be emphasized that average differences in g between groups are small compared with the range of individual differences within groups. Perhaps people will become less preoccupied with average differences between groups when DNA chips make it possible to focus on individuals.

Another general concern is that knowledge about the importance of genetics might change attitudes — for example, attitudes of parents about the malleability of their children's cognitive ability. If there are parents who do not recognize genetic limits to their children's ability, it might actually be useful for them to have a more realistic view, so that their children's failures are not interpreted as simple motivational failures. Do parents matter? They do indeed. And not just because of their genes. Although the genetic research discussed earlier indicates that parents do not mould their children environmentally to be similar to them in terms of g, genetic research on g is mute about much that parents offer their children as teachers and models independent of the parents' g. Moreover, as Judith Harris concludes in her book, The Nurture Assumption: "We may not hold their tomorrows in our hands but we surely hold their todays, and we have the power to make their todays very miserable."16

The most general fear is that finding genes associated with g will undermine support for social programmes because it will legitimate social inequality as 'natural'. The unwelcome truth is that equal opportunity will not produce equality of outcome, because people differ in g in part for genetic reasons. When the US founding fathers declared that all men are created equal they did not mean that all people are identical, but rather that they should be equal before the law. Democracy is needed to ensure that all people are treated equally despite their differences. On the other hand, finding heritability or even specific genes associated with g does not imply that g is immutable. Indeed, genetic research provides the best available evidence that non-genetic factors are important in the development of individual differences in g. PKU provides an example that even a single gene that causes mental retardation can be ameliorated environmentally.

'There is no gene for the human spirit' is the subtitle of the film GATTACA. It embodies the fear lurking in the shadows that finding genes associated with g will limit our freedom and our free will. In large part such fears involve misunderstandings about how genes affect complex traits like g28. Finding genes associated with g will not open a door to Huxley's brave new world where babies are engineered to be alphas, betas and gammas. The balance of risks and benefits to society of DNA chips for g is not clear — each of the problems identified in this section could also be viewed as a potential benefit, depending on one's values. What is clear is that basic science has much to gain from functional genomic studies of brain functions related to learning and memory. We need to be cautious and to think about societal implications and ethical issues. But there is also much to celebrate here in terms of the increased potential for understanding our species' unparalleled ability to think and learn.

Robert Plomin is at the Institute of Psychiatry, King's College London, DeCrespigny Park, Denmark Hill, London SE5 8AF, UK. e-mail: r.plomin@iop.kcl.ac.uk

1. Plomin, R., Owen, M. J. & McGuffin, P. Science 264, 1733–1739 (1994).
2. Gottfredson, L. S. Intelligence 24, 13–23 (1997).
3. Gould, S. J. The Mismeasure of Man (W. W. Norton, New York, 1996).
4. Carroll, J. B. Intelligence 21, 121–134 (1995).
5. Jensen, A. R. The g Factor: The Science of Mental Ability (Praeger, Westport, 1998).
6. Mackintosh, N. J. IQ and Human Intelligence (Oxford University Press, Oxford, 1998).
7. Herrnstein, R. J. & Murray, C. The Bell Curve: Intelligence and Class Structure in American Life (The Free Press, New York, 1994).
8. Brody, N. Intelligence (Academic Press, New York, 1992).
9. Snyderman, M. & Rothman, S. The IQ Controversy, the Media and Public Policy (Transaction, New Brunswick, NJ, 1988).
10. Bouchard, T. J. Jr & McGue, M. Science 212, 1055–1059 (1981).
11. Plomin, R., DeFries, J. C., McClearn, G. E. & Rutter, M. Behavioral Genetics (W. H. Freeman, New York, 1997).
12. McGue, M., Bouchard, T. J., Iacono, W. G. & Lykken, D. T. in Nature, Nurture, and Psychology (eds Plomin, R. & McClearn, G. E.) 59–76 (American Psychological Association, Washington, DC, 1993).
13. McClearn, G. E. et al. Science 276, 1560–1563 (1997).
14. Flynn, J. Am. Psychol. 54, 5–20 (1999).
15. Duyme, M., Dumaret, A.-C. & Tomkiewicz, S. Proc. Natl Acad. Sci. USA 96, 8790–8794 (1999).
16. Harris, J. R. The Nurture Assumption: Why Children Turn Out the Way They Do (The Free Press, New York, 1998).
17. Rowe, D. C. & van den Oord, E. J. C. G. Child Dev. (in the press).
18. Plomin, R. Genetics and Experience: The Interplay Between Nature and Nurture (Sage Publications, Thousand Oaks, CA, 1994).
19. Plomin, R. & DeFries, J. C. Sci. Am. May, 62–69 (1998).
20. Petrill, S. A. Curr. Directions Psychol. Sci. 6, 96–99 (1997).
21. Kosslyn, S. & Plomin, R. in Psychiatric Neuroimaging Strategies: Research and Clinical Applications (eds Dougherty, D., Rauch, S. L. & Rosenbaum, J. F.) (American Psychiatric Press, Washington, DC, in the press).
22. Chorney, M. J. et al. Psych. Sci. 9, 1–8 (1998).
23. Fisher, P. J. et al. Hum. Mol. Genet. 8, 915–922 (1999).
24. Plomin, R. & Rutter, M. Child Dev. 69, 1221–1240 (1998).
25. Plomin, R. & Crabbe, J. C. Psychol. Bull. (in the press).
26. Migaud, M. et al. Nature 396, 433–439 (1998).
27. Newson, A. & Williamson, R. Bioethics 13, 327–342 (1999).
28. Rutter, M. & Plomin, R. Br. J. Psychiat. 171, 209–219 (1997).

Acknowledgements. Preparation of this paper and the QTL research on g is supported by a grant from the US National Institute of Child Health and Human Development. The paper profited from review of an earlier version by R. Arden, S. Baron-Cohen, I. Craig, P. Dale, I. J. Deary, J. C. DeFries, L. S. Gottfredson, F. Happé, J. Rich Harris, C. Hughes, D. Lubinski, P. McGuffin, A. Newson, T. G. O'Connor, M. J. Owen, S. A. Petrill, K. J. Saudino, L. A. Thompson and I. D. Waldman.
foreword

Tales of the expected


Philip Campbell

Forecasting the future in science is fun but often hopelessly misleading. This
publication, commissioned by all the Nature journals, focuses on future
developments about which we can be reasonably confident and which will
have an impact on the lives of all of us.

Illustration by Oliver Burston.

As the immunologist Peter Medawar once wrote, wise people may develop expectations about the future, but only the foolish make predictions. X-rays, radioactivity and high-temperature superconductors are just three ways in which physicists have taken themselves and other scientists by delighted surprise. Medawar cited the uncovering of genetic polymorphisms within the human leukocyte antigen system as an unanticipatable simultaneous solution to several biomedical conundrums. The unexpected detection by James Lovelock in the 1960s of chlorofluorocarbons in the marine atmosphere stimulated a transformation in the understanding of the stratospheric ozone layer. And so on.
The unpredictability of experimental progress was neatly summarized in 1926 by J. B. S. Haldane: "In the case of both thunderstorms and fever the clue came from measuring the lengths of mercury columns in glass tubes, but what prophet could have predicted this?" Nevertheless, there has always been the uneasy feeling that governments and legislatures, focused ever more on obtaining benefits from science, expect too much of researchers and of research as a whole in the way of premeditated discovery. That is presumably why learned societies find it necessary to highlight in public what is well known to the cognoscenti: the unanticipatable conjunctions of unrelated developments that have led, time and again, to high peaks of scientific and technological progress. The US National Academy of Sciences has gone so far as to generate a series of case studies in its 'Beyond Discovery' education programme (see www4.nas.edu).

Then again, some of the great achievements of science have flowed from predictions. (Or were they expectations only classed as predictions with hindsight?) When an original way of thinking about a problem leads to a mathematical equation with solutions that represent what have hitherto been mysterious phenomena, that may be a triumph. But excitement runs even higher when the same equation yields solutions that match nothing so far seen. Are they mathematical oddities? The negative-energy solutions in Dirac's equation, which gives a quantum mechanical description of an elementary particle, led him to predict particles of equal mass and opposite charge to the electron. These were subsequently detected when positrons, the first species of anti-matter, were discovered in cosmic rays. Wolfgang Pauli's physical reasoning led him to conclude with confidence that there must be a very weakly interacting and (nearly?) massless particle involved in nuclear decay — subsequently identified in the form of the neutrino. Currently the world awaits the detection of the predicted Higgs boson and — less certainly — 'supersymmetric' partners of all known fundamental particles. (Failure to detect a Higgs particle — a keystone of the currently unshakeable standard model of high-energy physics — at CERN's Large Hadron Collider early next decade is highly improbable but, paradoxically, could be an unpredicted boost for particle physics.)

Scarcity of wise prediction

Such unambiguous predictions, far from foolish, come from within a tightly argued, self-contained theoretical framework based on critical assumptions or hypotheses that may prove false. They stand out among the peaks of science, alongside the unexpected discoveries, and those that were anticipated in principle but still revelatory in actuality, such as the structure of DNA. And not only in physics: witness the analysis that followed Lovelock's work that led atmospheric chemists to predict damage to the ozone layer. It is a shame that, the world being as messy as it is, such clear-cut predictions — representing scientists at their best, out on an intellectual limb — are so rare.

The more common and foolish predictions about science are usually much more casual and broad ranging, often based on the blinkered preconceptions of even great scientists — see, for example, Rutherford's notorious dismissal of the possibility of nuclear energy. Looking back at predictions 100 years ago, described by John Heilbron and Bill Bynum on page C86, is both entertaining and sobering. Nature, hopefully wisely, is straying into predictions only by means of fictions, appearing in our weekly 'Futures' essays. In contrast, this supplement, Impacts of foreseeable science, commissioned with the help of the staff of Nature and of the six Nature monthly journals, is about expectations of what can and might be achieved.

The articles published here are not intended to represent some collective vision, they have not been peer reviewed, nor are they intended to provide balanced surveys. On the contrary: they have been commissioned as representing the scientific hopes and expectations of these particular authors, drawn from among many scientists and professional colleagues who combine a broad overview with an individual outlook. For extra fun, we have interspersed the longer articles with short items about anticipated research relating to a variety of timescales at which natural phenomena are known to occur, from 10⁻⁴³ to (at least) 10¹⁷ seconds.
There is a strong bias in our choice of topics. In the interests of our readership across all disciplines, we have focused on some of the fundamental science that will directly or indirectly touch everybody's lives.

Everyday stuff

In that context, the physics of everyday matter merits a place because, despite such matter's familiarity, fundamental aspects of its behaviour, especially as described by statistical physics, still provide researchers with some of their most difficult intellectual challenges. Condensed matter physics has already led to new electronic technologies that have transformed our lives, not least in developments in information and materials technologies. These are themselves transforming the ways scientists of all disciplines will accomplish computations, as described by Declan Butler on page C67. Who knows what technologies might materialize following a deeper understanding of common and not-so-common transitions between the various phases of matter? Of all the developments discussed in this publication, those are perhaps the most difficult to anticipate in any detail. Philip Ball's article on page C73 thus concentrates on the essential agenda of the subject.

Rather more specific is the expectation that more planets orbiting stars other than the Sun will be discovered. The image of the rising Earth taken by astronauts orbiting the Moon had an impact on many people's views of themselves as members of one population living in perishable but also cherishable conditions. Think, then, of the impact of the image of an Earth-like planet, light-years away, being delivered from a space-based network of interferometers. As Jack Lissauer describes (page C11), such precision is still decades away, but there are likely in the meantime to be other tantalizing planets aplenty.

In highlighting our fragility, the Apollo image was salutary. But we do not understand just how fragile humanity might be, nor what we might do to reduce that fragility. Success at modelling the regional impacts of climate change, for example, still eludes us, let alone societal impacts. We are only now at the stage of developing integrated models incorporating oceans and atmospheres, with necessarily crude representations of uncertain feedbacks involving biota and clouds. Perhaps the most insightful progress will come in a simulatory framework of the sort envisaged by John Schellnhuber on page C19. That will still leave plenty of scope for debate over the application of the precautionary principle as a recipe for societal stagnation or for sensible planetary management.

Biological revolutions

As is now well understood, biologists are at the threshold of an immense collective leap in knowledge, the impact of which we try to anticipate in some depth. The genes that make up representative genomes across the various kingdoms of life are being identified. The technology is available for mass analyses of gene function and patterns of expression, while the ability to catalogue anybody's genome in a day is now anticipatable, though it still lies many years away. Within years we will also have built up libraries of protein structure and function. But the geneticist Eric Lander has characterized molecular and cell biologists as still only approaching a phase of development of their discipline reached by chemists 100 years ago, when the periodic table was first described.

In this supplement we have taken many of those very challenging advances for granted. What will follow? Developmental and evolutionary biologists and palaeontologists will have a wonderful time, as anticipated by Peter Holland on page C41.

But where next for the committed reductionists? Even if one knows all of those elements of an organism, and their individual functions, what could we say about the nature and behaviour of the organism? The answer is: not much, even if the organism consists of a single cell, let alone the 10¹³ cells and 200 or so cell types that make up one of Homo sapiens sapiens. This level of analysis takes us to the next step up the ladder whose bottom rung is molecular biology. (The top rung is the science of xenobiology, involving the study of alien life forms. So far we know only one form of life, based on DNA. At this point I will be deliberately foolish and predict the discovery of at least one other form of natural life — not necessarily intelligent — by the end of the next century. But that is a digression.)

Biologists and others have begun to think about systems and networks as such — for example, of neurons and of signalling molecules — but we, or at least Leland Hartwell et al., on page C47, can anticipate such systems analysis of functional, albeit labile, collections of interlinked molecules — or 'modules' — becoming a new norm of biological investigation, crucially aided by the other traditional disciplines and also by computational simulation.

But even ahead of such analysis and understanding, biologists will discover gross characteristics that can be indirectly linked to our molecular constitution. None is likely to be more sensitive, in a societal context, than analysis of cognitive ability and of decision making. Robert Plomin (page C25) and James Nichols and William Newsome (page C35) have their respective views of the power of the sciences to help us understand some of the innate aspects of ability and behaviour.

These and other fruits of molecular and organismal biology are readily anticipatable, as are some of the controversies that will ensue. But such applications of fundamental biology can most favourably be seen as part of the wider picture of future improvement in human health. And, as Barry Bloom describes on page C63, such advances are themselves part of a much bigger picture of development and of debate relating to public health, its priorities and economics.

Public ambivalence towards such advances, and more recently towards undoubtedly positive developments in nutrition (see Gordon Conway and Gary Toenniessen, page C55), is nothing new. As the journalist and sociologist of science Jon Turney has described in his book Frankenstein's Footsteps: Science, Genetics, and Popular Culture, the framework for social discussion of contentious science — newspaper coverage, high-profile scientists and hostile representatives of various publics — was all in place at least in the United States and Britain in the 1900s, in time for debates on the chemical basis of life. Aldous Huxley, Julian Huxley and J. B. S. Haldane stimulated such debate in the 1930s as they anticipated many of the developments that have occurred since — and, in Brave New World, rather more. The world's first test-tube baby, Louise Brown, stimulated worldwide agitation, and no-one reading this will need to be reminded of Dolly the sheep — a mammal whose unprecedented origin led to new levels of public debate, at all levels of informedness.

Science's contract

In the face of such controversies, there is a lot to be said for an enhanced degree of interaction, genuinely two-way, between science and society. Experience suggests that lack of scientific knowledge by the various publics and stakeholders does not hamper fruitful discussion anything like as much as the reluctance of scientists and technologists to become involved in the sometimes heated, often chaotic and occasionally even physically threatening debates.

Michael Gibbons is someone who has long thought about what is increasingly becoming a commonplace idea: a new contract between science and society (see page C81). His agenda is not idealistic, and is wholly practicable, although his definition of socially robust knowledge may raise some eyebrows. Nevertheless, various types of public participation have been undertaken, for example in the United States, Scandinavia, Britain and France. It is surely time to draw together these experiences and for governments to build on them, if they are to have a hope of bridging the gap between the public's imaginations and the ever swifter and, for some, dismayingly unpredictable development of science.

Philip Campbell is the Editor of Nature and Editor in Chief of Nature publications. For a guide to Nature and the monthly journals, and the relationship between them, see http://www.nature.com/author/natureguide.html.
impacts

The neurobiology of cognition


M. James Nichols and William T. Newsome

Perhaps the deepest mysteries facing the natural sciences concern the
higher functions of the central nervous system. Understanding how the brain
gives rise to mental experiences looms as one of the central challenges for
science in the new millennium.

complexity the brain faces in perceiving the

I
t is astounding that cognition and
emotion — phenomena that cannot be scene, deciding on a course of action, and
duplicated in our most sophisticated com­ then executing it.
puters — arise naturally from the electrical The brain has no direct access to the rich
activity of large systems of neurons within array of objects and surfaces in the three­
the brain. Scientific investigation of these dimensional visual environment; it must
phenomena is inherently interdisciplinary, reconstruct the scene from complex, two­
drawing strength from fields as diverse as dimensional images falling on the two
neurophysiology, cognitive psychology and retinae. Consider just three of the many
computational theory. Exciting new findings problems the visual system must solve in
have emerged in recent decades concerning scene reconstruction. First, although the
the neural underpinnings of cognitive func­ retinal images are flat, accurate depth
tions such as perception, learning, memory, perception is critical for assessing the shapes
attention, decision-making, language and and sizes of the apples, and for reaching
motor planning, as well as the influence of accurately to grab the selected apple. The
emotion and motivation upon cognition. brain reconstructs the third dimension from
With very few exceptions, however, our multiple cues in the retinal image. For exam­
understanding of these phenomena remains ple, the brain can infer depth based on the
rudimentary. We can identify particular slight disparity between the images an object
locations in the brain where neuronal activity is modulated in concert with particular external or internal stimuli. In some cases we can even artificially manipulate neural activity in a specific brain structure (using electrical or pharmacological techniques) and cause predictable changes in behaviour. But we encounter substantial difficulties in understanding how modulations in neural activity at one point in the nervous system are actually produced by synaptic interactions between neural systems. Thus our current state of knowledge is somewhat akin to looking out the window of an airplane at night. We can see patches of light from cities and towns scattered across the landscape, we know that roads, railways and telephone wires connect those cities, but we gain little sense of the social, political and economic interactions within and between cities that define a functioning society.

To achieve a more sophisticated level of understanding, investigators must develop new experimental techniques for studying functional interactions between neurons and systems of neurons, and new models for understanding the behaviour of complex, dynamic systems like the brain. Whether major breakthroughs occur on the timescale of years or decades depends substantially on success in developing these new techniques.

Irrespective of timescale, an increasingly sophisticated understanding of the neural basis of cognition will influence our society profoundly. It will have practical applications such as treatment of mental disease and the design of intelligent machines, and will raise contentious social issues such as the freedom of each individual to choose their behaviour, and the extent to which society can reasonably demand individual responsibility for behaviour.

In what follows, we address three questions. What sorts of cognitive phenomena do neuroscientists seek to explain in terms of brain function? What form do our explanations take, and what technical advances are round the corner to help us? And what are the personal and social implications of understanding the neural underpinnings of consciousness and mental functions?

Figure 1 Visual scene of apples on a picnic table. Photograph by David G. Muir.

Visual perception and cognition
The following example of visually based cognition illustrates some of the mental phenomena that neuroscientists seek to understand. Imagine a girl standing before a picnic table (Fig. 1), scrutinizing a jumble of apples in search of the biggest and best granny smith, her favourite kind. She looks, finds one, reaches out and grabs it. The action seems straightforward enough, but this simple description belies the enormous complexity of the underlying neural computations. First, to locate the apples in space, her visual system must compute depth from the slightly different images that the scene casts on the two retinae. Second, in order to pick out the largest apple, the girl must accurately assess the size of each apple. Yet the size of an apple's image on her retina depends on its distance from her. Her visual system takes distance into account and automatically estimates the true physical sizes of objects at different distances, a perceptual phenomenon known as size constancy. Third, the apple scene contains a kaleidoscope of contours: texture elements, colour boundaries, shadows, reflections, specks of dirt, and so on. Only a fraction of these delineate the true boundaries of objects, however, and the brain must sort these out (contour extraction) in order to identify objects accurately.

Once the girl's visual system reconstructs an accurate representation of the scene, a higher-level decision process must evaluate the perceptual information and select one apple in particular. More specifically, her brain must categorize the apples (granny smith, red delicious and so on), assign appropriate affective associations (likes and dislikes) to them, deploy spatial attention to salient objects in the scene, and discriminate fine differences in colour, size and shape to select the best of the granny smiths. Thus, the girl's decision is shaped by immediate sensory information, by previously learned categories drawn from visual memory, and by likes and dislikes based on accumulated experience. Other elements in the scene that are totally irrelevant to the decision must be ignored. Finally, once the brain reconstructs the scene and makes a decision, voluntary movement systems must plan and execute the appropriate behavioural response (reaching for the selected apple).
Thus, choosing an apple from an apple cart, like innumerable actions we carry out every day, involves a surprising array of cognitive functions. Scientists would like to understand the neural mechanisms that underlie those functions.

Levels of understanding
Understanding the neural basis of a specific cognitive function typically begins with behavioural observations and hypotheses developed by perceptual and cognitive psychologists. Equipped with sound conceptual frameworks originating in behaviour, neurophysiologists can then study underlying brain function at several levels. We will describe three of the most important levels: localization, representation and microcircuitry.

Figure 2 Localization of mental functions. a, According to nineteenth-century phrenologists, lumps on the skull revealed the locations in the brain responsible for mental functions like 'agreeableness' in the frontal lobe and 'cautiousness' in the parietal lobe. b, PET and fMRI studies, which measure changes in cerebral blood flow while various perceptual and cognitive tasks are performed, have revealed a different organization. (Functions mapped in b include visual spatial attention, voluntary eye movements, motor preparation and execution, face recognition, speech production, spoken-language comprehension, semantic priming of visual words, spatial and object working memory, analytic reasoning, mathematical approximation and calculation, and the perception of motion, colour, olfaction, pleasant touch and pain.) Imaging data assembled by C. J. Doane.

At the coarsest level, the primary issue is 'localization' of function: identifying, for example, neural systems in the brain that are strongly active in response to visual images or when spatial attention is deployed to different regions of a visual scene. As illustrated in Fig. 2a, localization of function has been a dominant theme in brain science, beginning in the nineteenth century when 'phrenologists' attempted to map mental functions onto the brain by correlating aspects of personality and mental ability with the sizes of bumps at different locations on the skull. More reliable evidence for localization of function emerged in the early part of the twentieth century as neurologists learned to recognize mental deficits (sometimes highly specific) that occurred subsequent to damage in particular regions of the brain.

The most commonly used techniques in modern studies of localization are positron emission tomography (PET) and functional magnetic resonance imaging (fMRI), which generate most of the glossy figures in popular science magazines that depict mental activity as coloured blobs on a picture of the brain. Both PET and fMRI measure changes in blood flow to specific regions of the brain while human subjects perform various cognitive tasks1. The blood flow signal is assumed to reflect changes in metabolic demand resulting from altered levels of neural activity. Using PET and fMRI, investigators can study brain activity in humans at the spatial scale of individual brain structures (a few millimetres) and on a timescale of a few seconds.
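How such blood-flow measurements become the familiar coloured blobs can be shown with a toy computation. The sketch below is my illustration, not any particular PET or fMRI pipeline: it fabricates noisy signals for a small grid of voxels, adds a response in one voxel during alternating task blocks, and maps activation as the task-minus-rest difference.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic signal for a 4 x 4 grid of voxels over 100 time points,
# alternating 10-point rest and task blocks (all numbers invented).
n_t = 100
task = (np.arange(n_t) // 10) % 2 == 1            # True during task blocks
signal = rng.normal(0.0, 1.0, size=(n_t, 4, 4))   # baseline noise everywhere
signal[task, 1, 2] += 2.0                         # voxel (1, 2) responds to the task

# 'Activation map': mean signal during task minus mean signal during rest.
contrast = signal[task].mean(axis=0) - signal[~task].mean(axis=0)
peak = np.unravel_index(contrast.argmax(), contrast.shape)
print(tuple(int(i) for i in peak))                # -> (1, 2), the responsive voxel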
The recent avalanche of PET and fMRI papers has produced many insights concerning localization of mental functions (Fig. 2b)2–4. Although they are substantially more informative than previous findings based on brain damage, distillations of data like the one in Fig. 2b are disconcertingly similar to the one-function-one-brain-area maps of the phrenologists. Will we 'understand' the brain when the map in Fig. 2b is completely filled with blobs? Obviously not; localization data provide little insight into the exact nature of the signals encoded in a given structure, the computations being performed and the interactions between different structures.

Greater insight into how the brain processes and represents information can be obtained by finer-grained studies of individual brain structures using microelectrodes. With these techniques, investigators can measure directly the electrical activity of individual neurons (Fig. 3a), or of small, functionally related clusters of neurons called 'cortical columns'. For example, Fig. 3b schematizes the electrical discharges (black vertical tick marks) of a cortical neuron that responds vigorously to a bar of light sweeping downward across a viewing screen, but not to upward motion of the bar. This neuron, therefore, is 'selective' for downward visual motion. Other neurons are selective for other directions of motion, specific colours, the orientation of line segments, the density of visual texture and many other visual features. Their selectivity allows them to signal or represent the presence or absence of particular features in the visual environment. This 'representational' level of analysis has provided some of the major success stories of systems neuroscience5,6, including descriptions of early stages of visual processing relevant to the problems of contour extraction and depth perception described in the preceding section.

Investigation at the representational level is particularly powerful when carried out in alert animals trained to perform interesting cognitive tasks. For example, researchers have discovered that certain neurons in the temporal and frontal lobes of the brain are active during short-term memories of specific objects or places7–9, whereas other neurons in the parietal lobe respond dramatically when the animal deploys attention to one or another region of a visual scene10–12. These functional properties can change strikingly from one cortical column to the next within a given brain area (a spatial scale of several tens of micrometres), revealing an intricate level of organization invisible to fMRI and PET.

These discoveries provide researchers with a deeper 'point of entry' for analysis of the neural systems underlying a particular cognitive function. Transformations in the information carried by single neurons can be inferred by recording from successive brain areas in a particular pathway while the animal performs a task. On the basis of such measurements, investigators can develop quantitative models of how these transformations take place. Furthermore, investigators can test hypotheses about the function of a particular pathway by applying electrical or pharmacological techniques to activate or inactivate discrete clusters of neurons. In our laboratory, for example, we have found that electrically activating clusters of direction-selective neurons through a microelectrode (Fig. 3a) can induce rhesus monkeys to report seeing motion in the direction encoded by the activated neurons13. Results of this sort, perhaps more decisively than any others, establish a causal link between the activity of particular classes of neurons and specific mental phenomena.
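The 'selectivity' just described is usually quantified before any deeper analysis begins. The following sketch uses invented spike counts, not data from the article, to compute one common direction-selectivity index, (preferred − null)/(preferred + null), from trials of the kind shown as tick marks in Fig. 3b.

# Trial-by-trial spike counts for a hypothetical direction-selective neuron:
preferred = [42, 38, 45, 40, 44]   # bar moving downward (preferred direction)
null = [6, 4, 9, 5, 7]             # bar moving upward (non-preferred direction)

mean_pref = sum(preferred) / len(preferred)
mean_null = sum(null) / len(null)

# A common direction-selectivity index: 0 means no preference,
# 1 means the neuron is silent in the non-preferred direction.
dsi = (mean_pref - mean_null) / (mean_pref + mean_null)
print(f"DSI = {dsi:.2f}")          # about 0.74 for these made-up counts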
But even this level of analysis begs fundamental questions about how signals are created, encoded and transmitted by single neurons and assemblies of neurons. Individual cortical neurons (like the one in Fig. 3a) receive input signals from as many as three to ten thousand other neurons. In a typical experiment, however, the neurophysiologist can characterize the responses of only a few neurons at the tip of the electrode. With such a limited data set, it is difficult to determine exactly how the thousands of synaptic inputs to a cell are transformed to create the cell's pattern of output activity.

This third level of analysis — the detailed 'microcircuitry' and dynamic interactions that give rise to the observed activity of single neurons — is the most vexing for systems and cognitive neuroscience. The challenge stems in part from the extreme complexity of even a single cortical neuron, and in part from daunting practical problems of physical accessibility and visualization in an intact, behaving animal. Elegant intracellular recording techniques can be used to study interactions between neurons in simple preparations (for example, thin slices of tissue removed from the brain, or simple invertebrate nervous systems). The sequence of images in Fig. 3c, for example, shows synaptic events within individual compartments of a single cell in the brain of the common housefly. Each compartment receives synaptic signals from a different input neuron. This neuron is selective for downward visual motion, just like the one in Fig. 3b. In the fly brain, however, it is possible to monitor activity sweeping through the individual compartments of the cell as a bar is swept downward through the visual field. Observations of this kind will shed light on how individual neurons transform the input signals they receive from other neurons into an output signal. Even so, understanding the neural microcircuitry underlying complex cognitive phenomena like decision-making in mammals seems a distant hope, although new technical advances could substantially influence this level of analysis.

Figure 3 Illustration of the representational and microcircuit levels of analysis. a, Pyramidal neurons in monkey visual cortex, stained by the Golgi method. A tungsten microelectrode typically used for extracellular recording has been superimposed (image from ref. 5). b, Schematic of a bar stimulus (blue bar) moving either up or down and the response of a direction-selective neuron. Black tick marks represent the electrical discharges of the neuron. The neuron is direction selective, discharging more vigorously for downward motion than for upward motion. c, Tangential cell in a housefly in response to a bar moving downward in the visual field (the preferred direction for that cell). The cell was filled with calcium green, which fluoresces in response to influxes of calcium caused by synaptic inputs from other neurons. The four images represent four successive 1-second intervals as the bar sweeps through the visual field, producing electrical changes in different spatial compartments of the cell (S. Single and A. Borst, unpublished data).

A look over the horizon
The immediate impediments to progress in systems and cognitive neuroscience are more technical than conceptual. Computational theorists and psychologists have developed plausible models for many cognitive functions; our primary problem lies in acquiring and analysing the neural data needed to evaluate these models. It will be a sign of real progress if this situation reverses over the next two decades, with conceptual issues coming to the fore.

Technical innovation is needed at all levels of analysis. At the level of localization, improvements in fMRI technology could permit study at the spatial resolution of the cortical column, where so much functional specialization resides. This capability is possible in principle and should emerge gradually over the next decade. More problematic, however, is the temporal resolution of fMRI, which is limited by blood-flow dynamics to a scale of seconds. Neural processing, in contrast, occurs on a scale of milliseconds. Our best prospects for recovering temporal resolution lie in combining fMRI with techniques that measure electrical activity directly, such as evoked potential recording14 and magnetoencephalography15.

At the level of representational analysis, multi-electrode recording techniques are enabling investigators to obtain data simultaneously from larger populations of neurons, and may provide the most promising approach for analysing dynamic interactions between cortical areas.

Optical techniques are likely to become increasingly important both at the representational level of analysis and at the local circuit level. Currently, CCD cameras are capable of gathering reflected (or fluorescent) light from the brain surface, thereby enabling study of brain function at the level of individual cortical columns16. Study of single neuron dynamics will increasingly rely on multi-photon microscopy, which can isolate signals from individual cells in vivo hundreds of microns below the cortical surface17. Focal stimulation of individual neurons will increasingly involve optical release of caged neurotransmitters18. The rapid development of new optical probes, caged neuroactive compounds, and light sources and detectors will extend current resolution limits and may permit the application of optical methods to alert animals.
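The temporal-resolution limit noted above, millisecond spiking read through a seconds-long blood-flow response, can be visualized with a toy convolution. The gamma-like haemodynamic kernel below is a generic shape of my own choosing, not a calibrated model.

import numpy as np

dt = 0.01                                   # 10-ms time step
t = np.arange(0.0, 30.0, dt)                # 30 s of simulated time

# Two brief (200-ms) bursts of neural activity, 5 s apart.
neural = np.zeros_like(t)
neural[(t >= 10.0) & (t < 10.2)] = 1.0
neural[(t >= 15.0) & (t < 15.2)] = 1.0

# Generic gamma-like haemodynamic kernel peaking a few seconds after activity.
h = (t / 5.0) ** 2 * np.exp(-t / 1.5)
h /= h.sum()

bold = np.convolve(neural, h)[: t.size]     # slow, smeared blood-flow signal
print(t[neural.argmax()], t[bold.argmax()]) # burst at 10 s; peak ~3 s later

The two crisp bursts emerge as one sluggish hump, which is why combining fMRI with direct electrical measures is attractive.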
Implications for society
Progress in understanding higher brain functions, fuelled by technological innovation, will certainly exert a major impact on society in the coming century. New discoveries will be important for the diagnosis and treatment of psychiatric and neurological disease, for improving human learning and communication and for informing the design of intelligent machines. But a scientific understanding of the human mind and human behaviour in terms of brain function could also have a profound impact on how we understand ourselves and our society. Two aspects of mentality deserve special consideration: conscious experience and decision-making.

Consciousness
The relationship between conscious experience and brain function is one of the great remaining mysteries of the natural world. How can three pounds of tissue that scientists study with microelectrodes, microscopes and magnetic resonance machines give rise to conscious experience? Indeed, what exactly do we mean by 'conscious experience'?

A classic thought experiment19,20 illustrates the problem. Imagine a time in the future when visual neuroscience has come to completion, when everything is known about how the nervous system responds to light and how it produces visually guided behaviour. Now imagine a colour-blind neuroscientist in this golden era who can give a complete account of the neural events underlying a normal observer's ability to discriminate and identify colours. Now suppose our scientist's colour-blindness were miraculously cured and he saw green for the first time. The question is, would he learn anything new about colour perception? Most would answer 'yes': what he learns is precisely the conscious experience of colour. Consciousness is the aspect of our mental life that we can only understand through subjective first-person experience.

Can we study consciousness scientifically? This is controversial among researchers, but we believe that progress will occur only if investigators can develop operational definitions of conscious experience, as has been accomplished for other aspects of cognition. In a classical form of learning, for example, Pavlov's dog generated a measurable dependent behaviour (salivation) in response to controlled manipulation of independent variables (pairing of bell with food). In general, we can hope to understand the neural basis of any cognitive phenomenon that can be defined operationally in a similar manner.

Accomplishing this for consciousness will be difficult, because conscious experience is intrinsically subjective. A person's verbal reports of her conscious experiences may be sufficiently reliable to serve as objective observations for testing hypotheses about consciousness. But the notion that verbal reports (or any other overt behaviour) are an adequate proxy for conscious experiences is itself not scientifically testable. We must therefore acknowledge that this assumption is extra-scientific (as is parsimony, another favourite assumption of scientists). Even if we do accept this assumption in general, verbal reports may fall short under some circumstances. For example, if one person reacts to pain more readily than another, is it because he has a more intense sensation of pain, or because his threshold for reacting to pain is lower? Could any verbal exchange answer such questions? Although animals can be trained to perform simple behavioural tasks, these sorts of questions only become more perplexing without the benefit of language. Therefore, any measure of consciousness in animals is likely to be controversial, despite the common intuition that the family dog enjoys conscious experiences of one sort or another.

Ultimately, no matter how precisely we manage to link verbal reports of conscious experience to brain activity, fundamental mysteries are likely to persist21. Exactly how something as ineffable as subjective consciousness can arise from macromolecules, synapses and action potentials may well remain a conundrum.

Decision-making and determinism
An exciting frontier in the study of higher brain function is the attempt to understand mechanistically how decisions are formed. 'Decision processes' are the key cognitive links between perception and action. Perceptual processes carry out the functions of scene reconstruction, contour extraction, and so on, and motor processes implement the planning and execution of a behavioural response. Intermediate levels of the system, however, must evaluate the sensory evidence represented in early cortical areas, and 'decide', for example, which apple is to be the target of a reaching movement.

Neurophysiologists have now begun to investigate the neural underpinnings of decision processes in monkeys trained to perform simple discrimination tasks. By recording at successive stages of known anatomical pathways linking sensory to motor areas of the cortex, investigators have uncovered intriguing evidence for neural systems that represent what the monkey 'decides' about the stimulus as opposed to what the stimulus actually is22–24. The neural activity recorded in such experiments can actually predict the outcome of the monkey's decision several seconds before it is revealed by a behavioural response. In addition, neurophysiologists are now taking notice of a fact that psychologists and economists have long recognized: decisions are informed as much by reward expectation and personal history as by the sensory evidence available at any given time. If, for example, an animal knows from past experience that choice A tends to be rewarded more frequently than choice B, it may choose A even if countervailing sensory evidence suggests that choice B will be rewarded on the current trial. Important new discoveries have been made concerning neural signals related to reward expectation and relative reward potency25–27; indeed such signals have been found in the inferior parietal cortex of monkeys28, a cortical region implicated in perceptual decision-making.
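The interplay of reward history and sensory evidence can be caricatured in a few lines. The rule below, which adds a bias proportional to past reward frequency to noisy evidence and picks the larger sum, is my invention for illustration, not the models of refs 25–28.

import random

random.seed(1)

# Learned reward expectations for the two options (invented values):
# A has paid off more often than B in this animal's past.
reward_history = {"A": 0.8, "B": 0.4}
bias = {k: 0.5 * v for k, v in reward_history.items()}   # weight on experience

def decide(evidence_a: float, evidence_b: float) -> str:
    """Choose the option with the larger evidence + reward bias + noise."""
    score_a = evidence_a + bias["A"] + random.gauss(0.0, 0.1)
    score_b = evidence_b + bias["B"] + random.gauss(0.0, 0.1)
    return "A" if score_a > score_b else "B"

# Current sensory evidence slightly favours B, yet A is usually chosen:
choices = [decide(0.5, 0.6) for _ in range(1000)]
print(choices.count("A") / len(choices))   # roughly 0.75, despite weaker evidence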
Limits to determinism?
The next decade will certainly witness substantial progress in understanding how decisions are formed within the brain, but these studies in particular raise disquieting questions. The crux of the problem lies in the implications of physical determinism for our concepts of personal freedom and moral responsibility. If the most sophisticated aspects of our mental lives, from decisions as trivial as selecting an apple to those as important as choosing a spouse, are determined by the molecular and cellular events that generate electrical activity in the nervous system, what are we to make of our subjective experience of freedom? If our sense of freedom in making moral decisions is illusory, can anyone reasonably be held responsible for his or her actions? Can a criminal reasonably be punished for actions over which he has no meaningful control?

We have no solution to this problem, but several perspectives seem important to bear in mind. The most significant, perhaps, is that both points of view — physical determinism and personal freedom — are utterly essential in humanity's attempt to understand itself. The assumption of cause-and-effect determinism is fundamental not only to science but also to everyday life. For example, every day most of us must negotiate streets full of busy traffic in order to survive until the next day. The basic perceptual, decision-making and motor mechanisms within our brains must be sufficiently deterministic that we take the appropriate action every time. Furthermore, the assumption of determinism underlies our hope of finding cures for devastating psychiatric and neurological diseases. If these conditions do not have physical causes within the brain, there is no reason to hope that they can be cured by physical (that is, medical) interventions.

On the other hand, the assumption of a meaningful degree of personal freedom is essential not only to our personal and social lives, but to science as well. Scientists require the freedom to evaluate data and to reject false hypotheses. But if our mental processes unfold with the physical determinism of a machine, what guarantee do we have that the beliefs the machine generates, by the scientific method or otherwise, are true? And if the machine generates false beliefs (the belief in determinism, perhaps?) how would we discard them? We are left, then, with the paradox that both perspectives seem necessary, for the community of science as well as for everyday life. Perhaps these perspectives will be reconciled at some point in the future. For the present, our only obvious option is to live with both, and accept the paradox as a leavening dose of humility in our intellectual lives.

M. James Nichols and William T. Newsome are at the Howard Hughes Medical Institute and Department of Neurobiology, Stanford University School of Medicine, Stanford, California 94305, USA.
e-mail: bill@monkeybiz.stanford.edu

1. Ogawa, S. et al. Proc. Natl Acad. Sci. USA 89, 5951–5955 (1992).
2. Posner, M. I. & Raichle, M. E. Proc. Natl Acad. Sci. USA 95, 763–764 (1998).
3. Sell, L. A. et al. Eur. J. Neurosci. 11, 1042–1048 (1999).
4. Wandell, B. A. Annu. Rev. Neurosci. 22, 145–173 (1999).
5. Hubel, D. H. Eye, Brain, and Vision (Scientific American Library, New York, 1995).
6. Parker, A. J. & Newsome, W. T. Annu. Rev. Neurosci. 21, 227–277 (1998).
7. Tanaka, K. Curr. Opin. Neurobiol. 2, 502–505 (1992).
8. Fuster, J. M. Memory in the Cerebral Cortex (MIT Press, Cambridge, 1995).
9. Goldman-Rakic, P. S. Proc. Natl Acad. Sci. USA 93, 13473–13480 (1996).
10. Desimone, R. & Duncan, J. Annu. Rev. Neurosci. 18, 193–222 (1995).
11. Maunsell, J. H. Science 270, 764–769 (1995).
12. Colby, C. L. & Goldberg, M. E. Annu. Rev. Neurosci. 22, 319–349 (1999).
13. Salzman, C. D., Murasugi, C. M., Britten, K. H. & Newsome, W. T. J. Neurosci. 12, 2331–2355 (1992).
14. Martinez, A. et al. Nature Neurosci. 2, 364–369 (1999).
15. George, J. S. et al. J. Clin. Neurophysiol. 12, 406–431 (1995).
16. Hübener, M., Shoham, D., Grinvald, A. & Bonhoeffer, T. J. Neurosci. 17, 9270–9284 (1997).
17. Svoboda, K., Helmchen, F., Denk, W. & Tank, D. W. Nature Neurosci. 2, 65–73 (1999).
18. Wang, S. S. & Augustine, G. J. Neuron 15, 755–760 (1995).
19. Jackson, F. Phil. Quarterly 32, 127–136 (1982).
20. Thompson, E. Phil. Studies 68, 321–349 (1992).
21. Nagel, T. Phil. Rev. 83, 435–450 (1974).
22. Shadlen, M. N. & Newsome, W. T. Proc. Natl Acad. Sci. USA 93, 628–633 (1996).
23. Romo, R. & Salinas, E. Curr. Opin. Neurobiol. 9, 487–493 (1999).
24. Schall, J. D. & Thompson, K. G. Annu. Rev. Neurosci. 22, 241–260 (1999).
25. Gallistel, C. R. Cognition 50, 151–170 (1994).
26. Schultz, W., Dayan, P. & Montague, P. R. Science 275, 1593–1599 (1997).
27. Schultz, W. J. Neurophysiol. 80, 1–27 (1998).
28. Platt, M. L. & Glimcher, P. W. Nature 400, 233–238 (1999).

Acknowledgements. We thank M. Schnitzer for useful suggestions during the preparation of the manuscript.
'Earth system' analysis and the second Copernican revolution
H. J. Schellnhuber

Optical magnification instruments once brought about the Copernican revolution that put the Earth in its correct astrophysical context. Sophisticated information-compression techniques including simulation modelling are now ushering in a second 'Copernican' revolution. The latter strives to understand the 'Earth system' as a whole and to develop, on this cognitive basis, concepts for global environmental management.

There are many ways of looking forward in time. One of the most amusing (and sometimes terrifying) is the 'forward-view mirror' — contemplation of the future by reflecting on the past. If we consider the unravelling of the mysteries of the human body by physicians over the past three millennia, we see much that is relevant to unravelling the mysteries of the Earth's physique, or "Gaia's body"1.

With our present-day understanding of medical science, it seems incredible that the Hippocratic school, which based analysis and prognostics of the human body on the 'composition of the humours' of individual patients, dominated Western medicine well beyond the Renaissance. Great leaps in knowledge, such as Vesalius's anatomical revelations published in 1543, or Harvey's physiological studies in 1628, were ignored or suppressed — notably by the deans of Paris's 'infallible' medical faculty. The first scientific treatise relating contagious diseases to activities of microorganisms, rather than harmful vapours, or 'miasmas', did not appear until 1840 (ref. 2).

When the Enlightenment came, its ultimate triumph was based, literally, on light — the ability to process radiation received from objects of specific interest. The invention of the operational microscope in 1608 by the Dutch spectacle-maker Zacharias Jansen, realizing a proposal made in 1267 by Roger Bacon, was a turning-point in scientific history. For the first time the human eye could transcend its natural limits and begin to explore the wonders of the microcosmos.

Many more wonders awaited revelation through the processing of light — above all, the spangled nocturnal heavens with their billions of gigantic entities made so tiny by distance. Once again, faint rays of light emitted by objects had to be invigorated through ingenious devices, in this case telescopes. And so optical amplification techniques brought about the great Copernican revolution, which finally put the Earth in its correct astrophysical context.

Figure 1 A tale of two revolutions. a, The shock of the Enlightenment as expressed in a (probably apocryphal26) fifteenth-century woodcut. b, 'Earth-system' diagnostics in the twenty-first century.

Today, some 500 years after Copernicus, Cusanus and company, our civilization sets about visiting neighbouring planets, scrutinizing stellar objects at the brink of creation, and even tracking down extraterrestrial intelligence.
This spectacular augmentation of the Renaissance impulse is being accompanied by a crescendo of other scientific activities which will soon culminate in a second 'Copernican' revolution. This new revolution will be in a way a reversal of the first: it will enable us to look back on our planet to perceive one single, complex, dissipative, dynamic entity, far from thermodynamic equilibrium — the 'Earth system'. It may well be nature's sole successful attempt at building a robust geosphere–biosphere complex (the ecosphere) in our Galaxy, topped by a life-form that is appropriately tailored for explaining the existence of that complex, and of itself.

Such an explanation needs eventually to encompass all the pertinent processes linking the system's constituents at all scales — from convection deep in the Earth's mantle to fluctuations at the outer limit of the atmosphere. New instruments are necessary, especially macroscopes3, which reduce, rather than magnify as microscopes do, giving Earth-system scientists an objective distance from their specimens — no longer too close for cognitive comfort.

There are three distinct ways to achieve 'holistic' perceptions of the planetary inventory, including human civilization:

1. The 'bird's-eye' principle. An obvious trick for obtaining a panoramic view of the Earth is to leave it, and observe it from a distance. The race to the Moon in the 1960s opened up the opportunities for this particular macroscope technique, and created the now-familiar image of our blue planet floating in the middle of a dark, cold nowhere. The trick was not exactly cheap, however; NASA's lunar ventures absorbed some US$95 billion overall. Now, space stations, shuttles and an armada of smart satellites are about to establish the details of a complete Earth reconnaissance.

2. The digital-mimicry principle. A more sophisticated, and less expensive, macroscope technique is simulation modelling. Here, components and processes of the original Earth system are replaced by mathematical representatives as accurate as our evolving knowledge allows. These formal chimaeras are then animated electronically, to imitate the dynamic complex of real relationships, in virtual space-time. The menagerie of Earth-system models includes tutorial, conceptual and 'analogical' specimens4. One significant advantage of this macroscope is that it allows a multitude of potential planetary futures to be played out, with no more risk than a computer crash. The validation of Earth-simulation machines remains problematic, although relentless training with palaeorecords can teach them satisfactory hind-casting skills (see, for example, refs 5–7).

3. The 'Lilliput' principle. As a third option, there is the 'incredible shrinking Earth' idea, as enacted in the Sonora Desert in Biosphere II (ref. 8). The idea involves rebuilding the ecosphere in flesh, blood and rock, on a scale reduced by many orders of magnitude. Such a nano-planet can be conveniently scrutinized for operational stability or emerging self-organization processes. Despite its disastrous performance, the Biosphere-II project provoked fresh scientific attitudes towards life. And in fact, the free-air experiments9, which are currently being used to investigate the effect of atmospheric CO2 enrichment on agroecosystems and forests, subscribe to a similar empiricist philosophy.

Most probably, the future of macroscopes will be dominated by intelligent combinations of these principles, particularly the first two. Planetary monitoring — by remote sensing and a worldwide net of in situ measurement devices — will be complemented and synchronized by data models to generate a continuously updated digital 'Weltbild'.

The quasi-antithetical spirits of the first and second Copernican revolutions may be visualized by contrasting a famous ancient allegory with a modern cartoon (Fig. 1).

The explorer featured in Fig. 1b is dressed as a doctor for two reasons. First, the continuing investigation into the Earth's physique is in many respects reminiscent of the exploration of the human body during the Renaissance. Science historians looking back from, say, AD 2300, will tell yet again a tale of incredible delusions and triumphs. And second, a significant impetus behind the second Copernican revolution is the insight that the ecosphere's operation may be being transformed qualitatively by human interference. So the macroscope is a diagnostic instrument, generating evidence necessary for treatment10.

This means that we are confronted ultimately with a control problem, a geo-cybernetic task that can be summed up in three fundamental questions11. First, what kind of world do we have? Second, what kind of world do we want? Third, what must we do to get there?

These questions indicate the immensity of the challenge posed by Earth-system analysis12. I would like now to narrow our gaze to a few crucial aspects of this transdisciplinary adventure, using the light of the latest progress in pertinent science, particularly that orchestrated by the International Geosphere–Biosphere Programme (IGBP; see Box 1)13.

Box 1 The International Geosphere–Biosphere Programme (IGBP)
Each day seems to bring new discoveries — of mega-fluctuations, long-distance teleconnections, feedback loops or phase-transition lines in the entrails of the planetary ecosystem. This development is driven predominantly by IGBP research and is shedding new light on past, present and future global changes. For example, one IGBP core project involves reconstructing, in great detail, global and regional climatic history as far back as 500,000 years ago27. Another major project is more concerned with the current operational mode of the Earth system. It tries to solve the 'puzzle' of oceanic CO2 flux in a geographically explicit way. Intermediate results indicate that CO2 is upwelling and leaving the ocean in the subarctic western Pacific Ocean in winter, in the Persian Gulf in summer, and west of South America all year round. In contrast, oceanic regions where warm currents like the Gulf Stream and the Kuroshio are being cooled take up large amounts of CO2 (ref. 28). Complementary activities scrutinize the CO2 metabolism of the terrestrial biosphere. Recent findings provide hints that the continental ecosystems may turn into a carbon source as a result of climate change at some time in the next century. A subtle combination of factors, including changes in fire, storms and pest infestations, is responsible for this discomforting prospect29. Another disturbing result was obtained in the coastal zone: model-supported forecasting of the geochemical effects on coral reefs of anthropogenic CO2 build-up in the atmosphere demonstrates the high vulnerability of these ecosystems. Crucial mechanisms for reef accumulation in the tropics could be weakened by 30 per cent by 2050 (ref. 30).

Understanding the Earth system
At the highest level of abstraction, the make-up of the Earth system E can be represented by the following 'equation':

E = (N, H),  (1)

where N = (a, b, c, ...); H = (A, S). This formula expresses the elementary insight that the overall system contains two main components, namely the ecosphere N and the human factor H. N consists of an alphabet of intricately linked planetary sub-spheres a (atmosphere), b (biosphere), c (cryosphere; that is, all the frozen water of Earth), and so on. The human factor is even more subtle: H embraces the 'physical' sub-component A ('anthroposphere' as the aggregate of all individual human lives, actions and products) and the 'metaphysical' sub-component S reflecting the emergence of a 'global subject'. This subject manifests itself, for instance, by adopting international protocols for climate protection.
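Equation (1) is set-theoretic shorthand rather than something to be solved, but its nesting is easy to make explicit. The sketch below is my rendering, with placeholder fields, of the decomposition E = (N, H), N = (a, b, c, ...), H = (A, S).

from dataclasses import dataclass, field

@dataclass
class Ecosphere:                 # N = (a, b, c, ...)
    a: str = "atmosphere"
    b: str = "biosphere"
    c: str = "cryosphere"        # all the frozen water of Earth
    # ... further intricately linked sub-spheres

@dataclass
class HumanFactor:               # H = (A, S)
    A: str = "anthroposphere"    # aggregate of human lives, actions, products
    S: str = "global subject"    # emergent collective control force

@dataclass
class EarthSystem:               # E = (N, H), equation (1)
    N: Ecosphere = field(default_factory=Ecosphere)
    H: HumanFactor = field(default_factory=HumanFactor)

E = EarthSystem()
print(E.N.a, "|", E.H.S)         # atmosphere | global subject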
Figure 2 A simplistic conceptual model of the planetary machinery (reconstructed on the basis of ref. 14). [The original wiring diagram links the physical climate system (atmospheric physics/dynamics, ocean dynamics, terrestrial surface moisture/energy balance) and the biogeochemical cycles (marine biogeochemistry, terrestrial ecosystems, tropospheric chemistry) with external forcing (insolation, volcanism, continents and topography) and human activities (land use, emission fluxes), and draws on palaeo-archives such as deep-sea sediment cores, ice cores, bog/lake cores, pollen and foraminifera.]
For the time being, I will not try to provide additional justification for my decomposition of E, but rather focus on its material physique, that is, the pair (N, A).

A useful, although highly simplistic, way of representing this global entity is the famous Bretherton diagram14 as depicted in Fig. 2. This 'wiring diagram' arose out of a brainstorming exercise conducted by eminent geosphere–biosphere scholars back in 1985.

Although the Bretherton caricature of N (A is compressed here into a few 'black boxes') is obviously impromptu, static and partly inconsistent, it has since been the subject of attempts at further simplification, not sophistication. This is surprising considering the firework display of scientific findings about the ecosphere illuminating the last decade of this century. Any recent issue of the popular UK science magazine New Scientist gives an indication of the excitement. Take the issue of 3 July 1999, for example: on page 17 there is speculation over the relationship between global warming and the rotation velocity of the Earth; on page 22 the synergies between the North Atlantic Oscillation, Sahelian droughts and coral-reef die-back are tentatively identified; on page 25 it is argued that comet particles are responsible for a "mysterious layer of water vapour that hangs 70 kilometres above the tropics"; and on page 49 there is a discussion of the impact of starfish population dynamics on the rise in atmospheric CO2. Veils of ignorance are being lifted wherever we look, and much of this information is the result of research supported by the IGBP.

Ecosphere science is therefore coming of age, lending respectability to its romantic companion, Gaia theory, as pioneered by Lovelock and Margulis15. This hotly debated 'geophysiological' approach to Earth-system analysis argues that the biosphere contributes in an almost cognizant way to self-regulating feedback mechanisms that have kept the Earth's surface environment stable and habitable for life.

Taken to an extreme, the Gaia approach may even include the influence of biospheric activities on the Earth's plate-tectonic processes — through modulation of thermal and viscous gradient fields across the upper geological layers. The inverse is already backed up by science: plate tectonics is a powerful regulator of biosphere-subsistence conditions. It recycles the fundamental nutrient, carbon, between atmosphere and ocean floor over some 500,000 years. It can also be demonstrated, taking further the ideas of Caldeira, Kasting and others, that this 'carbonate–silicate loop' will support the well-being of photosynthetic life on Earth for another 500 to 700 million years — but no longer16.

But is it really Gaia who commands the engine room of the Earth system? Palaeorecords do not provide a clear answer to this question. What we do know is that rather small and regular external perturbations, like faltering insolation, repeatedly provoked the ecosphere into alternative modes of operation; glaciations in the Quaternary period seem to be a conspicuous example of such a self-amplifying response, although the precise mechanisms at work are not yet fully understood (see, for example, ref. 17).

The giant strides of biospheric evolution, on the other hand, were probably enforced by true catastrophes such as asteroid impacts, mass eruptions of volcanoes and cosmic-dust clouds passing across the Solar System. Disasters like these will continue to shake our planet quite frequently; for example, impact events releasing energy equivalent to 10 gigatons of TNT are estimated to recur on average every 70,000 years18.
Although effects such as the glaciations may still be interpreted as over-reactions to small disturbances — a kind of cathartic geophysiological fever — the main events, resulting in accelerated maturation by shock treatment, indicate that Gaia faces a powerful antagonist. Rampino has proposed personifying this opposition as Shiva, the Hindu god of destruction19.

About four billion years into Earth's history, a third planetary might emerged, a challenger to these two intransigent forces: human civilization. Let us stay with mythological imagery and call this power Prometheus. For the purposes of Earth-system analysis, Prometheus is to be identified with the mega-factor H in equation (1).

There is no need to reel off a tally of environmental folklore about the influence of Prometheus on the ecosphere: the ozone lesson is enough to show that humanity is indeed capable of modifying N at a strategic level. There is overwhelming evidence that the stratospheric ozone shield against harmful UV-B radiation has been perforated accidentally by industrial by-products, in particular by chlorofluorocarbons (CFCs). The physicochemical processes causing this disturbing effect are intricate; they were deciphered only a few years ago and the elucidators were awarded a Nobel prize. But "things could be much worse", as one of the laureates, Paul Crutzen, puts it: "Bromine is almost a hundred times more dangerous for ozone than chlorine. If the chemical industry had developed organobromine compounds instead of CFCs then...we would have been faced with a catastrophic ozone hole everywhere and all year-round during the 1970s — probably before atmospheric scientists had developed the knowledge necessary to identify the problem."20

We have been extremely lucky in being granted the chance of correcting our ways through the Montreal ozone-protection protocol. Ironically, the eminent meteorologist Chapman suggested back in 1934 creating artificial ozone holes by injecting appropriate ozone depletors into the stratosphere as a way of improving astronomy's ultraviolet remote-sensing accuracy21.

Global environmental change is all around us now, and the material components of the Earth system, N and A, are behaving like a strongly coupled complex. To assess the crucial consequences of this phase transition in planetary history, we must, and can, do better than drawing wiring diagrams of the Bretherton type. Quite recently, IGBP launched a promising macroscope-making initiative aimed at advancing Earth-system models of intermediate complexity (EMICs; see Box 2). These models seek to integrate the main processes and forces — Gaia, Shiva and Prometheus — through effective quantitative equations. Ideally, they would simulate all of the pertinent features of the N–A complex on the system's level, and operate fast enough to serve as 'time machines' — producing virtual vistas of the far environmental past and future6,7.

It has become clear, however, that our quantitative mimicry skills for anthroposphere processes (like industrial growth or transmigration dynamics) lag far behind our ecosphere-simulation capacity. The most capable sub-models currently available for representing relevant socioeconomic components of the Earth system are those used in 'integrated assessments of climate change'22. But these instruments are still far too crude — a problem scientific progress must soon remedy. Massive intellectual and financial investments should generate adequate (long-term, non-equilibrium, multi-actor) modules within the next two decades.

Figure 3 A 'theatre world' for representing paradigms of sustainable development. The space of all conceivable co-evolution states P = (N, A) is spanned by a 'natural' axis, N (representing, say, global mean temperature) and a civilizatory axis, A (representing, say, global gross product). Vertical lines at Ncrit(1) and Ncrit(2) delimit the niche of subsistence states for humanity between an ultra-cold 'Martian regime' and an ultra-hot 'Venusian regime'. The domain U(P0) ('accessible universe') embraces all possible co-evolution states that can be reached from the present state P0 by some management sequence from the overall pool. U(P0) contains specific 'catastrophe domains' K1 and K2.

Controlling the Earth system
Now assume that the research community does its job and develops a perfect hierarchy of transdisciplinary EMICs. Who will use the insights, the hindsights and foresights generated by these time machines? And in what way? These questions prompt us to revisit the human factor H in the Earth-system equation (1). Formally, H = (A, S), where A reflects the physiological–metabolic contribution of global civilization to planetary operation. This contribution is qualitatively not dissimilar to the role played by the sheep of the world, which reflect sunlight (albedo effect), overgraze pastures (soil degradation) and emit the powerful greenhouse gas methane.

But H embraces a second sub-factor, S, which makes all the difference. This entity, introduced as the 'global subject' above, represents the collective action of humanity as a self-conscious control force that has conquered our planet. The global subject is real, although immaterial. One key to its emergence from the physical basis is worldwide communication. As you read this essay, you are engaging in an indirect dialogue with the author. That is insignificant, however, compared with the direct global 'polylogue' taking place via the Internet. Global telecommunication will ultimately establish a cooperative system generating values, preferences and decisions as crucial commonalities of humanity online.

The building and application of macroscopes will be of tremendous help to the global subject in finding its identity. An ever-evolving Earth-observation system23 will allow S to watch its own footprints on the ecosphere, and Earth-simulation models will enable S to make collective 'rational choices' on the system's level. Finally, densely linked global institutions, as well as innumerable worldwide activists' networks, will help enforce resolutions of S, such as those made in international environmental conventions. This is the emergence of a modern 'Leviathan', embodying teledemocracy3 and putting the seventeenth-century imagination of the English philosopher Thomas Hobbes into the shade.

The global subject will reign over the centuries to come. One of its most responsible tasks will be to seek out a tolerable environmental future from the infinity of optional co-evolutions of N and A. In other words, S must guarantee sustainable development.
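The geometry of Fig. 3 translates naturally into a reachability computation. In the toy sketch below, every number, the step set and the rectangular catastrophe domains are invented for illustration; it shows only how the 'accessible universe' U(P0) could be enumerated by searching outward from P0 while respecting the critical limits on N.

from collections import deque

# Toy discretization of the co-evolution space of Fig. 3:
# n indexes the 'natural' axis N, a the civilizatory axis A.
N_CRIT_LO, N_CRIT_HI = 2, 17                   # 'Martian' / 'Venusian' limits on N
CATASTROPHES = [(range(4, 7), range(0, 3)),    # stand-in for K1
                (range(12, 15), range(8, 11))] # stand-in for K2
OPTIONS = [(-1, 0), (1, 0), (0, 1), (-1, 1)]   # admissible management steps

def viable(n: int, a: int) -> bool:
    """Inside the ecological niche and outside every catastrophe domain."""
    if not (N_CRIT_LO < n < N_CRIT_HI and 0 <= a < 12):
        return False
    return not any(n in ns and a in as_ for ns, as_ in CATASTROPHES)

def accessible_universe(p0: tuple) -> set:
    """U(P0): all states reachable from P0 by sequences of management options."""
    seen, queue = {p0}, deque([p0])
    while queue:
        n, a = queue.popleft()
        for dn, da in OPTIONS:
            q = (n + dn, a + da)
            if q not in seen and viable(*q):
                seen.add(q)
                queue.append(q)
    return seen

print(len(accessible_universe((9, 5))))   # size of U(P0) in this toy world

Note how the chosen option pool shapes U(P0): with these steps A can never shrink, so whole regions of the state space are simply unreachable, which is the formal content of 'preserving options' in the paradigms discussed below.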
In spite of, or because of, its fuzziness, the notion of sustainable development has become an end-of-millennium idée fixe. Entire libraries could be crammed with treatises devoted to the topic.

Box 2 Intermediate-complexity modelling
When devising Earth-system models, scientists have to resist two fatal attractions. The first one is over-simplification, which tends to ignore even crucial elements of planetary dynamics like the episodic warming of the tropical East Pacific ('El Niño'; see ref. 31). The second one is over-sophistication, often resulting in bulky and expensive models that defy a transparent analysis of complexity by averaging over the details irrelevant to the issue studied. This can be achieved by constructing reduced-form representations of the full set of dynamic equations according to well known principles from scientific computing. As far as the atmosphere module is concerned there exists, for instance, the option to separate 'slow' from 'fast' variables32 or to apply filtering techniques to the primitive equations of motion33.
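The slow/fast separation mentioned in Box 2 can be illustrated generically. The two-variable system below is my own toy, not one of the cited schemes32,33: because the fast variable relaxes almost instantly to a quasi-equilibrium, substituting that equilibrium yields a reduced model that tracks the slow dynamics at a fraction of the cost.

# Toy two-timescale system (purely illustrative):
#   dx/dt = -x + 0.5*y        (slow variable)
#   dy/dt = (-y + x) / eps    (fast variable, eps << 1)
# Since y hugs its quasi-equilibrium y* = x, the reduced-form model
# substitutes y* and keeps only dx/dt = -0.5*x.

dt, eps, T = 0.001, 0.01, 5.0
x_full, y, x_reduced = 1.0, 0.0, 1.0
for _ in range(int(T / dt)):
    x_full += dt * (-x_full + 0.5 * y)
    y += dt * (-y + x_full) / eps          # fast relaxation toward y* = x
    x_reduced += dt * (-0.5 * x_reduced)   # reduced model, one variable cheaper
print(round(x_full, 3), round(x_reduced, 3))   # nearly identical slow trajectories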
Dissatisfied with the lack of systematics in the overall sustainability debate, I embarked, a couple of years ago, on the quest of developing a rigorous common formalism, extracting the essence of all possible concepts. On the basis of such a formalism, the elaboration of 'mathematical sustainable-development ethics' becomes feasible4,12.

Some central results can be illustrated in a two-dimensional 'theatre world' where sustainable development is played out as a strategic planning exercise (Fig. 3). Two important aspects should be emphasized. First, the overall co-evolution space includes several unpleasant domains: apocalyptic zones as exemplified by the Martian regime (hypothetically attainable through a runaway cooling-chamber process), or the Venusian regime (hypothetically attainable through a runaway greenhouse process)16, and catastrophe zones, where humanity might subsist, but in a miserable manner. Second, there exists a non-trivial subspace, the 'accessible universe', which consists of the ensemble of all realizable future co-evolution paths. These are generated by distinct management options contained in the strategic pool at the disposal of the global subject. Note that 'business as usual', that is, planless ecosphere transformation through individual opportunism, is definitely one of these options.

Five competing paradigms for sustainable development can be identified. (1) Standardization — prescribing an explicit long-term co-evolution corridor emanating from P0 in Fig. 3. (2) Optimization — getting the best, that is, maximizing an aggregated anthroposphere–ecosphere welfare function by choosing the proper co-evolution segment over a fixed time period. (3) Pessimization — avoiding the worst, that is, steering well clear of catastrophe domains, allowing for the possibility of bad management in the future. (4) Equitization — preserving the options for future generations, not contracting the 'accessible universe' over time. (5) Stabilization — bringing the N–A complex into a desirable state in co-evolution space and maintaining it there by good management.

So, here is the menu from which humanity can select its master principle, or suitable combinations thereof, for Earth-system control. The formal elaboration of these paradigms and putting them into operation using, for example, EMICs, is a highly non-trivial exercise. And there still remains the question "What must we do?". It would be foolish to try to answer this comprehensively, but a few eclectic suggestions may give an idea of the scope of the challenge. I give these in descending order of speculativeness:

Optimization. The present geostrategic design of the N–A complex is not optimal. The industrialized countries dominate the fields of agricultural production, technological innovation, environmental protection, tourist services and many other areas. This implies not only considerable inequity across the Earth's population, but also unstable distortions of the natural web of energetic and material fluxes. A global redesign could aim at establishing a more 'organic' distribution of labour, where the temperate countries are the main producers of global food supplies, the sub-tropical zones produce renewable energies and high technology, and the tropical zones preserve biodiversity and offer recreation.

Stabilization. Why should Prometheus not hasten to Gaia's assistance? Geoengineering proposals have become popular as a way of mitigating the anthropogenic aberrations of the ecosphere. One interesting idea features iron fertilization of certain ocean regions to stimulate the marine biological pump which draws down CO2 (ref. 24). And Russian scientists are currently elaborating a repair scheme for the ozone layer using orbital lasers. But we can also think of proactive control of natural planetary variability: insights acquired during the present climate crisis may enable humanity to suppress future glaciation events by judicious injection of 'designer greenhouse gases' into the atmosphere.

Pessimization. Least speculative and most essential is the creation of a manual of minimum safety standards for operating the Earth system. Human interference with the ecosphere may provoke the perhaps irreversible transgression of critical thresholds, bringing about qualitatively different environmental conditions on a large scale. The Intergovernmental Panel on Climate Change is now addressing risks like the shut-down of the ocean conveyor belt, the destabilization of the West Antarctic ice sheet, and the abrupt die-back of tropical forests due to super-critical global warming. The results of IGBP and related global research programmes will soon enable us to identify and respect 'guardrails'25 for responsible planetary management.

In fact, the second Copernican revolution will be completed only if we take this responsibility — in spite of irreducible cognitive deficits as once lamented by Alonso X of Castile: "If the Lord Almighty had consulted me before embarking on the Creation, I would have recommended something simpler"14.

H. J. Schellnhuber is at the Potsdam Institute for Climate Impact Research, Telegrafenberg C4, PO Box 60 12 03, D-14412 Potsdam, Germany.
e-mail: john@pik-potsdam.de

1. Volk, T. Gaia's body (Copernicus, New York, 1998).
2. Winkle, S. Geißeln der Menschheit (Artemis & Winkler, Düsseldorf, 1997).
3. de Rosnay, J. Le Macroscope (Seuil, Paris, 1975); L'homme Symbiotique (Seuil, Paris, 1995).
4. Schellnhuber, H. J. & Kropp, J. Naturwissenschaften 85, 411–425 (1998).
5. Prentice, I. C. & Webb, T. J. Biogeogr. 25, 997–1005 (1998).
6. Ganopolski, A., Rahmstorf, S., Petoukhov, V. & Claussen, M. Nature 391, 351–353 (1998).
7. Claussen, M. et al. Geophys. Res. Lett. 26, 2037–2040 (1999).
8. Beardsley, T. Sci. Am. 273(2), 18–20 (1995).
9. Long, S. (ed.) Plant Cell Environ. 22 (6; spec. issue 567–755) (1999).
10. Clark, W. C. Sci. Am. 261(3), 18–26 (1989).
11. Blackburn, C. (ed.) Global Change and the Human Prospect (Sigma Xi, Research Triangle Park, 1992).
12. Schellnhuber, H. J. & Wenzel, V. (eds) Earth System Analysis: Integrating Science for Sustainability (Springer, Berlin, 1998).
13. Newton, P. Nature 400, 399 (1999).
14. Fisher, A. Mosaic 19, 52–59 (1988).
15. Lenton, T. Nature 394, 439–447 (1998).
16. Franck, S. et al. Tellus B (in the press).
17. Broecker, W. S. & Peng, T. H. Greenhouse Puzzles (Eldigio, Palisades, 1998).
18. Lewis, J. S. Rain of Iron and Ice (Addison-Wesley, Reading, 1995).
19. Rampino, M. R. in Scientists on Gaia (eds Schneider, S. H. & Boston, P. J.) 382–390 (MIT Press, Cambridge, Massachusetts, 1993).
20. Crutzen, P. J. Angew. Chem. Int. Ed. Engl. 35, 1758–1777 (1996).
21. Chapman, S. Q. J. R. Meteorol. Soc. 60, 127–142 (1934).
22. Schneider, S. H. Environ. Mod. Assess. 2, 229–249 (1997).
23. http://www.gos.udel.edu/publications/select_pub.htm
24. Coale, K. H. et al. Nature 383, 495–501 (1996).
25. Petschel-Held, G., Schellnhuber, H. J., Bruckner, T., Toth, F. L. & Hasselmann, K. Clim. Change 41, 303–331 (1999).
26. Robin, H. The Scientific Image from Cave to Computer (Abrams, New York, 1992).
27. Petit, J. R. et al. Nature 399, 429–436 (1999).
28. Takahashi, T. et al. Proc. Natl Acad. Sci. USA 94, 8292–8299 (1997).
29. Walker, B., Steffen, W., Canadell, J. & Ingram, J. The Terrestrial Biosphere and Global Change (Cambridge Univ. Press, 1999).
30. Kleypas, J. A. et al. Science 284, 118–120 (1999).
31. Timmermann, A. et al. Nature 398, 694–697 (1999).
32. Saltzman, B. in Physically-Based Modelling and Simulation of Climate and Climatic Change — Part II (ed. Schlesinger, M. E.) 737–754 (Kluwer Academic Publishers, 1988).
33. Petoukhov, V. et al. Clim. Dynamics (in the press).

Acknowledgements. I thank A. Block, M. Cassel, R. B. Cathcart, W. Cramer, H. Hoff, S. Lütkemeier, S. Rahmstorf, D. Smart, M. Stock, J. v. Braun, V. Wenzel and the entire IGBP community for intellectual and technical support.

impacts

From molecular
to modular cell biology
Leland H. Hartwell, John J. Hopfield, Stanislas Leibler and Andrew W. Murray

Cellular functions, such as signal transmission, are carried out by 'modules' made up of many species of interacting molecules. Understanding how modules work has depended on combining phenomenological analysis with molecular studies. General principles that govern the structure and behaviour of modules may be discovered with help from synthetic sciences such as engineering and computer science, from stronger interactions between experiment and theory in cell biology, and from an appreciation of evolutionary constraints.

Although living systems obey the laws of physics and chemistry, the notion of function or purpose differentiates biology from other natural sciences. Organisms exist to reproduce, whereas, outside religious belief, rocks and stars have no purpose. Selection for function has produced the living cell, with a unique set of properties that distinguish it from inanimate systems of interacting molecules. Cells exist far from thermal equilibrium by harvesting energy from their environment. They are composed of thousands of different types of molecule. They contain information for their survival and reproduction, in the form of their DNA. Their interactions with the environment depend in a byzantine fashion on this information, and the information and the machinery that interprets it are replicated by reproducing the cell. How do these properties emerge from the interactions between the molecules that make up cells and how are they shaped by evolutionary competition with other cells?

Much of twentieth-century biology has been an attempt to reduce biological phenomena to the behaviour of molecules. This approach is particularly clear in genetics, which began as an investigation into the inheritance of variation, such as differences in the colour of pea seeds and fly eyes. From these studies, geneticists inferred the existence of genes and many of their properties, such as their linear arrangement along the length of a chromosome. Further analysis led to the principles that each gene controls the synthesis of one protein, that DNA contains genetic information, and that the genetic code links the sequence of DNA to the structure of proteins.

Despite the enormous success of this approach, a discrete biological function can only rarely be attributed to an individual molecule, in the sense that the main purpose of haemoglobin is to transport gas molecules in the bloodstream. In contrast, most biological functions arise from interactions among many components. For example, in the signal transduction system in yeast that converts the detection of a pheromone into the act of mating, there is no single protein responsible for amplifying the input signal provided by the pheromone molecule.

To describe biological functions, we need a vocabulary that contains concepts such as amplification, adaptation, robustness, insulation, error correction and coincidence detection. For example, to decipher how the binding of a few molecules of an attractant to receptors on the surface of a bacterium can make the bacterium move towards the attractant (chemotaxis) will require understanding how cells robustly detect and amplify signals in a noisy environment. Having described such concepts, we need to explain how they arise from interactions among components in the cell.

We argue here for the recognition of functional 'modules' as a critical level of biological organization. Modules are composed of many types of molecule. They have discrete functions that arise from interactions among their components (proteins, DNA, RNA and small molecules), but these functions cannot easily be predicted by studying the properties of the isolated components. We believe that general 'design principles' — profoundly shaped by the constraints of evolution — govern the structure and function of modules. Finally, the notion of function and functional properties separates biology from other natural sciences and links it to synthetic disciplines such as computer science and engineering.

Box 1 Phenomenological analysis of action potentials in nerve cells

Action potentials are large, brief, highly nonlinear pulses of cell electrical potential which are central to communication between nerve cells. Hodgkin and Huxley's analysis of action potentials29 exemplifies understanding through in silico reconstruction. They studied the dynamical behaviour of the voltage-dependent conductivity of a nerve cell membrane for Na⁺ and K⁺ ions, and described this behaviour in a set of empirically based equations. At the time, there was no information available about the channel proteins in nerve cell membranes that are now known to cause these dynamical conductivities. From (conceptually) simple experiments on these individual conductivities, Hodgkin and Huxley produced simulations that quantitatively described the dynamics of action potentials, showed that the action potentials would propagate along an axon with constant velocity, and correctly described how the velocity should change with axon radius and other parameters. Just as explanations of hydrodynamic phenomena do not require knowledge of the quantum chemistry of water, those who are interested in the behaviour of neural circuits need not know how the particular channel proteins give rise to the Hodgkin–Huxley equations.

[Figure: Modelling action potentials. The upper trace shows three membrane action potentials, responding to different strengths of stimulus, calculated by Hodgkin and Huxley, while the lower trace shows a corresponding series of experimental recordings. (Adapted from ref. 29.)]
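To make Box 1's notion of 'in silico reconstitution' concrete, the sketch below integrates the Hodgkin–Huxley equations with the standard squid-axon parameters of the 1952 paper, written in the modern sign convention; the forward-Euler integrator, step sizes and the 10 µA cm⁻² stimulus are illustrative choices, not anything prescribed by this article.

```python
import numpy as np

# Standard Hodgkin-Huxley (1952) squid-axon parameters, modern convention.
C_m = 1.0                            # membrane capacitance, uF/cm^2
g_Na, g_K, g_L = 120.0, 36.0, 0.3    # maximal conductances, mS/cm^2
E_Na, E_K, E_L = 50.0, -77.0, -54.4  # reversal potentials, mV

# Voltage-dependent rate constants for the gating variables m, h, n.
# (The alpha functions have removable singularities at isolated voltages;
# that idealization is harmless for this sketch.)
def alpha_m(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
def beta_m(V):  return 4.0 * np.exp(-(V + 65.0) / 18.0)
def alpha_h(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
def beta_h(V):  return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
def alpha_n(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
def beta_n(V):  return 0.125 * np.exp(-(V + 65.0) / 80.0)

def simulate(I_ext=10.0, t_max=50.0, dt=0.01):
    """Integrate the HH equations with forward Euler; returns (t, V) traces."""
    V, m, h, n = -65.0, 0.05, 0.6, 0.32   # approximate resting state
    t_trace, v_trace = [], []
    for i in range(int(t_max / dt)):
        I_Na = g_Na * m**3 * h * (V - E_Na)
        I_K  = g_K * n**4 * (V - E_K)
        I_L  = g_L * (V - E_L)
        V += dt * (I_ext - I_Na - I_K - I_L) / C_m
        m += dt * (alpha_m(V) * (1 - m) - beta_m(V) * m)
        h += dt * (alpha_h(V) * (1 - h) - beta_h(V) * h)
        n += dt * (alpha_n(V) * (1 - n) - beta_n(V) * n)
        t_trace.append(i * dt)
        v_trace.append(V)
    return t_trace, v_trace

t, v = simulate()  # a sustained stimulus elicits repetitive action potentials
```

Run with a sustained stimulus, the membrane potential spikes repetitively, reproducing the qualitative behaviour in the Box 1 figure without any reference to particular channel proteins.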

Box 2 A decision-making module in bacteriophage lambda

The bacterial virus lambda can exist in two states inside a bacterial cell. In the lytic state, the virus replicates, producing about 100 progeny virus particles, and releases them by inducing lysis of the host cell. In the lysogenic state, the viral DNA is integrated into the bacterial chromosome and the production of a single viral protein, the repressor, inhibits expression of the other viral genes. The physiology of the host cell and other factors regulate the probability that an infecting lambda virus will become a lysogen, instead of replicating and inducing lysis30.

Elegant phenomenological experiments inferred the existence of bacteriophages and the existence of lytic and lysogenic states31 well before the viruses could be seen as physical particles. The isolation of mutants that biased the switch between lysis and lysogeny defined genes whose products formed part of the switch and sites on the DNA at which these products bound. Sophisticated analysis of the interactions between the mutants led to proposals about the circuitry of the switch, and specific proposals for which DNA sites bound which regulatory proteins. These proposals were verified by molecular analyses that showed that the repressor bound to DNA32 and produced a very detailed description of the biochemical interactions among repressor, other DNA-binding proteins and DNA. Key predictions of models of the switch were verified by reconstructing it in genetically engineered bacteria33, and by simulating its behaviour using computer models derived from tools used to simulate the behaviour of electrical circuits34.

[Figure: False-colour transmission electron micrograph of lambda bacteriophages (×13,500).]
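In the spirit of the circuit simulations cited in Box 2 (ref. 34 used electrical-circuit simulators; this is not that model), here is a deliberately minimal sketch of a mutual-repression switch. The Hill-function kinetics and every rate constant are invented for illustration, not taken from lambda biochemistry; the point is only that mutual repression yields two stable states.

```python
# Toy two-gene switch: each product represses the other's synthesis.
# All parameters are illustrative; the point is the bistability.
def step(ci, cro, dt=0.01, k=2.0, K=0.5, n=2, gamma=1.0):
    d_ci  = k / (1.0 + (cro / K) ** n) - gamma * ci   # 'Cro' represses 'CI'
    d_cro = k / (1.0 + (ci / K) ** n) - gamma * cro   # 'CI' represses 'Cro'
    return ci + dt * d_ci, cro + dt * d_cro

def settle(ci, cro, steps=20_000):
    for _ in range(steps):
        ci, cro = step(ci, cro)
    return round(ci, 2), round(cro, 2)

print(settle(1.0, 0.0))  # high-CI steady state: the 'lysogenic' branch
print(settle(0.0, 1.0))  # high-Cro steady state: the 'lytic' branch
```

Initial conditions displaced towards either protein settle into different stable states, which is the essence of a decision-making module.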

Is cell biology modular?
A functional module is, by definition, a discrete entity whose function is separable from those of other modules. This separation depends on chemical isolation, which can originate from spatial localization or from chemical specificity. A ribosome, the module that synthesizes proteins, concentrates the reactions involved in making a polypeptide into a single particle, thus spatially isolating its function. A signal transduction system, on the other hand, such as those that govern chemotaxis in bacteria or mating in yeast1–3, is an extended module that achieves its isolation through the specificity of the initial binding of the chemical signal (for example, chemoattractant or pheromone) to receptor proteins, and of the interactions between signalling proteins within the cell. Modules can be insulated from or connected to each other. Insulation allows the cell to carry out many diverse reactions without cross-talk that would harm the cell, whereas connectivity allows one function to influence another. The higher-level properties of cells, such as their ability to integrate information from multiple sources, will be described by the pattern of connections among their functional modules.

The notion of a module is useful only if it involves a small fraction of the cell components in accomplishing a relatively autonomous function. Are modules real? Several lines of evidence suggest that they are. Some modules, such as those for protein synthesis, DNA replication, glycolysis, and even parts of the mitotic spindle (the cellular machinery that ensures the correct distribution of chromosomes at cell division), have been successfully reconstituted in vitro. Others are intrinsically more difficult to reconstruct from purified components and, for these, other methods have established the validity of the module. One method is to transplant the module into a different type of cell. For example, the action potentials characteristic of nerve and muscle cells have been reconstituted by transplanting ion channels and pumps from such cells into non-excitable cells4. Another approach is to create theoretical models of the system and verify that their predictions match reality. This approach was used to describe the generation of action potentials long before a molecular description of membrane channels existed (see Box 1). This was the first example of 'in silico reconstitution', which will have an increasingly important role in cell biology.

Functional modules need not be rigid, fixed structures; a given component may belong to different modules at different times. The function of a module can be quantitatively regulated, or switched between qualitatively different functions, by chemical signals from other modules. Higher-level functions can be built by connecting modules together. For example, the supermodule whose function is the accurate distribution of chromosomes to daughter cells at mitosis contains modules that assemble the mitotic spindle, a module that monitors chromosome alignment on the spindle, and a cell-cycle oscillator that regulates transitions between interphase and mitosis.

One must also ask how a cell integrates information and instructions that come from the many different modules that monitor its internal and external environment. Neurobiology has an analogous problem, where the central nervous system integrates information from different senses and dictates the organism's behaviour. Does cellular integration merely emerge from a web of pairwise connections between different sensory modules, or are there specific modules that act as a cellular equivalent of the central nervous system — integrating information and resolving conflicts?

Complete understanding of a biological module has depended on the ability of phenomenological and molecular analyses to constrain each other (see Box 2). Phenomenological models have fewer variables than molecular descriptions, making them easier to constrain with experimental data, whereas identifying the molecules involved makes it possible to perturb and analyse modules in much greater detail. Thus, the demonstration that genetic information for virulence could be transferred between bacteria prompted the identification of the information-carrying molecule as DNA, before the molecular processes involved in virulence and the structure of DNA were understood. The discovery that genetic information resided in the DNA encouraged structural studies, which then suggested how DNA encodes information and transmits it from generation to generation.

Modular structures may facilitate evolutionary change. Embedding particular functions in discrete modules allows the core function of a module to be robust to change, but allows for changes in the properties and functions of a cell (its phenotype) by altering the connections between different modules. If the function of a protein were to directly affect all properties of the cell, it would be hard to change that protein, because an improvement in one function would probably be offset by impairments in others. But if the function of a protein is restricted to one module, and the connections of that module to other modules are through individual proteins, it will be much easier to modify, make and prune connections to other modules. This idea is supported by the analogous observation that proteins that interact with many other proteins, such as histones, actin and tubulin, have changed very little during evolution, and by theoretical arguments that proteins are difficult to evolve once they are participating in many different interactions5.

Understanding the relatedness of modules is useful because knowledge about one member of a class can inform the study of the others. Relatedness by descent is often apparent from the homology of chemical components. For instance, the mitogen-activated protein kinase cascades that occur in many intracellular signalling pathways define a common functional class of signal transduction modules. Modules may also be related by shared design or functional principles, even if they are not related by descent. The pheromone-detection system of budding yeast and the chemotactic machinery of bacteria use unrelated components, but both pathways achieve a sensitive response over a wide range of pheromone or chemoattractant concentrations by using reactions that specifically turn off active forms of the signalling receptors6–8.

Lessons from other sciences
We have argued that most functional properties of a module are collective properties, arising from the properties of the underlying components and their interactions. Collective properties have long been studied in statistical physics and share attributes that rise above the details9. For example, the melting of the surface of a solid can be induced in different ways: by changing the pressure or temperature or by adding impurities. Similarly, different organisms induce the transition between different patterns of microtubule organization that occurs during cell division by changing different members of the set of kinetic parameters that govern microtubule polymerization10,11. The concept of phase (or state) transitions from physics may help unify different observations and experiments. Moreover, many molecular details are simply not needed to describe phenomena on the desired functional level.

Biological systems are very different from the physical or chemical systems analysed by statistical mechanics or hydrodynamics. Statistical mechanics typically deals with systems containing many copies of a few interacting components, whereas cells contain from millions to a few copies of each of thousands of different components, each with very specific interactions. In addition, the components of physical systems are often simple entities, whereas in biology each of the components is often a microscopic device in itself, able to transduce energy and work far from equilibrium12. As a result, the microscopic description of the biological system is inevitably more lengthy than that of a physical system, and must remain so, unless one moves to a higher level of analysis.

Information flows bidirectionally between different levels of biological organization. For instance, the macroscopic signals that a cell receives from its environment can influence which genes it expresses — and thus which proteins it contains at any given time — or even the rate of mutation of its DNA13, which could lead to changes in the molecular structures of the proteins. This is in contrast to physical systems where, typically, macroscopic perturbations or higher-level structures do not modify the structure of the molecular components. For example, the existence of vortices in a fluid, although determined by the dynamics of molecules, does not usually change the nature of the constituents and their molecular interactions.

More importantly, what really distinguishes biology from physics are survival and reproduction, and the concomitant notion of function. Therefore, in our opinion, the most effective language to describe functional modules and their interactions will be derived from the synthetic sciences, such as computer science or engineering, in which function appears naturally.

The essence of computational science is the capacity to engineer circuits that transform information from one form into another on the basis of a set of rules. How might the lessons learned here apply to biology? Evolution selects those members of a genetically diverse population whose descendants proliferate rapidly and survive over many generations. One way of ensuring long-term survival is to use information about the current environment to predict possible future environments and generate responses that maximize the chance of survival and reproduction. This process is a computation, in which the inputs are environmental measurements, the outputs are signals that modulate behaviour, and the rules generate the outputs from the environmental inputs. For example, signals from the environment entrain circadian biological clocks to produce responses to predicted fluctuations in light intensity and temperature. Indeed, the history of life can be described as the evolution of systems that manipulate one set of symbols representing inputs into another set of symbols that represent outputs14.

Just as electrical engineers design circuits to perform specific functions, modules have evolved to perform biological functions.

Box 3 From atoms to modules in computers

Building a computer in the 1950s relied on understanding barium oxide, the material of choice for emitting electrons from the cathodes of vacuum tubes. A vacuum-tube module could then be designed whose function was the amplification of a signal, but whose functional description had no reference to barium oxide. These amplifiers were assembled into logical circuits, whose logical operation could be described without reference to vacuum tubes. The connecting wires in these logical circuits had insulation of many colours so that the circuits could be accurately hand-manufactured. Mathematical programs were then designed on the basis of these logic circuits. In the computer of today, there is no barium oxide or coloured wires. Instead, the properties of silicon and silicon dioxide are of primary importance in designing an amplifier, and the transistor replaces the vacuum tube. Both old and new computers have logic circuits based on the same elementary principles, but arranged rather differently as computers have become more sophisticated. Yet all this is unimportant to most users, whose computer program runs on either machine. At one level, barium oxide and coloured wires were the soul of the old machine, while at another level, they are irrelevant to understanding the essence of how a computer functions.

[Figure: An early stored-program computer (left), built around 1950, used vacuum tubes in logic circuits, whereas modern computers use transistors and silicon wafers (right), but both are based on the same principles.]
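Box 3's point, that a module's functional description can outlive its implementation, is the everyday experience of programmers. A toy rendering (the function names and the gain are invented here purely for illustration):

```python
# Two implementations, one functional description: 'amplify a signal'.
# Nothing downstream needs to know which device sits underneath.
def vacuum_tube_amplifier(signal: float) -> float:
    return 10.0 * signal          # stand-in for barium-oxide-era hardware

def transistor_amplifier(signal: float) -> float:
    return 10.0 * signal          # same function, different substrate

for amplify in (vacuum_tube_amplifier, transistor_amplifier):
    assert amplify(0.5) == 5.0    # the module-level description is unchanged
```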

The properties of a module's components and molecular connections between them are analogous to the circuit diagram of an electrical device. As biologists we often try to deduce the circuitry of modules by listing their component parts and determining how changing the input of the module affects its output15. This reverse engineering is extremely difficult. Although an electrical engineer could design many different circuits that would amplify signals, he would find it difficult to deduce the circuit diagram of an unknown amplifier by correlating its outputs with its inputs. It is thus unlikely that we can deduce the circuitry or a higher-level description of a module solely from genome-wide information about gene expression and physical interactions between proteins. Solving this problem is likely to require additional types of information and finding general principles that govern the structure and function of modules.

A number of the design principles of biological systems are familiar to engineers. Positive feedback loops can drive rapid transitions between two different stable states of a system, and negative feedback loops can maintain an output parameter within a narrow range, despite widely fluctuating input. Coincidence detection systems require two or more events to occur simultaneously in order to activate an output. Amplifiers are built to minimize noise relative to signal, for instance by choosing appropriate time constants for the circuits. Parallel circuits (fail-safe systems) allow an electronic device to survive failures in one of the circuits. Designs such as these are common in biology. For example, one set of positive feedback loops drives cells rapidly into mitosis, and another makes the exit from mitosis a rapid and irreversible event16. Negative feedback in bacterial chemotaxis allows the sensory system to detect subtle variations in an input signal whose absolute size can vary by several orders of magnitude17. Coincidence detection lies at the heart of much of the control of gene transcription in eukaryotes, in which the promoters that regulate gene transcription must commonly be occupied by several different protein transcription factors before a messenger RNA can be produced. Signal transduction systems would be expected to have their characteristic rate constants set so as to reject chance fluctuations, or noise, in the input signal. DNA replication involves a fail-safe system of error correction, with proofreading by the DNA polymerase backed up by a mismatch repair process that removes incorrect bases after the polymerase has moved on. A failure in either process still allows cells to make viable progeny, but simultaneous failure of both is lethal. In both biological and man-made systems, reducing the frequency of failure often requires an enormous increase in the complexity of circuits. Reducing the frequency at which individual cells give rise to cancer to about 10⁻¹⁵ has required human cells to evolve multiple systems for preventing mutations that could generate cancer cells, and for killing cells that have an increased tendency to proliferate.

Biological systems can both resist and exploit random fluctuations, or noise. Thus, evolutionary adaptation depends on DNA being mutable, but because most mutations are neutral or deleterious, the rate of mutation is under rigorous genetic control. Many systems for specifying the polarity of cells or groups of cells rely on a mechanism known as 'lateral inhibition', which causes adjacent cells to follow different fates. This process can amplify a small, often stochastic, initial asymmetry, causing adjacent cells or adjacent areas within cells to follow different fates (see the sketch below).

Other aspects of functional modules are less familiar to engineers. Several can be subsumed under the idea that the rules for a module's function are rigidly encoded in the structures of its proteins, but produce messy, probabilistic intermediates that are then refined to give unique solutions. This principle seems to hold across an enormous range of scales, from the folding of protein molecules to the evolution of organisms.
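Picking up the lateral-inhibition mechanism mentioned above: a minimal sketch, with invented Hill kinetics and rate constants, in which two near-identical neighbours repress each other's 'fate signal' and a tiny random asymmetry is amplified into opposite fates.

```python
import random

# Caricature of lateral inhibition between two adjacent cells.
# All parameters are invented for illustration.
def resolve(steps=2000, dt=0.01):
    a = 1.0 + random.uniform(-0.01, 0.01)   # near-identical starting levels
    b = 1.0 + random.uniform(-0.01, 0.01)
    for _ in range(steps):
        da = 2.0 / (1.0 + b ** 4) - a       # b inhibits a's fate signal
        db = 2.0 / (1.0 + a ** 4) - b       # a inhibits b's fate signal
        a += dt * da
        b += dt * db
    return round(a, 2), round(b, 2)

print(resolve())  # one neighbour ends high (~2.0), the other low (~0.1)
```

The symmetric state is unstable, so which cell 'wins' is decided by the initial stochastic asymmetry, exactly the behaviour the text describes.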


The principle arises from a combination of three mechanisms: exploration with selection (trial and error), error-correction mechanisms, and error-detection modules that delay subsequent events until a process has been successfully completed. These are present to different extents in different examples.

Exploration with selection (trial and error) is a fundamental principle of biology and acts on timescales from milliseconds to aeons and at organizational levels from single molecules to populations of organisms. In single molecules, kinetic funnels direct different molecules of the same protein through multiple, different paths from the denatured state to a unique folded structure18. Within cells, the shape of the mitotic spindle is partly due to selective stabilization of microtubules whose ends are close to a chromosome19. At the organismal level, the patterning of the nervous system is refined by the death of nerve cells and the decay of synapses that fail to connect to an appropriate target. Within populations, differential reproductive success alters the structure of gene pools, giving rise to evolution.

The use of exploration with selection on short timescales as a design principle in intracellular modules may make them especially easy to modify on evolutionary timescales. The lack of a rigidly programmed sequence of intermediates should allow such modules to survive incremental modifications and incorporate evolutionary additions such as error detection and correction. Similar messy and probabilistic intermediates appear in engineering systems based on artificial neural networks — mathematical characterizations of information processing that are directly inspired by biology. A neural network can usefully describe complicated deterministic input–output relationships, even though the intermediate calculations through which it proceeds lack any obvious meaning and their choice depends on random noise in a training process20.

Constraints from evolution
One approach to uncovering biological design principles is to ask what constraints they must obey. Apart from the laws of physics and chemistry, most constraints arise from evolution, which has selected particular solutions from a vast range of possible ones.

Today's organisms have an unbroken chain of ancestors stretching back to the origin of life. This constraint has been successfully used to understand protein functions, by comparing existing protein sequences from related species, finding conserved parts and inferring their roles. Comparing modules of common function from different organisms should be a similarly useful tool for understanding their operation. One has also to remember that today's modules were built by tinkering with already functional modules, rather than by starting from scratch, and may not be the optimal way of solving a particular problem21. This evolutionary history is similar to that of man-made devices. Particular solutions in computing, or for any engineered object, are the result of an elaborate historical process of selection by technological, economical and sociological constraints. A familiar example is the less than optimal QWERTY keyboard, originally invented to prevent jammed keys on early manual typewriters. It can be viewed as a living fossil.

The survival of living systems implies that the critical parameters of essential modules, such as the accuracy of chromosome segregation or the periodicity of a circadian clock, are robust: they are insensitive to many environmental and genetic perturbations. Evolvability22, on the other hand, requires that other parameters of modules are sensitive to genetic changes. They can then be modified over many generations to alter the function of a module, or its connections to other modules, in a way that allows organisms to adapt to new challenges. It is important to understand how robustness and flexibility can be reconciled for each functional module.
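Exploration with selection is easy to caricature in code. The toy below proposes random changes to a 'sequence' and keeps any that do not reduce a made-up fitness score; the target, alphabet and scoring are all arbitrary choices for illustration.

```python
import random

# Trial-and-error search: propose random changes, keep improvements.
# Toy fitness: how close a 'sequence' of digits is to a fixed target.
TARGET = [3, 1, 4, 1, 5, 9, 2, 6]

def fitness(seq):
    return -sum(abs(s - t) for s, t in zip(seq, TARGET))

def evolve(generations=500):
    seq = [random.randint(0, 9) for _ in TARGET]
    for _ in range(generations):
        mutant = list(seq)
        mutant[random.randrange(len(mutant))] = random.randint(0, 9)  # exploration
        if fitness(mutant) >= fitness(seq):                           # selection
            seq = mutant
    return seq

print(evolve())  # typically converges on the target sequence
```

The intermediate states mean nothing in themselves, yet the loop reliably finds the target, which is the signature of trial and error.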


Organisms have been selected for two properties: rapid reproduction in optimal conditions and the ability to survive rarely encountered extreme conditions. Because environments tend to fluctuate over time, most modules are likely to have been selected for their ability to contribute to both reproduction and survival. These considerations imply that understanding the full function of modules may require us to measure small differences in reproductive ability, as well as studying the performance of modules under extreme perturbations. Some components of an in vivo module that are 'nonessential' in normal laboratory conditions are likely to have important roles in the assembly, fidelity, robustness and dynamic characteristics of modules that produce small advantages in long-term survival probability. It may be very difficult, however, to measure such contributions directly.

Survival of a gene pool, as opposed to an individual organism, is favoured by diversification, as the simultaneous presence of multiple phenotypes in a population increases the possibility that some individuals will survive and reproduce in a heterogeneous and changing environment. Diversification can be achieved by epigenetic mechanisms that enable a single genotype to produce more than one phenotype, by genetic mechanisms that maintain multiple genotypes in a population, and by speciation, which splits a single gene pool into two independently evolving pools. It is thus important to consider the function of modules not only in the context of an organism, but also from the point of view of the population of organisms, and to ask how modular construction facilitates the maintenance and selection of diversity.

Towards modular biology
A major challenge for science in the twenty-first century is to develop an integrated understanding of how cells and organisms survive and reproduce. Cell biology is in transition from a science that was preoccupied with assigning functions to individual proteins or genes, to one that is now trying to cope with the complex sets of molecules that interact to form functional modules23,24. There are several questions that we want to answer. What are the parts of modules, how does their interaction produce a given function, and which of their properties are robust and which are evolvable? How are modules constructed during evolution and how can their functions change under selective pressure? How do connections between modules change during evolution to alter the behaviour of cells and organisms?

The number of modules that have been analysed in detail is very small, and each of these efforts has required intensive study. Biologists need to study more functions at the modular level and develop methods that make it easier to determine the relationship of inputs to outputs of modules, their biochemical connectivity, and the states of key intermediates within them. Three complementary approaches can help in this task: better methods for perturbing and monitoring dynamic processes in cells and organisms; reconstituting functional modules from their constituent parts, or designing and building new ones; and new frameworks for quantitative description and modelling of modules.

The first approach is illustrated by efforts to find small organic molecules that can perturb and report on the activity of modules. Calcium-binding dyes have been used to follow the activities of neurons with high spatial and temporal resolution. Light-activated chemicals can perturb function on the timescales that characterize changes within the modules, thus giving them an important advantage over the slower perturbations produced by classical and molecular genetics25. Another example of a new method for monitoring cellular processes is genome-wide analysis of gene expression26,27. But techniques for collecting information about the entire genome will be only as powerful as the tools available to analyse it, just as our ability to infer protein structure and function from protein sequence data has increased with the sophistication of tools for sequence analysis. We need better methods of finding patterns that identify networks and their components, of identifying possible connections among the components, and of reconstructing the evolution of modules by comparing information from many different organisms.

Another approach to discerning module function is that of 'synthetic biology'. Just as chemists have tested their understanding of synthetic pathways by making molecules, biologists can test their ideas about modules by attempting to reconstitute or build functional modules. This approach has already been used to construct and analyse artificial chromosomes made by assembling defined DNA elements28, and cellular oscillators made from networks of transcriptional regulatory proteins (M. Elowitz and S. L., unpublished results). Seeing how well the behaviour of such modules matches our expectations is a critical test of how well we understand biological design principles.

The main difficulty in reconstructing the evolution of modules is our ignorance about past events. One solution to this problem is to examine the evolution of module function in the laboratory. Analysing multiple repetitions of such experiments may tell us how much the path of future change is restricted by the current structure and function of modules, and should help us to understand how evolutionary pressures constrain biological design principles.

Finally, we emphasize the importance of integrating experimental approaches with modelling and conceptual frameworks. The best test of our understanding of cells will be to make quantitative predictions about their behaviour and test them. This will require detailed simulations of the biochemical processes taking place within the modules. But making predictions is not synonymous with understanding. We need to develop simplifying, higher-level models and find general principles that will allow us to grasp and manipulate the functions of biological modules. The next generation of students should learn how to look for amplifiers and logic circuits, as well as to describe and look for molecules and genes (Box 3). Connecting different levels of analysis — from molecules, through modules, to organisms — is essential for an understanding of biology that will satisfy human curiosity.

Leland H. Hartwell is at the Fred Hutchinson Cancer Center, Seattle, Washington 98109, USA. John J. Hopfield and Stanislas Leibler are in the Department of Molecular Biology (J.H.) and Department of Physics and Molecular Biology (S.L.), Princeton University, Princeton, New Jersey 08542, USA. Andrew W. Murray is in the Department of Physiology, University of California at San Francisco, San Francisco, California 94143, USA.

1. Stock, J. B. & Surette, M. G. in Escherichia coli and Salmonella: Cellular and Molecular Biology (eds Neidhardt, F. C. & Curtiss, R.) 1103–1129 (ASM Press, Washington, DC, 1996).
2. Herskowitz, I. Cell 80, 187–197 (1995).
3. Posas, F., Takekawa, M. & Saito, H. Curr. Opin. Microbiol. 1, 175–182 (1998).
4. Hsu, H. et al. Biophys. J. 65, 1196–1206 (1993).
5. Waxman, D. & Peck, J. R. Science 279, 1210–1213 (1998).
6. Konopka, J. B., Jenness, D. D. & Hartwell, L. H. Cell 54, 609–620 (1988).
7. Goy, M. F., Springer, M. S. & Adler, J. Proc. Natl Acad. Sci. USA 74, 4964–4968 (1977).
8. Barkai, N. & Leibler, S. Nature 387, 913–917 (1997).
9. Anderson, P. W. Science 177, 393–396 (1972).
10. Belmont, L. D., Hyman, A. A., Sawin, K. E. & Mitchison, T. J. Cell 62, 579–589 (1990).
11. Gliksman, N. R., Parsons, S. F. & Salmon, E. D. J. Cell Biol. 119, 1271–1276 (1992).
12. Alberts, B. & Miake-Lye, R. Cell 68, 415–420 (1992).
13. Shapiro, J. A. Ann. NY Acad. Sci. 870, 23–35 (1999).
14. Hopfield, J. J. J. Theor. Biol. 171, 53–60 (1994).
15. Bray, D. Nature 376, 307–312 (1995).
16. Morgan, D. O. Annu. Rev. Cell Dev. Biol. 13, 261–291 (1997).
17. Berg, H. C. Cold Spring Harb. Symp. Quant. Biol. 53, 1–9 (1988).
18. Dill, K. A. & Chan, H. S. Nature Struct. Biol. 4, 10–19 (1997).
19. Heald, R. et al. Nature 382, 420–425 (1996).
20. Sejnowski, T. J. & Rosenberg, C. R. Complex Systems 1, 145–168 (1987).
21. Jacob, F. Science 196, 1161–1166 (1977).
22. Kirschner, M. & Gerhart, J. Proc. Natl Acad. Sci. USA 95, 8420–8427 (1998).
23. Nurse, P. in Limits of Reductionism in Biology (Ciba Foundation Symp. 213) 93–101 (1998).
24. Strohman, R. C. Nature Biotech. 15, 194–200 (1997).
25. Adams, S. R. & Tsien, R. Y. Annu. Rev. Physiol. 55, 755–784 (1993).
26. Schena, M., Shalon, D., Davis, R. W. & Brown, P. O. Science 270, 467–470 (1995).
27. Brown, P. O. & Botstein, D. Nature Genet. 21, 33–37 (1999).
28. Murray, A. W. & Szostak, J. W. Nature 305, 189–193 (1983).
29. Hodgkin, A. L. & Huxley, A. F. J. Physiol. 117, 500–544 (1952).
30. Herskowitz, I. & Hagen, D. Annu. Rev. Genet. 14, 399–445 (1980).
31. Lwoff, A. Bact. Rev. 17, 269 (1953).
32. Ptashne, M. Nature 214, 232–234 (1967).
33. Ptashne, M. et al. Cell 19, 1–11 (1980).
34. McAdams, H. H. & Shapiro, L. Science 269, 650–656 (1995).

timescales

Flashes in femtoseconds
For the molecule, time ticks in picoseconds, or even femtoseconds — respectively 10⁻¹² and 10⁻¹⁵ s. On such timescales, a molecule tumbles through space, an
atom vibrates back and forth in a crystal, a chemical bond is forged or broken. Yet the development of pulsed lasers blinking in a femtosecond staccato of flashes has
enabled these fundamental molecular processes to be charted, revealing the rise and decline of ephemeral intermediates under the furiously strobed illumination.
Femtosecond spectroscopy has shown us the dynamics of bond breaking — but what of the structural aspects? When will we be able to map out
the coordinates of individual atoms, as, for example, a protein docks onto the cell surface or as a chlorine radical eliminates an ozone
molecule? When will we see the first atomically resolved diffraction pattern of a true transition state, and watch frame by frame as
vibrations carry atoms into new unions?
This demands a marriage of the techniques of ultrafast spectroscopy with those of X-ray diffraction. The probe
beam becomes a pulsed X-ray source, its flashes brief enough to ‘freeze’ the atomic motions yet bright enough to provide a
discernible diffraction pattern. Current synchrotron sources can be pulsed at around 50 picoseconds, whereas ultrashort
laser pulses can excite bursts of X-rays from suitable targets that could, in principle, be over within 100 femtoseconds.
These pulses must be synchronized with an optical pumping beam, which gets the reaction underway. An
alternative is to diffract electrons rather than X-rays. Diffraction on the ‘molecular’ timescale of femtoseconds is an
infant discipline which promises wonders once perfected, but which is capable right now of only the crudest of
impressionistic sketches: blurred images of lattice dynamics, showing evidence of rapid change but without a single
molecule (let alone an atom) in focus. The static photography of the Braggs has yet to produce its first movie. Philip Ball

Arrhythmias
‘His heart raced’. ‘Her heart skipped a beat’. ‘My heart beat sluggishly in my breast.’ Literature is peppered with such lines. Why?
Because, as many a dramatic writer has noted, a disturbance in that most basic ‘rhythm of life’ — the normal human heart rate of
between 60 and 100 beats per minute — is a sign that something is amiss.
Normally a heartbeat starts in the right atrium of the heart, in a specialized group of pacemaker cells known as the sinus node.
This node sends an electrical signal, which spreads through the atria and ventricles of the heart, following a precise route that
induces the heart to beat in the most efficient way.
If the sinus node develops an abnormal rate or rhythm, if the conduction pathway of its electrical signals is faulty or if another
part of the heart takes over as the pacemaker, arrhythmias — as the various types of irregular heartbeat are generically known
— can ensue. Arrhythmias may be due to heart disease, ageing, medications, metabolic imbalances or other genetic or infectious
medical problems. And the ineffective blood pumping that they give rise to can cause fatigue, dizziness, light-headedness, fainting
spells and, ultimately, strokes and death.
Oscillations in the cells of the sinus node arise from a cyclic decay of the potential difference between the outside and the inside of the membrane. This decay causes the membrane to fire spontaneously, which raises its conductance and allows electrolytes to rush in, thus reversing the potential across the membrane. This reversed charge propagates to neighbouring cells, triggering the same sequence of events there and hence a concert of muscular contraction.
But it is currently unclear exactly how cyclically expressed genes and proteins give rise to cyclic decay in
membrane potential in the heart’s muscle cells. So, just as in many other areas of medical biology, the challenge now is
twofold. First, to expand our knowledge of the genetics and cell signalling processes that give rise to healthy heartbeats. And
second, to integrate this into a far richer picture of the workings, and failings, of the whole heart. Sara Abdulla

Does the past have a future?


Hardly a week goes by without the discovery of some amazing fossil that will change the way we look at the history of life. But
how long can the fossil bonanza last? After all, the Earth has only so many rocks to investigate. Will there come a time
when we can say that the fossil record is substantially complete, in that the likelihood of discovering radically new forms, or
significant range extensions of known forms, becomes negligible? If so, when will that time arrive? And when it does, how will
this influence the practice of palaeontologists?
“I think we are more or less at that point now,” says Andrew Smith of the Natural History Museum in London. “The
available outcrops are pretty heavily searched within the developed nations, and major headway has been made in exploring
places like Mongolia and Madagascar.” His prognosis is, however, tempered by the fact that absolute completeness is an
impossibility, because the vast majority of organisms do not end up as fossils: “my belief is that we will have a
substantially biased fossil record that cannot be bettered no matter where we go in the world”.
Charles Marshall of Harvard University agrees that the Earth’s fossil riches have been fairly well worked out, but says
that one cannot discern a point when the last fossil will have been extracted. “One can’t simply extrapolate from what
has been discovered because radical new finds often result from unanticipated ways of looking, often in stratigraphic
intervals or rock types not typically examined. If the rate of recent discoveries, from fossil embryos to down-covered dinosaurs, is any indication there is much to be discovered yet!”
However, contrary to popular iconography, most palaeontologists do not devote most of their
time to making radical discoveries. Such discoveries, says Marshall, will be — as they are now —
“largely the unexpected by-products of mature research programmes designed to establish the
patterns of evolution and elucidate the processes responsible for those patterns.”
“The obvious priority,” says Smith, “will be to establish and understand how sequence stratigraphy and sea-level change
control facies preservation, and thus control apparent biotic diversity and extinction patterns through time. Only then will we begin to get a reliable handle on how
diversity has changed over time.” Henry Gee


Keeping time

How finely can time be measured? The answer might define our capacity to conduct precise tests of general relativity, as demonstrated already in the observed
increase in clock rate with distance from the Earth’s surface, or the measurement of relativistic effects in binary pulsar systems. Spacecraft navigation,
interferometric radio astronomy and Global Positioning System geodesy are all dependent on timekeeping that taxes current technology. And accurate
measurements of the second define other standards, such as the units of distance and voltage.
Measuring time is about monitoring oscillations, whether of the pendulum, the quartz crystal or the caesium atom. By relying on
atoms for our frequency (and hence time) standards, we may at least avoid placing ourselves at the mercy of the machinist who
made the device; but on the other hand, we fall foul of the Uncertainty Principle. Atomic transitions, of which the hyperfine
splitting of energy levels in caesium-133 furnishes us with today’s standard second, happen at a frequency that can
ultimately be well defined only to the degree that the excited state is long-lived. The shorter the lifetime — or the duration
of the measurement — the greater the uncertainty in energy (and frequency) of the transition.
So to get a more accurate result, one needs a more leisurely attitude to measurement. This is the
principle behind the ‘atom fountain’, a device that first cools atoms within an optical ‘molasses’ of three crossed laser
beams before gently launching them vertically so that they fall back under gravity. The atoms can then be probed
outside the perturbing influence of the laser field.
Towards the top of the fountain, the atoms pass twice through a microwave cavity: once on the way up, once on
the way down, separated by about a second. Invented by Norman Ramsey in 1949, this ‘separated oscillatory-field
method’ is applied to a linear atomic beam in current atomic clocks. But the atom fountain permits a much longer span
between pulses, and so a greater narrowing of the transition. In principle these devices offer a frequency determination
accurate to one part in 10¹⁶, which is equivalent to a variation of about one second in 300 million years. Philip Ball
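As a rough illustration of why the fountain helps, the snippet below applies the idealized Ramsey-linewidth scaling, linewidth ≈ 1/(2T) for free-evolution time T, ignoring atom numbers, averaging time and every practical systematic; the two values of T (milliseconds for a beam clock, a second for a fountain) are representative guesses, not measured figures.

```python
# Ramsey linewidth narrows with the free-evolution time T between the
# two microwave pulses: delta_f ~ 1 / (2 * T) (idealized scaling only).
f_cs = 9_192_631_770        # caesium-133 hyperfine frequency, Hz (defines the second)
for T in (0.005, 1.0):      # beam clock (~ms) versus fountain (~s)
    delta_f = 1.0 / (2.0 * T)
    print(f"T = {T:5.3f} s -> linewidth {delta_f:7.1f} Hz, "
          f"fractional resolution {delta_f / f_cs:.1e}")
```

Reaching one part in 10¹⁶ then depends on splitting this already-narrow line by a large factor through repeated measurements and long averaging times.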

Circadian clocks
How do organisms respond to their most fundamental timescale — the Earth’s rotation around its own axis? With a
suite of interconnected hormonal and physiological loops that make up the ‘circadian timing system’ or CTS.
Every CTS — whatever the organism — is made up of light receptors, an endogenous ‘biological’ clock and a so-called
‘entrainment’ system to couple one with the other. For although every cell of every biological or circadian clock behaves cyclically,
these intrinsic periods are not necessarily exactly 24 hours long. The clock has to be set.
Much of the pathology and endocrinology of the circadian clock has been elucidated in the past century. In humans, for
instance, many sections of the route from retina to brain to heart-muscle action, metabolism, body temperature and the
organization of sleep–wake cycles have been charted. But there are still major areas of uncertainty, especially in the newest
branch of circadian biology: the identification and study of genes necessary for normal timekeeping.
Clock genes seem to be present in all cells, even though only a subset is capable of acting as self-sustained oscillators.
So do these genes have other, global, non-circadian functions in cell biology? As Russell Foster of Imperial College, London, puts
it: “cellular assays have shown the way forward, now we need good functional analysis using transgenic approaches and
behavioural studies”.
Moreover, only a few animal systems have been studied so far in any detail — the mouse, the fly and the fungus,
Neurospora. This has revealed some remarkable similarities but also some exciting differences. Comparisons of additional
species will surely tell us more about the evolution and adaptive significance of circadian systems and their interactions with
their environments. All of which will hopefully feed back into a fine-tuned understanding of, and ability to treat or even to
manipulate, the one timepiece that we all care about — the human body. Sara Abdulla

The challenge of conservation


Some disciplines move faster than others, but if there is one moving much faster than its practitioners would like,
it is conservation biology. In the next century, conservationists will be less interested in pristine wilderness — for such will
hardly exist — than in the success with which human beings manage the biosphere to the benefit of all its inhabitants, including
humans. The world is currently in an unprecedented situation in which Homo sapiens — just one species out of millions —
sequesters 45 per cent of the planet’s net terrestrial productivity and 55 per cent of its fresh water. Effective biodiversity systems
management will become an imperative if only to maintain the quality of human life, let alone that of other species.
In the coming decades, conservation biology will become a close analysis of the art of compromise, as humanity seeks
to understand the global ecosystem as a participant, not as an observer. “Within a century,” says Peter Raven, director of the
Missouri Botanic Garden, “the world will be a patchwork with varying degrees of biological richness, and the way people
respond locally to challenges will determine the nature of that compromise.”
What will we have lost by 2100? Raven estimates that two-thirds of all species will disappear if current trends
continue, but is nevertheless optimistic. “We can do better,” he says. “No major category of
organisms, or local community of organisms, need disappear completely.”
Ecologists are acutely aware that the maintenance of ecosystem function transcends the
conservation of any particular species, so high-technology initiatives such as ‘ex situ’ conservation
— maintaining banks of germplasm outside the context of a habitat — may become a side issue.
“High technology is unlikely to solve most of the major global environmental issues that we face,”
says David Tilman of the University of Minnesota. “Rather, lifestyle changes, most likely imposed by legislation, will be needed. Such legislation will be socially
sustainable only once citizens understand their long-term dependence on biodiversity and ecosystem services.” And Raven adds that it is “better for now to follow
Aldo Leopold’s dictum: ‘the first rule of intelligent tinkering is to save all the cogs and wheels’”. Henry Gee


Physics at the Planck time


Superstring theory, perhaps the most notorious attempt to unify general relativity with quantum mechanics, was once described as a piece of
twenty-first-century physics that fell into the twentieth century. Certainly, physicists struggling to marry these reluctant partners have long sensed that the
crucial piece of the puzzle has yet to be invented. But a theory of quantum gravity remains more a question of tidiness than anything else — quantum effects on space­
time become important only on time and length scales well out of experimental reach. A quantum field theory of electromagnetism has plenty of
practical relevance for applied physics, and such a field theory for the nuclear forces generates no shortage of testable predictions. But
gravity yields to the quantum rod only at the so-called Planck scale: over distances of about 10⁻³³ cm and times of 10⁻⁴³ s.
A physics to match the Planck timescale is the biggest challenge to physicists in the coming century. This is the
timescale relevant to the graviton, the putative carrier of the gravitational force. And naturally enough, only such a
theory can extend our understanding of the Big Bang inside the first 10⁻⁴³ seconds of existence.
What will a theory of quantum gravity do to space-time? Will it be like John Wheeler’s ‘quantum foam’, furiously
alive with ephemeral black holes and worm holes? Or will it be more like Abhay Ashtekar’s fabric of woven loops? Will
particles become strings, and will the strings be supersymmetric? How many extra dimensions will turn up, curled out
of sight over the Planck distance?
And will we ever be able to test such a theory? One quantum-gravity researcher has confessed that “it seems
highly unlikely that a machine will ever be built with which these minute distances can be studied directly”. For
guidance, it seems that physicists may have to fall back on the old stand-bys: elegance and beauty. Philip Ball

The speed of computers


One version of Moore’s Law, predicted by Intel co-founder Gordon Moore in 1965, has it that computer chips double in speed every
18 months. Nothing moves any faster, of course — the chips simply become more powerful by packing more devices into the same
space. Speed is about miniaturization. Recent research on the ultimate size limits for silicon technology has fuelled
speculation about whether Moore’s Law is headed for a crash. By 2012, at the current rate of shrinkage, transistors will
have become so small that the silicon oxide films will no longer guarantee adequate insulation between conducting regions: they
become switches that cannot be fully turned off. What happens then to our expectation that the next generation of machines will
be better, meaner and faster?
All manner of factors determine how quickly, say, the Nature web page downloads onto your screen, many of them scarcely
connected to the component density on the chips inside the processor. In some parts of the world it can take ten minutes or more.
Yet you would be waiting a lot longer if the data were streaming down old copper telephone cables rather than glass optical fibres.
Fibre-optic networks have a greater carrying capacity that speeds up transmission by one or two orders of magnitude relative to
electrical lines. But that is only a fraction of what should be possible with photonic technology.
And the electronics at either end are soon to be stretched to their limit to cope with the transmission speeds that photonics
offers. Gigabit-per-second distributed networks have already been demonstrated, and laser diodes can in principle blink out 100
Gbits per second; but electronics lose the ability to cope at half that rate. All-optical networks will then be required, and
networks working at 100 Gbits per second are already in development. At some point, the light will need to go on a
chip, in photonic integrated circuits.
But photonics is an embryonic technology, and plagued with obstacles in comparison with mature electronics. No clear rival
to the transistor has yet emerged from this or any other direction, such as single-electron devices and molecular electronics. So
perhaps we will have to adjust our expectations as the information revolution reaches a plateau. Moore predicts that we have
about two generations of computers left with current technologies, and “beyond that, life gets very interesting”. Philip Ball
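The dates invite a quick back-of-the-envelope check, a naive extrapolation that ignores the physical limits just described:

```python
# Doubling every 18 months: projected speed-up factor from 1999 to 2012
# under a literal reading of Moore's law (no physical limits applied).
years = 2012 - 1999
doublings = years * 12 / 18
print(f"{doublings:.1f} doublings -> roughly x{2 ** doublings:,.0f}")  # ~8.7 doublings, ~400x
```

A four-hundred-fold speed-up is what the extrapolation promises; whether the oxide films cooperate is another matter.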

Lifespan extension
Eat more fibre! Consume less cholesterol! Drizzle more olive oil! Have fewer children later in life! Drink a glass of red wine a
day! And so it goes on. Rarely a day goes by without the emergence of a new ‘theory’ in that most seductive realm of scientific
advances — lifespan extension.
Chromosome structure hints that our cells have genetic ‘fuses’ that mete out their allotted time. Antibiotics, vaccinations
and antiseptics have brought about hikes in life expectancy in the developed world which, as this century draws to a close, have raised the average lifespan to almost double that at the last fin de siècle — nigh-on 80 years and still rising. Several once-intractable cancers, such as Hodgkin’s lymphoma and leukaemia, have been tamed, and many others look set to follow soon; the hard nut of cognitive decline is on the verge of being cracked with drugs and neuron replacement; and workable
synthetic organs and xenotransplants are around the corner.
But the current obsession with all things genetic, molecular, high-tech and expensive risks missing the point that
both the World Bank and World Health Organization (WHO) make patently clear. That is, for the majority of the
planet the most powerful lifespan-extending technologies of all — good, basic public health and
preventative medicine — are still a far-off dream, even if they are, scientifically speaking,
old hat.
As we usher in the third millennium, the WHO estimates that the citizens of the world’s poorest
countries (who make up two-fifths of the global population) have almost half the life expectancy —
at under 50 years — of those living in the richest nations. And that gap is widening, not narrowing — often from both directions at once. One third of the planet’s population does not have secure access to safe water or sufficient food, let alone
drugs, immunizations, or even the simplest health education. Elsewhere, billions of dollars are being spent teasing out the intricacies of gene therapy and memory
enhancement. Addressing this inequity is surely one of the greatest and most pressing challenges facing us all, if not as academics, then as people. Sara Abdulla


Computing 2010:
from black holes to biology
Declan Butler

By 2010, a click on the PC on your desktop will suffice to call up instantly all
the computing power you need from what by then will be the world’s largest
supercomputer, the Internet itself. Supercomputing for the masses will
trigger a revolution in the complexity of problems that are tackled, whole
disciplines will go digital and, rather than spending time collecting their own
data, scientists will organize themselves around shared data sets.
“PAL, I’m ready; set up that conference call.” The teraflop PC in George’s Chicago laboratory obeys. Detectors stir and projectors purr. Ceiling, walls and floor shimmer sapphire. The virtual-reality cavern ready, PAL connects to the Internet at ten gigabytes per second, beaming in three-dimensional avatars of Juliette in Paris and Günther in Munich. The three shake hands. The experiment begins.
Elsewhere on the Internet, PAL is busy interpreting and assembling data sets from research centres across the globe, while simultaneously negotiating CPU time from a supercomputer in New Mexico, and petaflop clusters of PCs in London and Prague. Operation accomplished. A kaleidoscope cloud of millions of glowing pixels suddenly coalesces in the centre of the laboratory, sculpting two floating black holes that swirl slowly around one another.
Koichi, the principal investigator, takes control of the input parameters, steering a collision of the black holes using a handheld screen from his bedroom in Tokyo. One of the most violent events in the Universe unfolds in fast-forward in front of the scientists, as the black holes whirlpool in the final plunge, merge and mushroom.
Within minutes, PAL has crunched the terabytes of output. “Back in the 1990s, it would have taken months,” muses George to himself. The results scroll down the high-resolution wall. An unequivocal first: a strong burst of gravitational waves in the last milliseconds of the collision, exactly as predicted by Einstein’s theory of relativity.
The excitement builds. PAL relays even better news — that the bursts also have a specific waveform — to its counterpart machine at VIRGO, the European gravitational-wave detector in Cascina, Italy. VIRGO had come on-line in 2008, and its kilometres-long laser interferometers had amassed a haystack of data; but finding the needle — tell-tale vibrations a tiny fraction of an atomic nucleus in magnitude — had proved elusive. VIRGO now had a vital clue as to what to look for.
The scientists pace for what seems an eternity, to be interrupted by on-line gatecrashers from VIRGO, popping champagne. PAL’s waveform template matches blips buried in the noise of VIRGO’s massive data sets. The signal-to-noise ratio adjusted, the blips even yield a first guess at the mass of a pair of real black holes. PAL generates a virtual draft manuscript, which is passed around, quickly edited and passed back. The computer rapidly ranks the citation and media impact of the publication in the EMBIH global literature repository as well as the handful of journals that now exist, and — showing signs of intelligence — decides to e-mail the paper to Nature. All in a day’s work in 2010.
Journalistic licence? Certainly. Fiction? Probably not. The core science of the black-hole collision, and much of the Internet technology, was demonstrated in June this year, when a team led by Ed Seidel of the Albert Einstein Institute in Potsdam, and Wai-Mo Suen of Washington University in St Louis, broke supercomputing records with a 140,000-CPU-hour run that produced almost a terabyte of data, using the 256-processor ‘Origin2000’ at the US National Center for Supercomputing Applications (NCSA).
Prototype systems emerging from the world’s most advanced computing and networking test beds suggest that such high-end virtual reality, modelling and visualization will be routine over tomorrow’s Internet. A proprietary virtual-reality theatre, CAVE, under development at the Electronic Visualization Laboratory of the University of Illinois at Chicago, is one of several systems for remote collaboration — or ‘tele-immersion’ — that are up and running. Nor will using such technologies require booking scarce run-time on supercomputers in advance; a click on the PC on your desktop will suffice to call up instantly all the computing power you need, from what by then will be the world’s largest supercomputer, the Internet itself.
By 2010, today’s top-end computing promises to be no longer the exotic pastime of a privileged few, but the routine of the masses. A vision is emerging of a global computing grid, where computers of any shape and size are linked across the Internet to assemble giant distributed supercomputers. Scientists in every discipline will, from their desktops, have access to raw computing power that dwarfs today’s most powerful supercomputers.
Seidel admits that simulations of black-hole collisions are still far from being sophisticated enough to provide realistic help to the planned European VIRGO and US LIGO (Laser Interferometer Gravitational-wave Observatory) gravitational-wave observatories. But he predicts that they soon will be, as advances in computing power and algorithms enable such complex simulations to model reality for the first time.
[Figure: Black-hole simulation showing a just-formed larger black hole that resulted from the collision of two smaller black holes. The surface of the newly formed hole is shown as a coloured, semi-transparent surface in the centre of the simulation, allowing one to see the two original holes inside. A burst of gravitational waves results from the collision, shown as the red-yellow wisps emanating from the central region. Image: AEI/NCSA/Univ. Washington.]

The wide availability of massive computing power to help tackle complex systems will prompt a profound change in science itself, predicts Ruzena Bajcsy, assistant director of the US National Science Foundation (NSF)’s directorate for computer and information science and engineering. She believes it will close the chapter of Cartesian reductionism in research, and usher in a new era in which researchers will increasingly take on the understanding of complex dynamic systems, such as whole cells and the Earth. To advance the global grid, NSF is funding prototypes for scientific use — through the National Partnership for Advanced Computational Infrastructure, a consortium led by the San Diego Supercomputer Center — and for the population at large — through the National Computational Science Alliance, an NCSA-led consortium of over 50 research agencies and universities.
[Photo: the CAVE virtual-reality theatre. EVL, Univ. Illinois at Chicago/Pyramid Systems.]
Conventional supercomputers will without doubt become major nodes on the grid, at least at the beginning, and linking even a few of these would greatly increase the resources that could be brought to bear on individual problems. Petaflop supercomputers, roughly 1,000 times faster than today’s top teraflop (10^12 floating-point operations per second) machines, are expected soon after 2010 — and perhaps before, if current research on technologies such as ‘processor in memory’ and the use of superconducting parts bears fruit.
But supercomputers are expensive. The world’s most powerful supercomputer, Asci Red, at Sandia National Laboratories in New Mexico, which distributes tasks across an array of around 10,000 Pentium Pro chips, cost almost US$60 million to build. A decade ago, Tom Sterling, who works at Caltech and NASA’s Jet Propulsion Laboratory, thought that a cheaper way forward might be to simply string together lots of PCs. Sterling built his first ‘Beowulf’ cluster supercomputer by taking a then barely-heard-of open-source operating system, Linux, and adding network drivers to it. The project worked, cost a tenth of the price of a conventional supercomputer and was also highly flexible — if more power was needed, more PCs were added. ‘Avalon’, a Beowulf at the Los Alamos National Laboratory, ties together 140 PCs and carries out 50 billion floating-point calculations per second for a construction cost of a mere $300,000. The European Southern Observatory at La Silla, Chile, and LIGO have also chosen Beowulfs to meet their supercomputing needs in data processing.
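The Beowulf recipe is easiest to see in code. Below is a minimal sketch of the same data-parallel idea using the modern mpi4py binding to MPI; the library choice and the harmonic-sum workload are illustrative assumptions (Beowulfs of the period typically ran MPI or PVM codes written in C or Fortran):

# Minimal Beowulf-style data-parallel job: each node sums its own
# slice of the series, and one reduction combines the partial sums.
# Run with, say: mpirun -np 140 python sum_cluster.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

N = 10**8                                   # total terms to sum
lo = rank * N // size                       # this node's slice
hi = (rank + 1) * N // size
partial = sum(1.0 / (k + 1) for k in range(lo, hi))

total = comm.reduce(partial, op=MPI.SUM, root=0)
if rank == 0:
    print("harmonic sum:", total)

Adding nodes simply shrinks each slice, which is exactly the flexibility that made Avalon’s 140-PC design attractive.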
his first ‘Beowulf ’ cluster supercomputer by area networks. sors, operating systems and interconnects.”
taking a then-barely-heard-of open-source The designers of tomorrow’s Internet Proof of principle for the use of PCs over
operating system, Linux, and adding net­ think they might be able to take the idea the Internet comes from the Search
work drivers to it. The project worked, cost a further. If you can combine the computing for ExtraTerrestrial Intelligence screensaver
tenth of the price of a conventional super­ power of PCs over a local network, then why SETI@home, a downloadable screensaver
computer and was also highly flexible — if not also over the high-speed network that will which has become the largest distributed
more power was needed then more PCs were be tomorrow’s Internet? “Asci Red has 10,000 computing project in history. Over 1 million
added. ‘Avalon’, a Beowulf at the Los Alamos Pentium Pros; link up PCs on the Internet users (including this author) have donated
National Laboratory, ties together 140 PCs and you could link up tens of millions,’ says spare CPU time on their PCs and Macs to
and carries out 50 billion floating-point cal­ Larry Smarr, director of the National search for signals from extraterrestrials in the
culations per second for a construction cost Computational Science Alliance. “We will 35 gigabytes of data produced daily from the
of a mere $300,000. The European Southern have the accuracy we need for these large Arecibo radio telescope in Puerto Rico . The
Observatory at La Silla, Chile, and LIGO complex simulations but at the price of PCs”. result is the planet’s largest virtual supercom­
have also chosen Beowulfs to meet their Indeed, the characteristics of 64-bit puter, running at 7 teraflops.
supercomputing needs in data processing chips, in principle, make them good candi­ David Anderson, a computer scientist at
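The underlying arithmetic can be checked in a few lines (flat byte addressing assumed):

# Address-space sizes for n-bit flat byte addressing.
for bits in (32, 64, 128):
    print(f"{bits:>3}-bit: 2**{bits} = {2**bits:.3e} bytes")
# 32-bit:  4.295e+09 bytes -- the familiar 4-gigabyte ceiling
# 64-bit:  1.845e+19 bytes -- room to give every machine on the
#          planet its own slice of one flat address space
# 128-bit: 3.403e+38 bytes -- far beyond any inventory of computers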
“Think ahead ten years,” says Smarr, “and you can see the supercomputer sort of dissolving into this vast planetary fabric of computing, because all [computer technology] will be built on the same commodity processors, operating systems and interconnects.”
Proof of principle for the use of PCs over the Internet comes from SETI@home, the Search for ExtraTerrestrial Intelligence screensaver — a downloadable program that has become the largest distributed computing project in history. Over 1 million users (including this author) have donated spare CPU time on their PCs and Macs to search for signals from extraterrestrials in the 35 gigabytes of data produced daily by the Arecibo radio telescope in Puerto Rico. The result is the planet’s largest virtual supercomputer, running at 7 teraflops.
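The architecture is simple to caricature: a server carves the telescope data into work units, and idle clients each fetch one, crunch it and report back. A toy sketch of that master/worker pattern (the queue-based design and the one-line ‘analysis’ are invented for illustration; the real client hunts for narrow-band signals):

# Toy master/worker pattern in the SETI@home mould: a queue of work
# units, with many idle 'clients' each crunching one unit at a time.
from queue import Queue, Empty
from threading import Thread
import random

work_units = Queue()
for _ in range(100):                        # stand-ins for chunks of telescope data
    work_units.put([random.random() for _ in range(1000)])

results = Queue()

def screensaver_client():
    while True:
        try:
            chunk = work_units.get_nowait()
        except Empty:
            return
        results.put(max(chunk))             # toy stand-in for signal detection

workers = [Thread(target=screensaver_client) for _ in range(8)]
for w in workers: w.start()
for w in workers: w.join()

scores = [results.get() for _ in range(results.qsize())]
print("strongest candidate:", max(scores))

The design tolerates flaky volunteers gracefully: a lost work unit can simply be reissued to the next idle client.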
David Anderson, a computer scientist at the University of California, Berkeley, who leads the project, says that he now intends to use the same approach to tackle other supercomputing tasks, including rational drug design, protein folding and high-resolution graphics. A web-based group, ‘distributed.net’, has also made a name for itself by using distributed computing over PCs to crack cryptographic keys. But Sterling remains cautious about the prospect of commodity PC chips becoming a major source of supercomputing power over the Internet, despite its extraordinary potential. He points out that the speed at which memory is addressed — or latency — is crucial for many applications. At present, the fast interconnects needed are found only in dedicated supercomputers. The connections between clusters of PCs operating over local area networks are still too slow or unreliable for applications requiring tight coupling, let alone clusters operating over the wider network. Faster interconnects from Giganet Inc. have allowed tighter coupling in a 256-processor cluster unveiled in October by the Advanced Cluster Computing Consortium, an organization based at Cornell University which includes Microsoft, Dell and Intel. The project, based not on Linux but on Windows 2000, is a commercial bid to standardize reliable cluster computing systems for supercomputing in research and business.
Latency is a wider problem in computing. The time required to fetch something from the memory of, say, a Cray supercomputer — 300 nanoseconds — is not much less than half what it was 20 years ago. Ultimately, this means that significantly greater speeds will not necessarily follow from adding more transistors to a chip. Putting the central processing unit in the memory is the only feasible option, according to many computer scientists.
What seems clear is that as soon as the high-speed Internet becomes available, distributed computing in some form is going to come of age, probably involving a mix of supercomputers, clusters of PCs and workstations, and individual PCs. The backbone of the US research network currently runs at 2.5 gigabits per second. But Douglas E. Van Houweling, president of the University Corporation for Advanced Internet Development (UCAID), which runs the US Internet 2 project, predicts that the network will run at terabit levels by 2010, delivering gigabit connections to the desktop.
[Photo: Larry Smarr foresees supercomputers being superseded by a vast planetary fabric of computing. NCSA, Univ. Illinois at Urbana-Champaign.]

Power hungry
Brute-force computing is badly needed. “With 100,000 sequences of 600 bases pouring out every day and needing to be screened against all others, we could easily use ten times the computing power right now just for sequencing,” says Craig Venter, chief executive officer of Celera Genomics Corporation. “If you do not have this kind of [high-end] computing capacity in biology, tomorrow you are going to get left behind.”
Demand for supercomputing power at the NSF is growing exponentially and will pass 10 teraflops by 2005, with most coming from new users. The National Computational Science Alliance has 1,536 processors for scientific supercomputing, but Smarr points out that, with around 300 users a month, that works out at a handful of processors per user. “The peer-review system to select projects is a polite term for ‘rationing’,” quips Smarr. Over half the top projects get rejected, but Smarr is optimistic that over the next decade we will move from “an era of scarcity to an era of plenty”.
Attention is now turning to how best to make such distributed resources available to users. The possibilities are enormous. Imagine sitting at your screen in front of a simple Yahoo-type interface, customized to your personal research requirements. One click and databases around the world are scoured for the data you want — say, spectral lines of several classes of stars. Little matter what format the image-bank databases are in; the interface uses a standard format and automatically converts data held in other formats. With the data collected and merged, another click and the interface negotiates with computers around the world to book, in real time, all the CPU time you need. Want a Fast Fourier Transform? One keystroke does it. If you wish to view the results in three dimensions or virtual reality, or rotate them, click on a menu offering advanced visualization techniques. And all translated into your own native language.
Such interfaces — suites of tools written specifically to exploit the distributed database of the Internet and invisible to the user, or ‘middleware’ — are now the focus of development within many advanced Internet engineering projects such as Internet 2. By 2010 they will be a routine part of the infrastructure of working scientists, according to Rick Stevens, head of the mathematics and computer science division at Argonne National Laboratory.
Plug and play supercomputing
A big step in this direction is the recent release of Cactus, a novel software tool kit developed by Joan Massó and Paul Walker at the Max Planck Institute for Gravitational Physics in Potsdam. It allows scientists to combine supercomputers over a high-speed network without having to know anything about the necessary advanced computing techniques. It is plug and play in that users simply take their own application (or ‘thorn’) — be it written in Fortran or C or whatever — and plug it into the flesh of the Cactus code. Cactus automatically ‘parallelizes’ their program to run on virtually any computer system, from a portable PC to distributed clusters of PCs or supercomputers. Cactus’s modules also allow booking of CPU time at remote centres, handle the output of terabytes of data, and offer a suite of state-of-the-art three-dimensional visualization tools. The NSF has committed $2.2 million to developing Cactus in the United States. The code is accessible to the public at http://www.cactuscode.org.
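The plug-and-play idea can be sketched schematically. The following is a conceptual illustration only, not the real Cactus interface: a framework (the ‘flesh’) owns the grid and the parallel bookkeeping, while a user ‘thorn’ supplies only the physics applied at each point:

# Conceptual sketch of 'plug and play' science code (not the actual
# Cactus API): the framework owns the grid, I/O and parallel layout;
# a registered 'thorn' supplies only the update rule for each point.
class Flesh:
    def __init__(self):
        self.thorns = []
    def register(self, thorn):
        self.thorns.append(thorn)
    def evolve(self, grid, steps):
        for _ in range(steps):
            # In a real framework this loop would be decomposed
            # across however many machines were booked for the run.
            for thorn in self.thorns:
                grid = [thorn(x) for x in grid]
        return grid

framework = Flesh()
framework.register(lambda x: x + 0.1 * (1 - x))   # toy relaxation 'thorn'
print(framework.evolve([0.0] * 8, steps=50)[0])    # approaches 1.0

The point of the design is the division of labour: the thorn author never touches the parallelization, and the framework never needs to know the physics.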
That is a sea change. In the past, supercomputing algorithms have generally been developed for specific problems, and therefore could not be used by others. Cactus will allow biologists, for example, to draw on decades of experience of their more mathematically inclined colleagues, the physicists. “As the network and network software evolves I think the division between the PC on your desk and the assets the networks will offer will become steadily less apparent,” predicts Van Houweling. “Today, scientists mainly use the web to find information. In the future, accessing large-scale computing resources over the net will be the major use.”
Such ‘middleware’ is beginning to appear in biology. Even the most basic genomic analyses often require users to run a plethora of different programs and juggle as many data formats. NCSA’s ‘Biology Workbench’ and Baylor College of Medicine’s ‘Search Launcher’ are two prototype systems designed to replace these with user-friendly packages, along the lines of Microsoft Office. Behind the interface lies software that converts requests into the formats used by the various existing genome-analysis tools, allowing users to screen multiple databases on the web as if they were one. But Andy Baxevanis, director of Computational Genomics at the National Institutes of Health’s National Human Genome Research Institute, which is itself about to release a similar system, warns against users becoming too dependent on interfaces as a “black box, where they stick sequences in, get the results out, and don’t know what the underlying method actually is”. Baxevanis argues that “there is no substitute for actually understanding the individual methods, whether you use them individually or in the context of a workbench”.
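The pattern behind such workbenches is mundane but powerful: one canonical record format, plus an adapter for each legacy tool. A schematic sketch in Python (both input formats and all field names are invented for illustration):

# Schematic 'middleware' adapters: convert each tool's native output
# into one canonical record, so many databases can be searched as one.
def parse_tool_a(line):                     # invented format "ID|SEQUENCE"
    ident, seq = line.split("|")
    return {"id": ident, "seq": seq}

def parse_tool_b(line):                     # invented format ">ID SEQUENCE"
    ident, seq = line[1:].split()
    return {"id": ident, "seq": seq}

ADAPTERS = {"tool_a": parse_tool_a, "tool_b": parse_tool_b}

def unified_search(sources, wanted):
    for fmt, lines in sources.items():
        for line in lines:
            record = ADAPTERS[fmt](line)    # normalize, then query once
            if wanted in record["seq"]:
                yield record["id"]

hits = unified_search(
    {"tool_a": ["hbb|ACGTGGTC"], "tool_b": [">myc TTACGT"]}, wanted="ACGT")
print(list(hits))                           # ['hbb', 'myc']

Baxevanis’s warning applies here too: the adapters hide the tools’ quirks, but they do not absolve the user from knowing what each tool actually computes.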
Collaborating in cyberspace
“By 2010, the relationship of scientists with their computers will change; they will see it as an information and computing fabric that they have customized for the work they are doing, and a window to their colleagues,” says Van Houweling. Greater collaboration will emerge naturally as the telephone, video and virtual reality become commonplace on the Internet. Smarr foresees “a persistent cyber café where you walk into a room with your world-wide colleagues”.
But there is more to collaboration than sharing a virtual tea-break with colleagues on the other side of the world. A decade ago, scientists studying the upper atmosphere tended to build their research agendas around their own particular instrument, be it a satellite or radar. This hardly made much sense given that the community was studying the same zone. Dan Atkins at the University of Michigan has led an Internet-based effort called the Upper Atmosphere Research Collaboratory, which has united the community’s panoply of instruments and data around common research programmes.

[Photo: Computing will be the biologist’s number one tool in the future, predicts Craig Venter.]
The nature of science is changing, says Lake. “Until now, astronomy has been about getting telescope time, looking at a point in the sky, getting the data, doing something with those data, keeping it proprietary and publishing a small result; but soon all the information is going to be available digitally.” Disciplines, he reckons, will increasingly organize large-scale computational resources that are automatically called into play from your PC or workstation. The concept of scientists standing on each others’ shoulders is in for a renaissance. Rather than most of a scientist’s time going into collecting data, data will be shared, coded and made accessible to the whole community.
Going for grand challenges
If scientists have access to supercomputing power from their desktops through the Internet by 2010, it will allow them to take on much more ambitious research challenges. “I think we will have the kind of computing power and tools to build models of whole cells and organisms,” says Van Houweling, “but these new tools that we are talking about are becoming available only now to ordinary scientists.”
Most biologists at present are happy to look for the homologue of the particular gene or protein sequences they are interested in. But by using more powerful computers, it will be possible to start looking at the integration of information across the genome. “If we hope to understand biology, instead of looking at one little protein at a time, which is not how biology works, we will need to understand the integration of thousands of proteins in a dynamically changing environment,” says Venter. This is the direction Celera will take next year once it has completed the sequencing of the human and mouse genomes. To do so, the company is building what will be the world’s largest supercomputer for biology, a 1,200-processor machine. “A computer will be the biologist’s number one tool,” predicts Venter. “The data sets are beyond the capacity of the human brain.”
Gene Myers, who leads Celera’s computational biology group, has cut the time it takes to piece together the Haemophilus influenzae genome from days to just 5 minutes. He is now turning to the task of developing sophisticated algorithms to analyse the human genome, beginning with whole-genome comparisons. He then intends to overlay and integrate gene expression data.
A pointer as to where computational biology is likely to go is a software package called E-cell, developed by Masaru Tomita at Keio University in Japan. The package, which can be downloaded from his website at http://www.e-cell.org, simulates basic cellular processes. Tomita has just completed a model of human erythrocytes and is building other models of human mitochondria, signal transduction for chemotaxis in the bacterium Escherichia coli, and gene-expression networks in this bacterium’s lactose operon (a collection of genes that are switched on when the bacterium is forced to feed on lactose sugar). Nourished properly, the ‘Tamagotchi’ erythrocyte reaches a steady state where metabolite concentrations compare well with those reported in real mammalian erythrocytes. Tomita is now inhibiting enzymes from the glycolytic pathway in silico, such as hexokinase, glucose-6-phosphate dehydrogenase, phosphofructokinase and pyruvate kinase, in a bid to throw light on the cellular metabolism of people suffering from hereditary anaemia, which is caused by deficiencies of these enzymes. Many are predicting that the fruitfly Drosophila will be the main target for complex computer modelling in developmental biology. Its genome will soon be available, and many hands make light work; there are some 6,000 Drosophila researchers out there.
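The flavour of such models can be conveyed by a drastically reduced sketch: a single enzyme-catalysed reaction integrated to steady state. All rate parameters below are invented, and E-cell itself couples hundreds of such reactions:

# One-reaction caricature of an E-cell-style simulation: substrate S
# is supplied at a constant rate and consumed by an enzyme obeying
# Michaelis-Menten kinetics, so S settles to a steady state. Lowering
# vmax mimics 'inhibiting the enzyme in silico'.
def simulate(vmax=1.0, km=0.5, supply=0.4, dt=0.01, steps=20000):
    s = 0.0
    for _ in range(steps):
        rate = vmax * s / (km + s)          # enzyme-catalysed consumption
        s += (supply - rate) * dt           # simple Euler integration
    return s

print("steady-state S:", round(simulate(), 3))          # ~0.333
print("with inhibited enzyme:", round(simulate(0.5), 3))  # ~2.0: metabolite piles up

The inhibited run shows, in miniature, why Tomita’s in-silico enzyme deficiencies are informative: blocking one step reshapes the concentrations of everything upstream.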
There is no shortage of petaflop-scale problems in biology, from brain research to the modelling of organs. The principal limiting step in the development of advanced computing in biology is the lack of biologists who know how to do it. While physicists and astronomers have decades of experience of expressing mathematically complex problems and writing the software to carry them out, computational biology is in its infancy. So physicists will be able to translate more quickly improvements in hardware performance at the petaflop level into the scale or realism of grand challenges. One such challenge is to design a virtual sustained thermonuclear fusion reactor on a computer without having to build real prototypes, a problem estimated at 1 to 10 petaflops.
But even physicists are now running up against the difficulties caused by the sheer complexity of the software needed to model such systems. US vice-president Al Gore has recently launched a plan for federal agencies to carry out a ‘Digital Earth’ survey, the stated goal of which is to create “a virtual representation of our planet that enables a person to explore and interact with the vast amounts of natural and cultural information gathered about the Earth”. Modelling parts of the Earth such as the ocean or atmosphere is difficult enough; the thought of coupling them all into a working model of Earth is sending software engineers back to their drawing boards. “It makes the Sloan Digital Sky Survey look easy,” says one scientist (see Nature 399, 520; 1999).
Scientists like writing code, but software in the future is likely to be built using higher levels of abstraction. Much current inspiration is coming from A Pattern Language, a book written by architect Christopher Alexander in 1977, in which he proposes tackling large problems, like building a city, by building up to them. In this approach, all of the smaller-scale problems, such as houses and streets, are solved first. Lake predicts that this concept of looking at the sorts of ‘services and interfaces’ between components of complex systems may be a key to better algorithm design.
Perhaps the most appealing aspect of distributed power is the prospect of an end to proprietary software, thanks to the input, not least, of researchers who should no doubt be concentrating on their science. The success of the Linux open-source operating system has shown that, when it comes to debugging code, companies seem no match for the collective IQ of thousands of developers on the web.
But the Internet and computers are easier to invent than they are to predict. The biggest uncertainty is perhaps quantum computing. In 1999, Yasunobu Nakamura’s team at the NEC Fundamental Research Laboratory in Tsukuba, Japan, demonstrated electrical control of a quantum data bit (or qubit), a possible first step on the road to the eventual creation of a solid-state quantum computer. Whereas today’s computers store information as ones or zeros, quantum computers could use non-classical memory states that incorporate both values simultaneously, resulting in an exponential speed-up in computing time for certain problems. On such problems, one small quantum computer would easily outstrip all the conventional computing power in the world combined. Although Dmitri Averin, an expert in quantum computing at the State University of New York, Stony Brook, believes that quantum computing will not become a practical tool within the next ten years, he predicts nonetheless that it will be a much bigger research area than today, and will yield exciting new discoveries in quantum mechanics.
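That ‘both values simultaneously’ claim can be made precise in a few lines: a qubit is a two-component state vector, and measurement probabilities are squared amplitudes. A bare-bones sketch (real quantum speed-ups come from interference across many such amplitudes, which this toy omits):

# A single qubit as a 2-component state vector.
import math

zero = (1.0, 0.0)                           # |0>
one = (0.0, 1.0)                            # |1>
h = 1 / math.sqrt(2)
superposed = (h * zero[0] + h * one[0],     # (|0> + |1>) / sqrt(2)
              h * zero[1] + h * one[1])

p0, p1 = superposed[0] ** 2, superposed[1] ** 2
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")  # 0.50 each: both values at once
# n such qubits carry 2**n amplitudes -- the source of the promised
# exponential parallelism, for algorithms able to exploit it.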
Declan Butler is European correspondent at Nature, 3 rue de l’Arrivée, B.P. 264, 75749 Paris Cedex 15, France.


The future of evolutionary developmental biology
Peter W. H. Holland

Combining fields as diverse as comparative embryology, palaeontology, molecular phylogenetics and genome analysis, the new discipline of evolutionary developmental biology aims at explaining how developmental processes and mechanisms become modified during evolution, and how these modifications produce changes in animal morphology and body plans. In the next century this should give us far greater mechanistic insight into how evolution has produced the vast diversity of living organisms, past and present.

Evolutionary biology and developmental biology, or embryology, have had a stormy relationship over the past hundred years. At the end of the nineteenth century and the start of the twentieth, they were inseparable; comparisons of the embryonic development of different species were used as evidence for evolution, while evolutionary history was seen as sufficient explanation for almost every structure or process observed in animal development1,2. But as evolutionary theory embraced genetics in the 1920s and 1930s, the study of embryonic development was largely rejected as being insufficiently precise or quantitative to contribute to an increasingly rigorous science.
The past 15 years have seen a timely reconciliation between the two fields, with the vibrant new discipline of evolutionary developmental biology emerging at the interface. This concerns itself with how developmental processes themselves have evolved: how they can be modified by genetic change, and how such modifications produce the past and present diversity of morphologies and body plans. Three main factors have contributed to the emergence and phenomenal growth of evolutionary developmental biology. Ironically, all three depend on genetics — the discipline that split evolution and development apart 60 years earlier.

Man is but a worm?
The first factor, and arguably the most important, was the discovery that animals as different as nematodes, flies and mammals use similar genes for similar developmental purposes, such as controlling the development of spatial organization in the embryo. The ball started rolling with the discovery in 1984 of a shared DNA sequence motif — the homeobox — in a variety of genes that control development in the fruitfly Drosophila. This confirmed that genes with distinct functions in Drosophila development were evolutionarily related; that is, they had all derived from the same ancestral gene in some remote and simpler ancestor3–5.
Even more dramatic was the demonstration that previously unknown genes in other types of animal, including vertebrates, also possessed the homeobox motif5. With a few exceptions, developmental biologists were surprisingly cautious in their initial reaction. Most accepted that the homeobox might prove a useful tag for discovering and cloning interesting developmental genes, but largely shied away from the implied suggestion that the homeobox was highlighting developmental mechanisms shared between organisms as distantly related as fruitflies and humans. Clearly, evolutionary and developmental biology were in no rush to form a partnership.
But by 1989 the evidence for common developmental mechanisms was becoming overwhelming. The two principal complexes of homeobox genes in Drosophila (the homeotic genes) were proved to be directly related to four clusters of homeobox genes in mammals, and to share with them a similar physical organization and patterns of gene expression. Not only had all these genes derived from the same cluster of genes present in some remote ancestor, but the fruitfly and mammalian gene clusters, the Hox genes, still had a similar and fundamental role in their respective organisms — to specify the identity of different regions along the head-to-tail axis (the anteroposterior axis) of the body5.
Evolutionary biologists call this retention of structure and function ‘conservation’, and other remarkable examples soon followed.
[Figure 1 ‘Man is but a worm’, reproduced from Punch’s Almanac for 1882. This cartoon originally symbolized the evolution of man, but is pertinent to the striking conservation of developmental mechanisms and genes between vertebrates and bilaterian invertebrates. Reproduced with permission of Punch Ltd.]

The Pax-6 gene turned out to be implicated in eye development virtually throughout the animal kingdom, and a homeobox gene, tinman, is involved in heart development in flies and vertebrates. The discovery of conservation permits previously impossible comparisons of the development of organisms with very different body plans, and has stimulated developmental biologists to consider the evolutionary ancestry of developmental mechanisms, often for the first time. So many examples of conservation have now been found that it is no longer considered surprising, and the cartoon in Fig. 1 is even more appropriate now than it was in the 1880s. We can now state with confidence that most animal phyla possess essentially the same genes, and that some (but not all) of these genes change their developmental roles infrequently in evolution.

Constructing the tree
The second crucial factor was the rise of molecular phylogenetics — the comparison of nucleic acid sequences from different organisms and the construction of evolutionary trees from these data. In 1988, Field et al. published a pioneering paper that tackled the vast task of constructing the lines of descent — the phylogeny — of the entire animal kingdom using comparisons of ribosomal RNA sequences6. At the time, invertebrate zoologists were beginning to appreciate the extent of convergent evolution (that is, when animals with quite different evolutionary histories have similar features), and were thus casting doubt on traditional anatomy-based phylogenetic schemes7. Molecular phylogenetics, on the other hand, seemed to provide an objective method of assessing evolutionary relationships. Since 1988, methods of DNA sequence analysis have been improved, the range of species sampled has increased, some potential sources of artefact have been removed8, and complementary molecular data added9. The current molecular-based view of invertebrate relationships certainly lacks resolution, but at least it provides a framework within which comparative developmental data can begin to be interpreted (Fig. 2).
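At its simplest, the sequence comparison underlying such trees reduces to counting differences between aligned sites. A minimal sketch with invented toy sequences (real studies correct for multiple substitutions and work from thousands of sites):

# Pairwise p-distance (fraction of differing sites) between aligned
# toy rRNA fragments; a small distance suggests close relatives.
seqs = {
    "fly":    "ACGTTGCAACGT",
    "beetle": "ACGTTGCTACGT",
    "mouse":  "ACGATGCAACCT",
}

def p_distance(a, b):
    diffs = sum(1 for x, y in zip(a, b) if x != y)
    return diffs / len(a)

names = list(seqs)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        print(f"{a}-{b}: {p_distance(seqs[a], seqs[b]):.2f}")
# fly and beetle come out closest; a clustering method (neighbour
# joining, say) would then turn such a distance matrix into a tree.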
Technical advances in molecular biology provided the third impetus to evolutionary developmental biology. Low-stringency library screening, the polymerase chain reaction and in situ hybridization were each invented or refined in the 1980s, facilitating the cloning and analysis of genes in any species, not just the handful of model species traditionally studied.
In the rest of this article, I outline several directions in which I believe there is a real chance of significant advance in the future. This list is certainly not comprehensive, partly because of my own biases, but also because this area of science is renowned for throwing up surprises.

The limits to conservation
Despite the interest of each new example, simply documenting yet more cases of conservation between Drosophila, nematodes and human is likely to add little to our understanding of the actual course of evolution. Attention now needs to be diverted to rigorously determining the limits to conservation in each case.
We need, for example, to establish when and in what type of organism the conserved gene or gene function first appeared; to identify the secondary modifications or losses that have occurred; and, if possible, to detect the underlying factors that may have constrained or promoted change. The example of the Hox gene clusters (Box 1) illustrates the general problem.
This example also shows the crucial importance of having a sound phylogeny on which to base comparative genetic and developmental analyses. As is evident from Fig. 2, knowledge of phylogeny is vital if apparent structural or functional similarities are to be interpreted safely as true homologies — similarities due to descent from the same ancestral source — and the limits to conservation of each homology are to be defined precisely.

Box 1 How conserved are Hox genes?
One outstanding question is whether clustered Hox genes specifying regional identity along the anteroposterior axis are present in all multicellular animals and, indeed, whether their origin marks the origins of the animals themselves. Hox gene clusters have been found so far in chordates (the phylum to which vertebrates belong), echinoderms (the starfish, sea urchins and their relatives), arthropods (crustacea and insects), nemertean worms and nematodes. If we map this distribution onto the working framework of animal evolution provided by recent molecular phylogenies8,9, we can infer that a Hox gene cluster existed in the last common ancestor of all extant bilaterians (animals with (usually) bilateral symmetry and embryos with three germ layers), designated B in Fig. 2. This organism was thus the ancestor of three major groupings of animals — the deuterostomes (for example, chordates and echinoderms), ecdysozoans (for example, arthropods) and lophotrochozoans (for example, molluscs and annelid worms). We can also safely assume that all phyla descended from this common ancestor possess (or possessed) a Hox gene cluster.
We cannot, however, draw the same conclusion for animal lineages that split off earlier than this common ancestor, notably the cnidarians (corals, sea anemones and jellyfish), ctenophores (comb jellies), sponges and placozoans. Genes described as Hox genes on the basis of their DNA sequence have been reported from each of these phyla, but physical clustering has not been demonstrated. Furthermore, the discovery in bilaterians of ancient clustered genes similar in DNA sequence to Hox genes but with different roles implies that it is inappropriate to designate a gene as Hox on the basis of sequence alone22. Hence, we do not yet know if a Hox gene cluster was present in the common ancestor of all animals (designated A in Fig. 2), or indeed even earlier.
We also know relatively little about the extent to which the Hox gene cluster has been secondarily modified in different lineages. There are examples of rapid sequence change in some arthropod Hox genes, cluster breakage in Drosophila and nematodes9, and loss of some Hox genes in pufferfish28 and possibly barnacles29. Gene loss during evolution is poorly documented but could be common; analysis of developmental genes in parasitic species (which tend to lose functions compared with their non-parasitic relatives) could be useful in this regard. Until the origin of the Hox genes is established and the pattern of their subsequent divergence is clarified, we can no longer be satisfied with statements that simply describe Hox gene clusters as ‘conserved’.

Genotype into phenotype
The link between the genetic make-up of an organism (its genotype) and its form and function (its phenotype) lies at the heart of evolutionary developmental biology. A fundamental question to be addressed is how alterations to the genotype as a result of mutation are transformed through the intermediary of development into changes in form. There are several parts to this question. The first is the relative importance of mutations that alter the coding sequence of a gene, and thus the structure and function of the protein it encodes, as against regulatory mutations that affect the site, timing or level of gene expression. We suspect that regulatory mutations will be most important, although both categories of mutation have been documented in developmentally important genes. Dissecting their individual contributions to developmental change is complicated by the fact that both types of mutation can occur in the same gene.
The importance of regulatory mutations can be seen within the vertebrates, where the anterior boundaries of expression of particular Hox genes have altered significantly in different animals. These changes correlate closely with anatomical changes along the anteroposterior axis, such as the location of the junction between neck and thorax10,11. Similarly, the areas of expression of Hox genes responsible for specifying thoracic identity are greatly expanded in python embryos compared with other vertebrates, which mirrors the anatomical extension of thoracic identity along most of the vertebral column in snakes12.

The mutations responsible for these alterations have not yet been identified; they could have occurred in the regulatory sequences of the Hox genes themselves, or in other genes that control the activation or stabilization of Hox gene expression. DNA sequence analysis of many more species, chosen with regard to their phylogenetic positions, will be the key to finding candidate mutations, whose effects can then be investigated experimentally. Indeed, such a comparative approach was used recently to identify a mutation in an enhancer of the Hoxc-8 gene of baleen whales13.
[Figure 2 Proposed phylogeny of the animal kingdom (tree diagram), based primarily on 18S rDNA sequences6,8 and Hox gene cluster composition9. All animals can be divided into two basic groups according to the number of layers in the embryonic body wall: the bilaterians (triploblasts) have three, and the diploblasts have two. The names at the tips of the branches indicate the main animal phyla, taxonomic groupings of animals that share the same body plan — ecdysozoans (including priapulids, kinorhynchs, nematodes, nematomorphs, onycophorans and arthropods), lophotrochozoans (including platyhelminths, nemerteans, brachiopods, bryozoans, phoronids, sipunculans, echiurans, molluscs and annelids), deuterostomes (including echinoderms, hemichordates, urochordates, cephalochordates and vertebrates) and the diploblast ctenophores, cnidarians, placozoa and porifera. The urochordates, cephalochordates and vertebrates comprise the chordate phylum. Several minor phyla are omitted. More than one branch leading from a split (for example at the base of the ecdysozoans or lophotrochozoans) indicates lack of resolution or lack of consensus given the available data. Branches labelled A, B, C refer to the origin of animals, of bilaterians and of vertebrates, respectively; see text for discussion. Photographs: Corbis/G. Balavoine/B. Okamura.]

How important is gene duplication?
A second question in need of resolution is the importance of gene duplication in the evolution of development. Are there developmental processes that are possible with two copies of a gene but not with one? This is a controversial suggestion, not least because it implies that gene number could impose tight genetic constraints on evolution. But although gene duplications may have been very important in some lineages, as discussed below for the vertebrates, it seems that they are neither necessary nor sufficient for most developmental evolution.
Vertebrates, however, are an interesting case. They have many more Hox genes than invertebrates, as a result of the duplication and reduplication of a single ancestral Hox gene cluster. This was accompanied by an expansion of many other developmentally important gene families14 and, indeed, of the total number of genes in the genome15 (C in Fig. 2 shows when this occurred). It is tempting to speculate that the origin of the complex vertebrate body plan, with its novel cell types and organ systems, was made possible by the availability of extra genetic raw material — in particular, interacting sets of genes that could be gradually recruited for new developmental roles. This hypothesis predicts a tendency for vertebrate genomes to retain duplicates of developmentally important genes but to lose duplicates of the housekeeping genes that control routine metabolic functions, and predicts that the retained genes would tend to acquire new roles in cell types unique to vertebrates. By testing such predictions, it should be possible to determine whether gene duplication was positively exploited in early vertebrate evolution.
The mechanisms by which duplicate genes acquire new roles also need resolving. The simplest model suggests that one copy of a duplicate gene retains the ancestral role, while additional copies are free to accumulate mutations and diversify. Current evidence indicates that this is an oversimplification. Members of families of duplicated developmental genes in vertebrates often show overlapping spatial and temporal patterns of expression, and also overlapping functions. This suggests that duplicate genes may retain the shared ancestral role, but supplement this with new roles.
A very similar picture of duplicate gene expression could, however, be achieved in a totally different way. If each duplicate gene loses some of its suite of original functions, control of a complex developmental process would become divided between the multiple descendants of the original gene16. The relative importance of these different modes of evolution needs elucidating, and not only with respect to vertebrates.

A new ‘New Synthesis’
In all the scenarios outlined above, the creation of genetic variation is seen as fundamentally important to the rate, timing or pattern of evolution. At first sight, this viewpoint does not sit comfortably with the neodarwinian ‘New Synthesis’, which places more emphasis on the reduction of genetic variation during evolution, as a population becomes increasingly adapted to its environment and the desirable genes that bring this about spread throughout the population. If developmental biology is to be fully integrated with evolutionary biology, this conceptual gap needs to be bridged.
Some steps have already been taken. First, it is now clear that we need not think of mutations in developmental control genes (such as Hox genes) as always causing large phenotypic changes. For example, small differences in trichome patterns on the second leg of different Drosophila species seem to be the result of subtle differences in the regulation of the Hox gene Ubx17. We need to determine whether numerous mutations of very small phenotypic effect or fewer mutations of larger effect generally contribute to morphological differences between species18.
Second, it is now clear that populations can harbour extensive genetic variation with the potential to cause morphological change but which is only revealed under particular conditions. For example, perturbation of the Drosophila heat-shock protein Hsp90 by mutation or changes in environmental conditions uncovers phenotypic variation generated by otherwise hidden genetic variation19.
These findings may point the way towards a logical framework for the ‘microevolution’ of development — the generation of small genetically determined differences in development that lead to relatively minor variations in morphology.
We can assume that mutations of small phenotypic consequence arise frequently, and are either exposed immediately to selection pressures or sheltered temporarily in cryptic form by interactions with other genes. Mutations with larger phenotypic effects can also arise, but it is unclear how frequently they contribute to morphological evolution. An extreme view is that the largest-effect mutations serve only to stabilize morphological change already produced by the gradual accumulation of small changes20.
The huge challenge for the future is to convert this conceptual framework into a quantitative model, with parameters such as magnitude of phenotypic effect (and its heterogeneity), number of genes involved, mutation rates, effects of genetic recombination, gene additivity and effects of the environment. It will then be important to examine the model’s behaviour in relation to variables such as population size or degree of fragmentation, selection pressure and genetic drift.
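Even a toy version of such a model makes that parameter list concrete. The sketch below, with all parameter values invented and recombination omitted for brevity, follows a polygenic trait under mutation, selection and drift in a finite population:

# Toy quantitative model of microevolution: a trait is the additive sum
# of n loci, with per-locus mutation, selection towards an optimum, and
# drift supplied by the finite number of parents each generation.
import random

def generation(pop, mu=0.05, sigma=0.1, optimum=1.0, survivors=0.5):
    for ind in pop:                                    # mutation
        for locus in range(len(ind)):
            if random.random() < mu:
                ind[locus] += random.gauss(0, sigma)
    pop.sort(key=lambda ind: abs(sum(ind) - optimum))  # selection
    parents = pop[: int(len(pop) * survivors)]
    return [random.choice(parents)[:] for _ in range(len(pop))]  # drift

pop = [[0.0] * 10 for _ in range(200)]                 # 200 individuals, 10 loci
for _ in range(300):
    pop = generation(pop)
mean_trait = sum(sum(ind) for ind in pop) / len(pop)
print("mean trait after 300 generations:", round(mean_trait, 2))  # creeps towards 1.0

Each knob in the function signature corresponds to one of the parameters listed above; the serious versions of this exercise would add recombination, dominance and environmental noise.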
It remains to be seen whether the lessons learnt from the study of microevolution will be sufficient to explain the much greater morphological and physiological differences between higher taxa such as phyla. I suspect that radical alterations to genetic systems (for example, duplication of the whole genome or major alterations in the mechanism of gene regulation) will need to be included if we are to explain some truly major transitions, such as the origin of multiple germ layers in the ancestor of the bilaterians (B in Fig. 2) or the emergence of the vertebrates.

Vermes of the Vendian
Fifteen years ago, few developmental biologists would have heard of the Ediacaran fauna, and few palaeontologists would have confessed to an interest in Drosophila genetics. Much has changed. Palaeontologists and developmental biologists are now regularly combining data to tackle key questions in evolutionary developmental biology. Consider timescales. Palaeontology is vital to estimating when a particular evolutionary change occurred; it can also reveal whether such a change was correlated with other evolutionary events or with environmental change. The Cambrian explosion of animal phyla is the classic example, but remains controversial. Palaeontology clearly records a rapid increase in the abundance and diversity of animal fossils at the base of the Cambrian, but disagreement abounds as to whether this reflects increases in body size, the origin of shells and skeletons, or a true rapid diversification of body plans. Even if the last explanation is accepted, did this burst of evolution occur in the ancestors of all multicellular animals, or just among the bilaterian lineage (A versus B in Fig. 2)?
Resolving these questions is important, not least because it would indicate where (and whether) to search for possible internal genetic or external environmental triggers for animal diversification. Several lines of evidence need to be combined to move the debate forward. The continued study of Precambrian (Vendian) fossils is clearly important. If any of these can convincingly be shown to be allied to extant bilaterian phyla, this would cast doubt on the idea of the rapid diversification of bilaterian lineages in the Cambrian. Further analysis of developmental control genes in cnidarians and ctenophores will also be useful, as this might reveal whether bilaterians indeed have unique developmental characteristics21,22.
Another major contribution of palaeontology is in the recognition of ancestral character states and extinct character combinations. These can help us deduce the actual path of developmental evolution, given the genetic and developmental data at our disposal from extant organisms. Our understanding of the early evolution of vertebrates23, the radiation of the lophotrochozoans24 and the evolution of the tetrapod limb25 have all benefited from such data. Evolutionary developmental biology will continue to benefit from fossil evidence, provided that the science of palaeontology continues to be widely appreciated.

Embracing genomics
The sequencing of complete genomes from multicellular organisms promises to revolutionize the biological sciences. What are the implications for developmental biology? Animal developmental biologists will probably have to be content in the foreseeable future with a nematode or two, a couple of insects, human, mouse and two rather advanced fish. The opportunities will still be huge. The complete genome sequence of the nematode Caenorhabditis elegans has already yielded surprises, including some previously undiscovered Hox genes, secondary loss of the Hedgehog signalling molecule and one of its receptor components, and an exceptionally large number of genes for steroid-hormone receptors9,26. As each genome is sequenced, it will yield its own set of lineage-specific expansions and reductions within families of homologous genes. Eventually, a crude but complex picture should emerge of the general pathways of genome diversification during evolution. It will then be a major task to determine which genomic changes relate to modifications in developmental control or morphology. The pattern of differences should, however, give immediate clues as to which sorts of gene families, or genetic pathways, are prone to change, and which may be evolutionarily constrained.
From the perspective of understanding how animal body plans evolve, it is unfortunate that the genome sequences most likely to be completed first do not encompass a particularly wide range of the body plans present within the animal kingdom. Obvious examples to add include a cnidarian, a lophotrochozoan (for example a mollusc or annelid worm), an echinoderm, and a urochordate (a tunicate) or cephalochordate (amphioxus).
Complete genomes provide more than simply a catalogue of genes. For example, because of the roles of chromatin structure and nuclear architecture in gene regulation27, neighbouring genes could be subject to coordinated regulation. We might then expect the position of a gene on a chromosome to be functionally relevant and conserved in some cases. Once several complete genomes are available, the hypothesis of conserved gene position can be tested. It seems quite possible that the correspondence between a Hox gene’s position in a gene cluster and its expression along the anteroposterior axis, so fundamental to patterning the bilaterian body plan, may be just the tip of an iceberg.

Peter W. H. Holland is at the School of Animal and Microbial Sciences, University of Reading, Whiteknights, PO Box 228, Reading RG6 6AJ, UK. e-mail: p.w.h.holland@reading.ac.uk

1. Darwin, C. The Origin of Species by Means of Natural Selection (John Murray, London, 1859).
2. Haeckel, E. The Evolution of Man: A Popular Exposition of the Principal Points of Human Ontogeny and Phylogeny (Appleton, New York, 1896).
3. McGinnis, W., Levine, M. S., Hafen, E., Kuroiwa, A. & Gehring, W. J. Nature 308, 428–433 (1984).
4. Scott, M. P. & Weiner, A. J. Proc. Natl Acad. Sci. USA 81, 4115–4119 (1984).
5. McGinnis, W. Genetics 137, 607–611 (1994).
6. Field, K. G. et al. Science 239, 748–753 (1988).
7. Willmer, P. G. Invertebrate Relationships: Patterns in Animal Evolution (Cambridge Univ. Press, 1990).
8. Aguinaldo, M. A. et al. Nature 387, 489–493 (1997).
9. de Rosa, R. et al. Nature 399, 772–776 (1999).
10. Burke, A. C., Nelson, C. E., Morgan, B. A. & Tabin, C. Development 121, 333–346 (1995).
11. Gaunt, S. J., Dean, W., Sang, H. & Burton, R. D. Mech. Dev. 82, 109–118 (1999).
12. Cohn, M. J. & Tickle, C. Nature 399, 474–479 (1999).
13. Shashikant, C. S. et al. Proc. Natl Acad. Sci. USA 95, 15446–15451 (1998).
14. Holland, P. W. H. & Garcia-Fernàndez, J. Dev. Biol. 173, 382–395 (1996).
15. Simmen, M. W., Leitgeb, S., Clark, V. H., Jones, S. J. M. & Bird, A. Proc. Natl Acad. Sci. USA 95, 4437–4440 (1997).
16. Force, A., Lynch, M., Pickett, F. B., Amores, A., Yan, Y. L. & Postlethwait, J. Genetics 151, 1531–1545 (1999).
17. Stern, D. L. Nature 396, 463–466 (1998).
18. Mackay, T. F. C. BioEssays 18, 113–121 (1996).
19. Rutherford, S. L. & Lindquist, S. Nature 396, 336–342 (1998).
20. Budd, G. E. BioEssays 21, 326–332 (1999).
21. De Robertis, E. M. & Sasai, Y. Nature 380, 37–40 (1996).
22. Brooke, N. M., Garcia-Fernàndez, J. & Holland, P. W. H. Nature 392, 920–922 (1998).
23. Aldridge, R. J. & Purnell, M. A. Trends Ecol. Evol. 11, 463–468 (1996).
24. Conway Morris, S. & Peel, J. S. Phil. Trans. R. Soc. Lond. B 347, 305–358 (1995).
25. Coates, M. I. Development (Suppl.) 169–180 (1994).
26. Ruvkun, G. & Hobert, O. Science 282, 2033–2041 (1998).
27. Cockell, M. & Gasser, S. M. Curr. Opin. Gen. Dev. 9, 199–205 (1999).
28. Aparicio, S. et al. Nature Genet. 16, 79–83 (1997).
29. Mouchel-Vielh, E., Rigolot, C., Gibert, J.-M. & Deutsch, J. S. Mol. Phylogenet. Evol. 9, 382–389 (1998).

Acknowledgements. I thank M. Cohn, S. Shimeld and A. Holland for comments on the manuscript, B. Okamura for the bryozoan photograph in Fig. 2, and B. Cohen, M. Telford and other colleagues for helpful discussions. I hope that non-animal biologists will excuse my zoocentric selection of examples.
Feeding the world
in the twenty-first century
Gordon Conway and Gary Toenniessen

The gains in food production provided by the Green Revolution have reached their ceiling while world population continues to rise. To ensure that the world's poorest people do not still go hungry in the twenty-first century, advances in plant biotechnology must be deployed for their benefit by a strong public-sector agricultural research effort.
The Green Revolution was one of the great technological success stories of the second half of the twentieth century. Because of the introduction of scientifically bred, higher-yielding varieties of rice, wheat and maize beginning in the 1960s, overall food production in the developing countries kept pace with population growth, with both more than doubling. The benefits of the Green Revolution reached many of the world's poorest people. Forty years ago there were a billion people in developing countries who did not get enough to eat, equivalent to 50 per cent of the population of these countries. If this proportion had remained unchanged, the hungry would now number over two billion — more than double the current estimate of around 800 million, or around 20 per cent of the present population of the developing world. Since the 1970s, world food prices have declined in real terms by over 70 per cent. Those who benefit most are the poor, who spend the highest proportion of their family income on food.
The Green Revolution brought benefits too for the industrialized world. The high-yielding varieties of staple crop plants bred by the international agricultural research centres of the CGIAR (the Consultative Group on International Agricultural Research) have been incorporated into the modern varieties grown in the United States and Europe. The additional wheat and rice produced in the United States alone from these improved varieties is estimated to have been worth over $3.4 billion from 1970 to 1993 (ref. 1).
Yet today, despite these demonstrable achievements, over 800 million people consume less than 2,000 calories a day, live a life of permanent or intermittent hunger and are chronically undernourished2. Most of the hungry are the women and young children of extremely poor families in developing countries. More than 180 million children under five years of age are severely underweight: that is, they are more than two standard deviations below the standard weight for their age. Seventeen million children under five die each year and malnourishment contributes to at least a third of these deaths.
As well as gross undernourishment, lack of protein, vitamins, minerals and other micronutrients in the diet is also widespread3. About 100 million children under five suffer from vitamin A deficiency, which can lead to eye damage. Half a million children become partly or totally blind each year, and many subsequently die. Recent research has shown that lack of vitamin A has an even more pervasive effect, weakening the protective barriers to infection put up by the skin, the mucous membranes and the immune system4. Iron deficiency is also common, leading to about 400 million women of childbearing age (15–49 years) being afflicted by anaemia. As a result they tend to produce stillborn or underweight children and are more likely to die in childbirth. Anaemia has been identified as a contributing factor in over 20 per cent of all maternal deaths after childbirth in Asia and Africa.
If nothing new is done, the number of the poor and hungry will grow. The populations of most developing countries are increasing rapidly and by the year 2020 there will be an additional 1.5 billion mouths to feed, mostly in the developing world. What is the likelihood that they will be fed?

The end of the Green Revolution
The prognosis is not good. As indicated in Fig. 1, there is widespread evidence of decline in the rate of increase of crop yields5–7. This slowdown is due to a combination of causes. On the best lands many farmers are now obtaining yields close to those produced on experimental stations, and there has been little or no increase in the maximum possible yields of rice and maize in recent years. A second factor is the cumulative effect of environmental degradation, partly caused by agriculture itself.

Figure 1 Average annual increase in yields (kg ha–1) of rice, wheat and maize in developing countries, by period: 1965–74, 1975–84 and 1985–94.

Simply exporting more food from the industrialized countries is not a solution. The world already produces more than enough food to feed everyone if the food were equally distributed, but it is not. Market economies are notoriously ineffective in achieving equitable distribution of benefits. There is no reason to believe that the poor who lack access to adequate food today will be any better served by future world markets. Food aid programmes are also no solution, except in cases of specific short-term emergency. They reach only a small portion of those suffering chronic hunger and, if prolonged, create dependency and have a negative impact on local food production.
About 130 million of the poorest 20 per cent of people in developing countries live in cities. For them, access to food means cheap food from any source. But 650 million of the poorest live in rural areas where agriculture is the primary economic activity, and as is the case in much of Africa, many live in regions where agricultural potential is low and natural resources are poor8. They are distant from markets and have limited purchasing power. For them, access means local production of food that generates employment and income, and is sufficient and dependable enough to meet local needs throughout the year, including years that are unfavourable for agriculture.
All these arguments point to the need for a second Green Revolution, yet one that does not simply reflect the successes, and mistakes, of the first. In effect, we require a 'Doubly Green Revolution', an agricultural revolution that is both more productive and more 'green' in terms of conserving natural resources and the environment than the first. We believe that this can be achieved by a combination of: ecological approaches to sustainable agriculture; greater participation by farmers in agricultural analysis, design and research; and the application of modern biotechnology directed towards the needs of the poor in developing countries, which is the subject of the rest of this article.

Table 1 Biotechnology research useful in developing countries

Traits now in greenhouse or field tests

Input traits:
Resistance to insects, nematodes, viruses, bacteria and fungi in crops such as rice, maize, potato, papaya and sweet potato
Delayed senescence, dwarfing, reduced shade avoidance and early flowering in rice
Tolerance of aluminium, submergence, chilling and freezing in cereals
Male sterility/restorer for hybrid seed production in rice, maize, oil-seed rape and wheat
New plant types for weed control and for increased yield potential in rice

Output traits:
Increased β-carotene in rice and oil-seed rape
Lower phytates in maize and rice to increase bioavailable iron
Modified starch in rice, potato and maize, and modified fatty-acid content in oil-seed rape
Increased bioavailable protein, essential amino acids, seed weight and sugar content in maize
Lowered lignin content of forage crops
Improved amino-acid content of forage crops

Traits now in laboratory tests

Input traits:
Drought and salinity tolerance in cereals
Seedling vigour in rice
Enhanced phosphorus and nitrogen uptake in rice and maize
Resistance to the parasitic weed Striga in maize, rice and sorghum, to viruses in cassava and banana, and to bacterial blight in cassava
Nematode resistance and resistance to the disease black sigatoka in banana
Rice with the alternative C4 photosynthetic pathway and the ability to carry out nitrogen fixation

Output traits:
Increased β-carotene, delayed post-harvest deterioration and reduced content of toxic cyanides in cassava
Increased vitamin E in rice
Apomixis (asexual seed production) in maize, rice, millet and cassava
Delayed ripening in banana
Use of genetically engineered plants such as potato and banana as vehicles for production and delivery of recombinant vaccines to humans

The price of biotechnology
The application of advances in plant breeding — including tissue culture, marker-aided selection (which uses DNA technology to detect the transmission of a desired gene to a seedling arising from a cross) and genetic engineering — is going to be essential if farmers' yields and yield ceilings are to be raised, excessive pesticide use reduced, the nutrient value of basic foods increased and farmers on less favoured lands provided with varieties better able to tolerate drought, salinity and lack of soil nutrients.
In the industrialized countries the new life-science companies, notably the big six multinationals — Astra-Zeneca, Aventis, Dow, Dupont, Monsanto and Novartis — dominate the application of biotechnology to agriculture. In 1998, 'genetically modified (GM)' crops, more accurately referred to as transgenic or genetically engineered crops, mostly marketed by these companies or their subsidiaries, were grown on nearly 29 million hectares worldwide (excluding China)9. That year, 40 per cent of all cotton, 35 per cent of soya beans and 25 per cent of maize grown in the United States were GM varieties.
So far, the great majority of the commercial applications of plant genetic engineering have been for crops with single-gene alterations that confer agronomic benefits such as resistance to pests or to herbicides. These agronomic traits can reduce costs to the farmer by minimizing applications of insecticides and herbicides. However, as with many agricultural inputs, the benefits received by farmers vary from year to year.
Most of the GM crops currently being grown in developing countries are cash crops; Bt cotton, for example, has reportedly been taken up by over a million farmers in China. But despite claims to be 'feeding the world', the big life-science companies have little interest in poor farmers' food crops, because the returns are too low. National governments, the international research centres of the CGIAR, and a variety of western donors are, and will continue to be, the primary supporters of work that produces advances in biotechnology useful to poor farmers. New forms of public–private collaboration could help to ensure that all farmers and consumers benefit from the genetic revolution and, over time, this should increase the number of farmers who can afford to buy new seeds from the private sector.
The cost of accomplishing this will not be insignificant but it should not be excessive. For example, over the past 15 years, the Rockefeller Foundation has funded some US$100 million of rice biotechnology research and trained over 400 scientists from Asia, Africa and Latin America. In several places in Asia there is now a critical mass of talented scientists who are applying the new tools of biotechnology to rice improvement.
To date, most of the new varieties are the result of tissue culture and marker-aided selection techniques. For example, scientists at the West Africa Rice Development Association have used anther culture to cross the high-yielding Asian rices with traditional African rices. The result is a new plant type that looks like African rice during its early stages of growth (it is able to shade out weeds, which are the most important constraint on crop production in Africa; Fig. 2a) but becomes more like Asian rice as it reaches maturity, thus giving higher yields with few inputs. Marker-aided selection is being used to breed rice containing two or more genes for resistance to the same pathogen, thereby increasing the durability of the resistance, and to accumulate several different genes contributing to drought tolerance.

Potential of genetic engineering
For some time to come, tissue culture and marker-aided selection are likely to be the most productive uses of biotechnology for cereal breeding. However, progress is being made in the production of transgenic crops for the developing countries. As in the industrialized countries, the focus has been largely on traits for disease and pest resistance, but genes that confer tolerance of high concentrations of aluminium (found in many tropical soils) have been added by Mexican scientists to rice and maize (Fig. 2c), and Indian scientists have added two genes to rice which may help the plant tolerate prolonged submergence. There is also the possibility of increasing yield ceilings, through more efficient photosynthesis, for example, or by improved control of water loss from leaves through regulation of stomatal opening and closing10.
In addition to generating new traits that enable the plant to grow better (input traits), which are useful to poor farmers, GM technology can also generate plants with improved nutritional features (output traits) of benefit to poor consumers. One of the most exciting developments so far has been the introduction of genes into rice that result in the production of the vitamin A
precursor β-carotene in the rice grain11. β-carotene is a pigment required for photosynthesis and is synthesized in the green tissues of all plants, including rice, but is not usually present in non-photosynthetic tissues such as those of seeds. Traditional plant breeding has given us some plants that produce β-carotene in non-photosynthetic tissue, such as the roots of carrots, but despite decades of searching no rice mutants had been found that produce β-carotene in the grain, so conventional breeding was not an option. To get the cells of the grain to produce β-carotene, genetic engineers added three genes for key enzymes for β-carotene biosynthesis to the rice genome. The grain of the transgenic rice has a light golden-yellow colour (Fig. 2b) and contains sufficient β-carotene to meet human vitamin A requirements from rice alone. This 'golden' rice offers an opportunity to complement vitamin A supplementation programmes, particularly in rural areas that are difficult to reach. These same scientists and others have also added genes to rice that increase the grain's nutritionally available iron content by more than threefold. Over the next decade we are likely to see much greater progress in multiple gene introductions that focus on output traits or on difficult-to-achieve input characteristics (Table 1).

Figure 2 Biotechnology products of value in developing countries. a, Interspecific progenies of Asian × African rice. Rows 3 and 4 from the right have vigorous growth with droopy lower leaves that suppress weeds. (Courtesy of M. Jones.) b, Transgenic rice grain containing β-carotene (provitamin A). (Courtesy of I. Potrykus and P. Beyer.) c, Transgenic rice (left), with increased citrate production and exudation by roots, tolerates aluminium (100 µM at pH 4.5) better than the control. (Courtesy of L. Herrera-Estrella.)

The potential benefits of plant biotechnology are considerable, but are unlikely to be realized unless seeds are provided free or at nominal cost. This will require heavy public investment by national governments and donors, at times in collaboration with the private sector, both in the research and in the subsequent distribution of seed and technical advice. Breeding programmes will also need to include crops such as cassava, upland rice, African maize, sorghum and millet, which are the food staples and provide employment for the 650 million rural poor who need greater stability and reliability of yield as much as increased yield.

The role of the public sector
None of this will happen through marketing by multinational seed companies, particularly if they decide to deploy gene-protection technologies, commonly referred to as terminator gene technologies, which will mean that farmers cannot save seed from the crop and sow it to get the next crop. In developing countries roughly 1.4 billion farmers still rely on saving seed for their planting materials and many gain access to new varieties through farmer-to-farmer trade. Much of the success of the Green Revolution was due to the true-breeding nature of the higher-yielding rice and wheat varieties.
While terminator technology is clearly designed to prevent rather than encourage such spread of proprietary varieties among poor farmers, some argue that it will do them no harm because they can still use and replant new varieties from the public sector. But if the companies tie up enabling technologies and DNA sequences of important genes with patents, and then use terminator technologies to control the distribution of proprietary seed and restrict its use for further breeding, the public sector will be severely constrained in using biotechnology to meet the needs of the poor.
Rather than using the terminator technology to protect their intellectual property in developing countries, it would be better if seed companies focused on producing hybrid seed combined with plant variety protection (PVP) to protect the commercial production of the seed. Hybrid plants do produce viable seed but it is not genetically identical to the original hybrid seed; it may lack some of the desirable characteristics. Hence, there is still an incentive (for example, increased yield) for farmers to purchase hybrid seed for each planting. However, if such purchase is not possible, farmers can still use a portion of their harvest as seed and obtain a reasonable crop. Such recycling of hybrids is not uncommon in developing countries and is an important element of food security. And with PVP, new varieties can be protected while also becoming a resource that both the private and public sectors can use in further breeding for the benefit of all farmers.

Intellectual property rights
Even assuming that terminator technologies are not used, there is cause for concern about the rights of developing countries to use their own genetic resources, the freedom of their plant breeders to use new technologies to develop locally adapted varieties, and the protection of poor farmers from exploitation. In part, these concerns result from the privatization of crop genetic improvement, the rapid expansion of corporate ownership of key technologies and genetic information and materials, and the competitive pressure on these companies to capture world market share as rapidly as possible.
It is only recently that intellectual property rights (IPR) have become an important factor in plant breeding, primarily through the greater use of utility patents. Such patents have stimulated greater investment in crop improvement research in industrialized countries, but they are also creating major problems and potentially significant additional expense for the already financially constrained public-sector breeding programmes that produce seeds for poor farmers.
The success of the Green Revolution was based on international collaboration which included the free exchange of genetic diversity and information. Most of the 'added value' present in modern crops has been accumulated over the centuries by farmers themselves as they selected their best plants as the source of seed for the next planting. These 'land races' have traditionally been provided free of charge by developing countries to the world community. The CGIAR centres add value through selective breeding, and the superior varieties they generate are widely

distributed without charge, benefiting both rice or Thailand’s Jasmine rice. The granting talking and reaching decisions is required.
developing and developed countries. of free licenses to use such materials in We believe a global public dialogue is needed
Patents on biotechnology methods and breeding programmes in the country of which will involve everyone on an equal
materials, and even on plant varieties, are origin of the trait might gain the apprecia­ footing — the seed companies, consumer
complicating and undermining these collab­ tion of developing country researchers and groups, environmental groups, indepen­
orative relationships. Public-sector research governments. dent scientists, and representatives of gov­
institutions in industrialized countries no Finally, the current opposition to GM ernments, particularly from the developing
longer fully share new information and tech­ crops and foods is likely to spread from nations.
nology. Rather, they patent and license and Europe to the developing countries and Agriculture in the twenty-first century
have special offices charged with maximiz­ maybe even to North America unless there is will need to be more productive and less
ing their financial return from licensing. greater public reassurance. At the heart of the damaging to the environment than agricul­
Commercial production of any genetically debate about the safety of GM crops and ture has been in the twentieth. An increased
engineered crop variety requires dozens of their food derivatives is the issue of relative effort is needed to assure that the benefits of
patents and licenses. It is only the big compa­ benefits and risks. The debate is particularly agricultural research reach the hundreds of
nies that can afford to put together the IPR impassioned in Europe. Some of it is moti­ millions of poor farmers who have benefited
portfolios necessary to give them the free­ vated by anti-corporate or anti-American little from previous research. We believe that
dom to operate. And now, under the TRIPS sentiment, but underlying the rhetoric are biotechnology has significant potential to
(Trade-Related Aspects of Intellectual Prop­ genuine concerns about lack of consumer help meet these objectives but that this
erty Rights) agreement of the World Trade benefits, about ethics, about the environ­ potential is threatened by a polarized debate
Organization, most developing countries ment and about the potential impact on that grows increasingly acrimonious. We
are required to put in place their own IPR human health12–16. need to reverse this trend, to begin working
systems, including IPR for plants. Further­ Much of the opposition tends to lump together, to share our various concerns, and
more, all of this ‘ownership’ of plant genetic together the various risks — some real, some to assure the new technologies are applied to
resources is causing developing countries to imaginary — and to assume there are generic agriculture only when this can be done safely
rethink their policies concerning access to hazards17. However, GM organisms are not and effectively in helping to achieve future
the national biodiversity they control, and all the same and each provides different food security for our world.
new restrictions are likely. potential benefits to different people and dif­ Note added in proof: We commend the
So far, international negotiations rele­ ferent environmental and health risks. Calls Monsanto Company’s recent public
vant to agricultural biotechnology and plant for general moratoria are not appropriate. commitment not to commercialize sterile
genetic resources have not been effectively Each new transgene and each new GM crop seed technologies and encourage other
coordinated. There are inconsistencies, and containing it needs to be considered in its companies to follow their lead.
the interests of poor farmers in developing own right. Well planned field tests are cru­ Gordon Conway and Gary Toenniessen are at the
countries have not been well represented. cial, particularly in the developing countries Rockefeller Foundation, New York, New York
The days of unencumbered free exchange of where the risks of using, or not using, a GM 10018, USA.
plant genetic materials are no doubt over, crop may be quite different from those in 1. Pardey, P. G. Alston, J. M., Christian, J. E. & Fan, S. Summary of
and agreements and procedures need to be industrialized countries. a Productive Partnership: The Benefits from U.S. Participation in
the CGIAR (International Food Policy Research Institute,
formulated to ensure that public-sector The multinational companies could take Washington DC, 1996).
institutions have access to the technological a number of specific decisions in this area 2. Conway, G. R. The Doubly Green Revolution: Food for All in the
and genetic resources needed to produce that would improve acceptance of plant 21st Century (Penguin Books, London/Cornell University
Press, Ithaca NY, 1999).
improved crop varieties for farmers in devel­ biotechnology in both the developing and
3. UNICEF. The State of the World’s Children 1998 (Oxford Univ.
oping countries who will not be well served the industrialized world. First, consumers Press, Oxford/New York, 1998).
by the for-profit sector. If the big life-science have a right to choose whether to eat GM 4. Somer, A. & West, K. P. Vitamin A Deficiency: Health, Survival
companies wish to find a receptive and grow­ foods or not and although there are serious and Vision (Oxford Univ. Press, New York and Oxford, 1966).
5. Mann, C. C. Science 283, 310–314 (1999).
ing market in developing countries, they will logistic problems in separating crops all the 6. Cassman. K. G. Proc. Natl Acad. Sci. USA 96, 5952–5959 (1999).
need to work with the public sector to make way from field to retail sale, the agricultural 7. Pingali, P. L. & Heisey, P. W. Cereal Productivity in Developing
sure this happens. seed industry should come out immediately Countries: Past Trends and Future Prospects. CIMMYT
and strongly in favour of labelling. Second, Economics Paper 99-03 (CIMMYT, Mexico, 1999).
8. Leonard, H. J. in Environment and the Poor: Development
Some solutions the industry should disavow use of the termi­ Strategies for a Common Agenda (ed. Leonard, H. J.) 3–45
While negotiations are underway, there are a nator technology in developing countries (Overseas Development Council, Washington DC, 1989).
number of things that should be done. With and, third, it should phase out the use of 9. James, C. Global Review of Commercialized Transgenic Crops:
1998. ISAAA Briefs No. 8. (International Service for
little competitive loss, seed companies could antibiotic-resistance genes as a means of
Acquisition of Agri-biotech Applications, Ithaca NY, 1998).
agree to use the PVP system (including selecting transgenic plants. Alternatives exist 10. Mann, C. C. Science 283, 314–316 (1999).
provisions allowing seed saving and sharing and should be used. 11. Ye, X. D. et al. Science (submitted).
by farmers) in developing countries in coop­ The Rockefeller Foundation and other 12. The Royal Society of London. Genetically Modified Plants for
Food Use (The Royal Society, London, 1998).
eration with public plant-breeding agencies, donors have invested significant sums in 13. Nuffield Council on Bioethics. Genetically Modified Crops: The
rather than using patents or terminator helping developing countries put in place Ethical and Social Issues (Nuffield Council on Bioethics,
technologies to protect their varieties. biosafety regulations and the facilities neces­ London, 1999).
To speed the development of biotechnol­ sary for biosafety testing of new crops and 14. UN Food and Agriculture Organization. Biotechnology and
Food Safety. FAO Food and Nutrition Paper 61. (World Health
ogy capacity in developing countries, foods, but much more needs to be done. The Organization/FAO, Rome, 1996).
companies that have IPR claims over certain big life-science companies could join forces 15. Rissler, J. & Mellon, M. The Ecological Risks of Engineered Crops
key techniques or materials might agree to and establish a fellowship programme for (MIT Press, Cambridge MA/London, 1996).
16. May, R. Genetically Modified Foods: Facts, Worries, Policies and
license these for use in developing countries training developing country scientists in
Public Confidence (http://www.2.dti.gov.uk/ost/
at no cost. crop biotechnology, biosafety, intellectual ostbusiness/gen.html, 1999).
We would also like to see an agreement to property rights and international negotia­ 17. Pretty, J. The Biochemist (in the press).
share the financial rewards from IPR claims tions administered by a neutral fellowship Acknowledgements. We thank M. Lipton, S. Dryden, R. May and
on crop varieties or crop traits of distinct agency. colleagues at the Rockefeller Foundation for comments on an
national origin, such as South Asian Basmati Most important of all, a new way of earlier draft of this article.

timescales

Adapting to climate change


Legislating for the present on the basis of predictions for the future can never be other than a dicey business. The suggestion by a British member of parliament in the
nineteenth century that London would be waist-high in horse excrement by the 1950s could have been seen as a call for crippling taxes on Hansom cabs, pushing
cabbies out of trade while still leaving us choking on exhaust fumes. So should we now be legislating against those exhaust fumes, in fear that by the
end of the next century the world will be a flood- and storm-prone hothouse?
This was one of the key issues highlighted by the Kyoto convention on climate change in 1997. It is all very well to set targets
for emissions reductions on the basis of climate models projected over the next century (and then in all probability to fall short of
attaining them anyway), but what of the costs of fierce constraints to today’s populations, particularly those who live within
fragile economies?
Yet clearly we have to do something. There seems to be no serious doubt now that global warming, induced by
emission of greenhouse gases through human activity, is upon us. The global average temperature has risen by
around half a degree Celsius during the twentieth century, and is predicted to increase by a further 1.0–3.5 °C by
2100. Over the same periods, the seas have risen by 10–25 cm and we can expect almost a metre more at the
upper limit. Storms and drought may or may not get more common, but their distribution will probably alter.
Massive famine and flooding cannot be ruled out.
As it happens, the Kyoto targets may make very little impact on all of this, but the basic question remains:
how can we plan for social change on timescales over which forecasts of enabling technologies become unreliable?

One approach that has been advocated is to formulate policy based on assessments of human and social impacts rather

than in terms of national emissions reduction targets — something that is harder to quantify but closer to people’s concerns.

Emissions are not the whole story; what about economic efforts to make societies that are vulnerable to climate
change less so? And how far do political and industrial structures need re-engineering to make actual attainment of objectives
realistic?
Climate modellers, meanwhile, are trying to incorporate some element of the social and economic contexts into
their forecasts. They grapple with the question of whether it is better, given the uncertainties, to brake hard on emissions now,
do nothing until later on the assumption that we will then be much more technologically capable of it (owing to enhanced energy
efficiency and so forth), or something in between. Inevitably, such modelling is painfully contentious and strongly influenced by
the assumptions that go into it. We are juggling with the unknown; and in the face of massive inertia and sharp political lobbying,
that is a precarious occupation. Philip Ball

The shape of the cosmos


Olaf Stapledon is the great unsung hero of twentieth-century science fiction. His stories are cast on scales of time and
space so vast as to inspire nothing short of vertigo or terror. In Star Maker, his hero, in a ‘hawk-flight of the imagination’,
journeys through the cosmos and eventually meets the Creator in his workshop. The Star Maker, it seems, is still perfecting
his art. Our own cosmos is only provisional. Previous essays in Universe construction litter the room like trash. These essays
come in all shapes and sizes, suggesting that we need not live in a great, expanding hypersphere, infinite in all directions.
Our conception of the shape and size of the Universe is based on timescales. The familiar ‘light-year’ is a unit
based on the time it takes for a beam of light to get from A to B. But there is a reckoning that is purely geometric, or,
rather, trigonometric. This is the ‘parsec’, short for ‘parallax arcsecond’, the distance at which the separation of the Earth
and Sun (1 Astronomical Unit) would subtend an angle of one second of arc. A parsec is equivalent to just over three
light-years.
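
The conversion is simple trigonometry, and readers can check it for themselves. A minimal sketch in Python (the constants are standard reference values, not anything drawn from this essay):

  import math

  AU = 1.495978707e11                  # astronomical unit, in metres
  ONE_ARCSEC = math.pi / (180 * 3600)  # one second of arc, in radians

  # Parsec: the distance at which 1 AU subtends an angle of one arcsecond
  parsec = AU / math.tan(ONE_ARCSEC)           # about 3.086e16 metres

  # Light-year: the distance light travels in a Julian year of 365.25 days
  light_year = 299_792_458 * 365.25 * 86_400   # about 9.461e15 metres

  print(parsec / light_year)  # about 3.26: 'just over three light-years'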
But the dimensions of the cosmos are reckoned in megaparsecs — millions of parsecs. At the cosmic scale, the
uncertainty over the rate at which the Universe is expanding means that time, distance and geometry are
inextricably bound together: the Hubble flow is meted out in units of kilometres per second per megaparsec. On the very
largest scales, time either loses all meaning or becomes subservient to geometry. Our understanding of time may depend on
knowing the shape of the cosmos — the shape on the Star Maker’s workbench.
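
As a hedged illustration of how those units work (the Hubble constant was still poorly pinned down at the time of writing, so the 65 kilometres per second per megaparsec used here is merely an assumed round figure), Hubble's law says that recession velocity grows linearly with distance:

  H0 = 65.0             # assumed Hubble constant, km/s per megaparsec
  distance_mpc = 100.0  # an assumed galaxy distance of 100 megaparsecs

  # Hubble's law: v = H0 * d
  velocity_km_s = H0 * distance_mpc
  print(velocity_km_s)  # 6500.0, so the galaxy recedes at about 6,500 km/s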
In 2002, we may know. By then, we should have results from detailed maps of fluctuations in the cosmic microwave
background (CMB) from two spacecraft: NASA’s Microwave Anisotropy Probe and the European Space Agency’s Planck
Surveyor. The results will help constrain values for important cosmological parameters such as the density of matter in
the Universe, and the Hubble constant that provides a scale to its expansion. Accurate knowledge of these parameters should,
in turn, constrain ideas about the geometry of space — whether it is ‘flat’ or ‘Euclidean’, or if it has positive (that is, spherical)
or negative (hyperbolic) curvature.
The results will also be able to tell us whether space is infinite in all directions, or is geometrically finite.
If it is finite, then some extremely weird things could fall out of the analysis. In a finite Universe, surveys of deep space
could be pulling out multiple images of the same finite set of galaxies — like an observer in a hall of mirrors. One
of those faraway galaxies could be our own, its light having circumnavigated the cosmos to reach us. Study
of the CMB will be able to tell us whether space really is a hall of mirrors, and, if it is, something of the
hall’s architecture. The Universe could be an analogue of a simple sphere. On the other hand, it could
have some more exotic shape, akin to a torus, or another of those shapes littering the Star Maker’s
garage floor. Henry Gee

Plus ça change
J. L. Heilbron and W. F. Bynum

For the past couple of centuries the penchant for prediction has been prevalent at century turns. How much have evaluations of scientific discovery and predictions for future advancement changed since those of the science commentators at the end of the last century?

Century ends encourage scientists to look back and forward, like the two-faced Roman god Janus.

Century turns have only recently become occasions for stocktaking, millennially speaking. Perhaps the earliest sustained retrospective occurred around 1800. Among the serious items then produced were histories of science sponsored by the University of Göttingen; their seriousness may be indicated by the length of the entry for physics, which ran to 8,000 pages. The briefer British offered a crisp choice between an improving and a degenerating world, William Godwin (Essay on Political Justice, 1794) plumping for progress, Thomas Malthus (Essay on Population, 1798) arguing the easier case for decay. But it was not until the turn of the last century that reviewers and previewers created the genre to which this special supplement to Nature is devoted. As our contribution to it, we illustrate the proposition that the creators of the genre may have exhausted it.
It is said that ours has been the century of science. That is exactly what the mathematician Maurice Lévy, president of the Paris Académie des Sciences, told his confrères at their last meeting in the last century. (The meeting occurred in December 1900, which shows that the French accepted the opinion aired the preceding January in Nature, that centuries begin with years ending in 01.) Lévy observed that only during his lifetime had science descended from the heavens (he had in mind celestial mechanics) to inspire and direct 'industrial mechanics', the application of science to the arts. The results had been astonishing, he said, particularly in the movement of goods, people and information. Would progress continue unabated? Perhaps. Lévy doubted that even heavenly mechanics could solve the next problem in commerce and travel, namely, heavier-than-air flight. But as he rightly observed, scientists should not cast horoscopes.
Today we worry that our progress might slacken if emphasis on application dries up funds for research in pure or basic or non-mission-orientated science. Lévy worried about it too. He warned our century against neglecting the essential source of useful applications, that is, useless research. The then editor of Nature, Sir Norman Lockyer, took Lévy's warning as the text for his first sermon of the century. Like Lévy, Lockyer did not doubt that science-based progress would continue; but unlike Lévy, he fretted that it would not take place in Britain: Unless we put more science everywhere, he said, in our schools, industry and government, we will certainly lose out, in accordance with our countryman's doctrine of the survival of the fittest, to Europe and the United States. This cry raised a chorus then as it does now. The chemist Sir Henry Roscoe advised that unless the government poured money into education, "our children and grand children may see England sink to the level of a third rate Power." And a writer in the Fortnightly questioned whether "England [would] survive the century."
We continue our theme that every divination or evaluation, optimistic or pessimistic, made about science around 1900 has an analogue in writings dating from our times, under the rubrics science wars, the art of war, the war on disease, successes and failures, new technologies, and new sciences.

Science wars
Recently, especially in the United States, scientists have waged what they began as a defensive war against the carping of humanists of the Luddite and constructivist schools. The defensive scientists objected to the accusation that science produces only pollution, weaponry and other instruments of social control, and, moreover, that it has no claim to truth; like anything else, according to the constructivists, it is the product of negotiation, authority and influence. The law of gravitation is as much social convention as natural necessity. And, say the Luddites, science undermines religion and morality by insisting on evolution and providing the contraceptive pill. Fearing that the humanists' challenge might damage the reputation of science among the general public who pay the research bill, scientists attacked the constructivists' constructions as opaque drivel, as in some cases it is. As for organized religion, where the stakes are higher, the response is more nuanced. There is tolerance and even attempts at rapprochement between some scientists and some theologians, but summary rejection of the beliefs of the creationists.
We find the same charges and defences at the end of the last century, although admittedly in a livelier literary form. Science is immoral ("[a scientist] will view his mother's tears not as expressions of her sorrow, but as solutions of muriates and carbonates of soda"); anti-religious ("what does it matter whether the conceptions men form of the universe are those of the nineteenth century or of the ninth, in comparison with their acceptance of [religious] truths?"); and arrogant, selfish and dehumanizing ("bring up a woman in the positivist school and you make of her a monster, the very type of ruthless cynicism"). In a famous episode in 1895, a French journalist returned from a visit to the Vatican with the news that science was bankrupt. French scientists responded with a huge banquet in honour of the chemist and minister Marcellin Berthelot, at which speakers took turns bashing the religious right and defenders of the old ways of life. The guardian of morality is not religion, but science, they said, for science inculcates probity, accuracy, tolerance and modesty(!). Science emancipates the human spirit — "humanity knows no other guide." This bombast, widely repeated at the time, has reappeared in almost the same words in our own science wars.

The art of war
Many people condemn the entanglement of science with the military. Others recommend it as a good route to high-tech consumer goods and support of basic research. Much can be said for the latter point of view. The world wars vastly accelerated the development of aviation, radio, electronics and nuclear power; the Cold War
brought the computer; the Race for Space, the teflon pan. All the basic sciences prospered under military funding in the United States and the Soviet Union. And how was it in 1900? Lévy extolled the contributions of the military to nineteenth-century civilization: "Today's cannon is one of the most instructive laboratories of science." Experiments done in gun barrels showed how to contain high pressures in steam engines and internal combustion machines; civilization gained ocean-going steamships, powerful locomotives and better steels for pistons and rails. Nature regularly reported on new armaments as advances in applied science.

The printing press, telegraph, railroad and steam engine are depicted in a late-nineteenth-century lithograph, 'The progress of the century', while in 1999 visitors at an industrial fair view the future of engineering at a virtual reality exhibit.

The pronouncements of the Kaisers are a good quarry for a symbol of the symbiosis of science and the military. In 1900, dressed in the uniform of an officer of engineers and surrounded by generals and admirals, Wilhelm II told an assembly at the Technische Hochschule in Berlin that their brand of science applied to commercial and military warfare was true science, and worthy of the PhD; whereupon he awarded them the right to grant the doctorate in philosophy.
Just as in our time many people believe that atomic bombs prevent world wars, so, at the end of the last century, observers of the art of war, for example Sir Michael Foster and Alfred Nobel, expected that the constantly improving mechanization of warfare in their time would force governments to keep the peace.

The war on disease
The worldwide eradication of smallpox has encouraged optimists to believe that many other diseases will soon go extinct. Polio might be a candidate, and the distribution of malaria, we are told, is currently being rolled back, even while we await the long-promised vaccine. HIV, lassa fever and drug-resistant tuberculosis have tempered current predictions about the conquest of microbial disease, and prions have extended notions of the 'germ'. The lamentable decline in biodiversity is less obvious at the disease-causing level. We worry, too, about escalating health costs and distribution of resources, about whether doctors have too much or too little power and whether 'managed care' is the answer. In 1900, health costs were also on the rise and many charity hospitals round the world struggled to make ends meet. Several European countries had already introduced national health insurance schemes, Britain was soon to follow and even the United States was flirting with what has since become a heretical idea. Polio was a rare disease, almost as scarce as it has become in the developed world; malaria was already earmarked as a preventable disease; and, even without antibiotics, tuberculosis was perceived as in retreat. Sir William Osler predicted that, by 2000, doctors from the Old World might even journey to the New, in search of inspiration and training. The Lancet was convinced that the most striking achievement of nineteenth-century medicine was to abandon the notion of specific drugs against disease; its leader-writer for 5 January 1901 declined to predict how much more would be achieved for practical medicine by science.

Successes and failures
Alfred Russel Wallace was not a man to let so momentous an occasion as the passing of a century go by without summing its plusses and minuses. Co-discoverer of the principle of natural selection, polymath, and a self-made man of integrity, Wallace reckoned that in his "Wonderful Century" scientific, medical and technological successes preponderated over missed opportunities and social disintegration. Like many scientific commentators, he decried the notion of progress in art and design. In science, however, he counted 24 major achievements in the century (including railways, the telephone, the theory of evolution and lucifer matches) against only 15 in all preceding ones.
The century's failures stimulated him to prophesy. There had been a woeful neglect of phrenology and of psychical research, he insisted, and the delusion that smallpox vaccination could prevent smallpox persisted. In each case, Wallace was convinced that the future would vindicate his position. It has in part: phrenology now has its own website, psychical research has its Koestler chair, and the view that smallpox vaccination played a part in the origin and spread of AIDS has its supporters. But the most important failures of Wallace's century were moral, political or economic: the greedy exploitation of the poor by the rich; the growth in the number of millionaires; the neglect of India and other poor countries; militarism; the desecration of the natural world; the self-serving of corrupt politicians; the failure to realize the vast possibilities for human welfare that science offers. Here, unfortunately, Wallace could have been speaking of our age as well as his own. Whether his optimistic forecast (that "the flowing tide is with us") also applies to us we must wait to see.

New technologies
We take the hardest case. The computer is touted for creating a revolution in the acquisition and processing of information. Profound social and cultural consequences have already followed, for example, in financial transactions and the decentralization of the work place. And, if half the predicted developments are realized in twice the time foreseen, there will be no new books, no shops, no banks — only the computer and the telephone — by the middle of the new century. Without pretending to evaluate degrees of revolution, we must insist that social commentators around 1900 expressed themselves as hyperbolically about the Victorian revolution in communication and transportation as the nerds of our time write about computers. Take Sir William Hunter, speaking as president of the British Association for the Advancement of Science in 1900: "nineteenth-century man has climaxed his advance from barbarism to civilization by
overcoming space and time...he has studded the ocean with steamships, girdled the earth with electric wire, tunnelled the lofty Alps, spanned the Forth," and so on. The train, steamship and telegraph not only worked the obvious transformations in commerce, communication and conquest, but also changed the practice of science. Suddenly the world knew technical advisory committees, scientific missions, international scientific organizations, international centres for data collection and — that essential ingredient of the modern life of science — international scientific meetings.

Curious current notions about the effect of gravity in the Southern Hemisphere are matched by nineteenth-century geographical oversight. The transatlantic cable speeded communication between Britain and the United States at the end of the nineteenth century (left), while today worldwide communication by e-mail is the norm (right).

Just as cassandras in our time decry the invasion of privacy and the annihilation of society threatened by the computer, so critics in 1900 castigated the new means of transportation and communication as subversive of civilized life. The tram, train and underground pushed cities with their moral and physical pollution into the clean countryside. Max Nordau, a physician who specialized in degeneration, identified travel, information and the telephone as causes of the morbid anxiety from which his patients suffered. Unless, he said, we can adapt to reading acres of newspapers a day, communicating simultaneously by telephone with friends in five continents, and flying around the world in a few hours (Nordau here was more optimistic than Lévy), it is all up with us.

New sciences
When the US Congress cancelled the Super Collider, the search for the material basis of things was forced to stop before the next lowest level, where theories of everything could be tested. Or so wrote the disappointed high-energy physicists. It may not have been mere penury and the end of the Cold War that broke faith with the Greeks; science may have outgrown the quest. Recruitment into particle physics and other physical sciences became more difficult. The biological sciences and the other disciplines whose futures are described so favourably elsewhere in this issue have superseded what used to be respected as the most fundamental of the sciences.
The conscientious academic advisor today would have to give an intending scientist the same advice that Max Planck received over 100 years ago: opportunities in fundamental physics are restricted, do something else. The advice given to Planck referred to employment, not to substance; fundamental physics then was at an exciting stage, experiencing one Grand Synthesis after another: heat with mechanics, light with heat, electricity with magnetism, all with all. It appeared that basic understanding of the material universe was in place, or almost so, apart from those troublesome clouds that Lord Kelvin spied in the sunny sky of physics. He saw well, since it took relativity to disperse one of the clouds and quantum theory to drive off the other.
The depth of pessimism about the fate of physics came in 1895. The Deutsche Physikalische Gesellschaft was then celebrating its fiftieth anniversary. Its president, worried by the apparent senescence of his science and the recent deaths of three of its greatest representatives, Helmholtz, Hertz and Kundt, could manage only the lukewarm hope that physics would not decline. By the time he came to publish his remarks, he had learned about the discovery of X-rays. Had he known about Röntgen's discovery, he said, "[I] would have concluded in an entirely different tone...expressing my joy that the second fifty years in the life of the society had begun as gloriously as the first."
This flip-flop indicates an important characteristic of scientific prophecy. The prophets typically base their estimates on their experience of the recent past. The future will be bleak or rosy just as the present is. More reliable predictions may, perhaps, be generated from a larger view. Merely to show the strength of the method — not because we have any faith in the predictions — we shall make a few.

● Because, despite many better reasons to come to an end, civilization has not done so, we predict that Y2K will not kill it.
● Because, despite increasingly noisy threats from computer enthusiasts, the number of new books published in print continues to grow, we predict that the computer will not slay the printed book, at least not before nature slays us.
● Nature, however, will not die.
● No matter how little or great the progress made in any science, its practitioners will continue to call for more research in it.
● Experimental high-energy physics may not survive the century.
● Everything we say about the prospects of science in the twenty-first century will have a precise analogue in writings that will appear in print 100 years from now.

J. L. Heilbron is at Worcester College, Oxford OX1 2HB, UK. W. F. Bynum is at the Wellcome Institute for the History of Medicine, 183 Euston Road, London NW1 2BE, UK.

The future of public health


Barry R. Bloom

Public health deals with the health and well-being of the population as a
whole and its achievements over the past century, especially in the richer
countries, have been truly impressive. What direction should public health
take in the future?

It is an astonishing fact that half of all the increases in life expectancy in recorded history have occurred within this century and that most occurred in the first half of the century, before the introduction of modern drugs and vaccines. Life expectancy in the United States, for example, has risen from 47 years in 1900 to 78 years in 1995. It is interesting to analyse changes in life expectancy across different countries in this century as a function of per capita income (Fig. 1). In addition to the obvious increases in life expectancy globally over time, two additional points are worth mentioning.

First, when people are poor, they die young, and minuscule increases in per capita incomes can have a major impact on life expectancy. Second, even if you were enormously wealthy in 1900, there were 25 years of life expectancy you could not buy, which in 1990 could be gained even at relatively modest incomes. That which could not be bought in 1900 is, I believe, knowledge of public health in the broad sense. The major gains in health in this century have been attributable largely to the impact of public health and disease prevention, rather than to medical interventions.

Figure 1 Life expectancy and income per capita for selected countries and periods. Figure adapted from ref. 6. The income data, expressed in international dollars, are an estimation of purchasing power and derive from data in ref. 7 and World Bank data.

Advances in public health
Public health is best distinguished from clinical medicine by its emphasis on preventing disease rather than curing it, and by its focus on populations and communities rather than the individual patient. Perhaps because of this emphasis on large numbers of people, the achievements of public health in this century are truly impressive. In the industrialized world we take clean water for granted. It is only the occasional lapses in water quality, such as the outbreak of cryptosporidiosis (caused by the protozoan Cryptosporidium) which sickened 440,000 people in Milwaukee in 1993, that remind us how important safe water is to our collective health. In much of the developing world, however, simply drinking water is a high-risk behaviour.

In the past two decades, deaths from heart attacks and stroke in the United States have dropped by 30–50 per cent (ref. 1), in part by behaviour changes, in part by primary prevention with medications. Smoking, which is estimated to be responsible for about 20 per cent of all deaths in the United States (ref. 2), has declined from 42 per cent to 25 per cent of adults over 30 years in the United States (although it has increased ominously to 36 per cent of US teenagers and is still rising). Another contribution to cancer prevention on a large scale was made when tamoxifen, a drug used to treat cancer, was found to reduce the incidence of breast cancer by 45 per cent in women at high risk. This is an example of a drug thought of as a therapeutic in clinical medicine becoming a public health tool in 'primary prevention', that is, preventing disease in individuals already known to be at risk.

Whereas population growth in the industrialized world has reached almost steady-state levels, infant mortality in the United States has declined by 26 per cent in the past decade and is at the lowest levels ever. And even these impressive figures leave the United States ranked at only twenty-fifth worldwide.

Perhaps most dramatic of all is the impact of immunization. Vaccines have eliminated smallpox from the world and polio from the Western Hemisphere, and have reduced measles, rubella, tetanus, diphtheria and meningitis in many countries to a handful of cases each year, at a saving of millions of lives and billions of dollars. Vaccines remain the most cost-effective intervention known for preventing death and disease. Indeed, such is the success of immunization that this year, for the first time, infectious diseases are no longer the largest cause of death worldwide (ref. 3).

Disparities remain
The bad news is that the benefits of biomedical science and public health have not been made available to everyone. The disparities are striking. The country with the highest life expectancy is Japan, where people live on average to the age of 80 years; Sierra Leone has the lowest, just 37 years in 1998. And the disparities are not just between countries. Most people believe that in industrialized countries everyone can expect a relatively long life and that the major health issues centre on the quality of life and health care rather than the quantity of life. It is a shock, therefore, to learn that people born in particular rural counties of Minnesota, Colorado, Iowa or Wisconsin on average will live 25 years longer than those born in four counties in South Dakota, 23 years longer than in 12 counties in Mississippi and Alabama, and 22 years longer than people born in Washington, DC, or Baltimore, Maryland (ref. 4). The variance in life expectancy in the United States between women of Japanese extraction in Bergen County, New Jersey, and Bennett County, South Dakota, is 41 years. Overall, disparities in life expectancy between different parts of the United States are greater than for any other nation in the world.

Curiously, the reasons underlying the disparities in the United States are by no means clear, as in many counties the correlation between per capita incomes and life expectancy is not particularly good. For example, per capita income is significantly higher in the county of Washington, DC, than in several counties along the Texas–Mexico border, yet life expectancy is significantly lower, by about 15 years, in Washington. Understanding and then reducing disparities in health and life expectancy, within and between countries, has to be an important issue on the public health agenda for the next century.

Directions for public health policy
The major thrust of epidemiology in the twentieth century was to analyse the risk factors that contribute to illness and disease. This effort was so successful that we probably now know all the major risk factors. But this is just the beginning. There are four areas in which I believe public health will have an increasingly important role.
First, we need to develop more sensitive epidemiological approaches that can identify the many additional risk factors of smaller effect. Also, we need to design better randomized trials so that the biases that prevent us from establishing causation can be eliminated. Second, epidemiological surveillance is needed not only to trace emerging infections but, more generally, systematically to ascertain the burden of disease (that is, the years of healthy life lost because of each disease). Together with analysis of the economic costs of those diseases to individuals and society, and the costs of the interventions available to prevent or treat them, this will enable us to set rational priorities for public spending on research. Such research should be aimed, for example, at establishing where effective interventions are lacking or too expensive to be widely used, and where resource allocations for treatment can result in the most years of healthy life per unit of health expenditure.
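The logic of such priority setting can be made concrete in a few lines of code. The following Python sketch ranks a handful of interventions by healthy life-years gained per unit cost; the names and numbers are hypothetical, chosen purely to illustrate the arithmetic, and are not drawn from the studies cited here.

interventions = [
    # (name, healthy life-years gained per 1,000 people, cost in $1,000s)
    # Hypothetical figures, for illustration only.
    ("childhood immunization",   350,  15),
    ("hypertension screening",   120,  40),
    ("coronary bypass surgery",   60, 900),
]

# Rank by the criterion in the text: years of healthy life gained
# per unit of health expenditure.
ranked = sorted(interventions, key=lambda item: item[1] / item[2], reverse=True)
for name, years, cost in ranked:
    print(f"{name:25s} {years / cost:6.2f} life-years per $1,000")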
While an analysis of cost-effectiveness will be helpful, it will have to go hand in hand with an assessment of the quality of health care and health-care systems in order to set rational priorities in health spending. In the United States alone, a trillion dollars is spent on health care — half of the entire global expenditure on health. Thus, the third area in which public health should have a growing role is in understanding the burdens and costs of interventions, and improving the quality and efficiency of the health service.

Finally, one of the myths of the modern world is that health is determined largely by individual choice and is therefore a matter of individual responsibility. In fact, most behaviour is socially patterned and reinforced in groups. Just providing health information to people is not an effective way to change their behaviour. To make a difference in conditions that we know can be prevented (such as those due to smoking, alcohol and drug abuse, and obesity, hypertension and sexually transmitted diseases) it is essential that we develop a public health approach that will protect populations and establish prevention strategies for groups, not just for individuals.

There are, in fact, some strategies that have made a difference at the population level, with the saving of millions of lives. Local and national decisions have been made to protect the public's health by controlling the environments that cause disease: laws and standards have been adopted that protect us from health hazards in water, food, milk, alcohol and the workplace, and public policies on vaccination have been developed. Now we require a greater understanding of human behaviour, and must learn to develop more effective strategies for changing harmful behaviour. Whereas cigarettes are currently advertised in every country of the world, no one is spending billions of dollars to advertise the pleasures of salmonella or cryptosporidium contamination of the water! The challenge is to change behaviour in the face of massive advertising of tobacco, unhealthy foods and alcohol (which do indeed impart pleasure, at least in the short term). As many health-damaging behaviours are cumulative, we need to reduce unhealthy behaviour in as many people as possible. If we focus at the population level, we will benefit vast numbers of individuals.

Whether public health will have the same impact on health in the twenty-first century as it has in the twentieth is unclear. However, it is not unreasonable to speculate that the unfolding knowledge of the human genome, together with epidemiology and biostatistics (core disciplines in public health) and computational biology, will have an increasingly important role in relating genetic variation and differences in gene expression to individual disease susceptibilities. Studies of large cohorts, such as the Harvard Nurses' Health Study — a group of 121,000 healthy women followed for 23 years (ref. 5) — will enable investigators prospectively to link not only extrinsic and environmental influences such as diet or smoking to the risk of developing a particular disease, but to associate extrinsic risks with intrinsic genetic susceptibility and resistance. This will essentially allow those in the rich countries to create individually tailored drug regimens and behavioural modifications for overcoming predicted individual disease risks, creating a new field of 'boutique medicine'.

But we live in a world in which many countries are not sharing in the benefits of globalization and are not experiencing increased standards of health and quality of life. Much of the knowledge about individual risks that will derive from the Human Genome Project and modern biomedical science, and the resources to obtain the boutique treatments and preventions to overcome these risks, will simply not be available to the 85 per cent of the world's population who comprise the Third World. One can only hope that from the scientific knowledge gained in our current pursuit of sophisticated approaches to treat disease at the individual level, there will also emerge effective, safe and affordable preventions and treatments of relevance at the population level. It is principally these interventions that will make a difference in reducing the global disparities in health and in improving the life expectancy and quality of life of people in even the poorest countries and parts of countries.
We know that cardiovascular disease, infectious diseases, psychiatric disease and physical injuries represent the major global burdens of disease and disability in industrialized and developing countries alike. The challenge for biomedical science and public health in the coming century is to develop the population-based interventions needed to reduce these burdens. Vaccines to prevent AIDS, malaria, tuberculosis, dysentery and other respiratory and diarrhoeal diseases are needed. In addition, effective drugs to prevent as well as to treat cardiovascular disease and psychiatric illness are required, as are effective interventions to prevent injuries, for example, improving road and automobile safety worldwide (Fig. 2) and preventing injuries due to falls in the elderly. That, in my view, will be the major global challenge for biomedical science and public health in the twenty-first century.

Figure 2 Traffic accidents are becoming a major killer in developing countries.

Barry R. Bloom is at the Harvard School of Public Health, 677 Huntington Avenue, Boston, Massachusetts 02115, USA.
e-mail: bbloom@hsph.harvard.edu

1. Centers for Disease Control. Deaths: Final Data for 1997 Vol. 47 (National Center for Health Statistics, 1999).
2. McGinnis, J. M. & Foege, W. H. J. Am. Med. Assoc. 270, 2207–2212 (1993).
3. World Health Organization (WHO). World Health Report 1999 1–121 (WHO, Geneva, 1999).
4. Murray, C. J. L., Michaud, C. M., McKenna, M. T. & Marks, J. S. US Patterns of Mortality by County and Race: 1965–1994 1–97 (Harvard School of Public Health, Cambridge, Massachusetts, and Centers for Disease Control and Prevention, Atlanta, Georgia, 1998).
5. Colditz, G. A. et al. The Nurses' Health Study: a 20 year contribution to the understanding of health among women. J. Womens Health 6, 49–62 (1997).
6. World Bank. World Development Report 1993: Investing in Health p. 34 (Oxford Univ. Press, 1993).
7. Preston, S. H., Keyfitz, N. & Schoen, R. Causes of Death: Life Tables for National Populations (Seminar Press, New York, 1972).

Transitions still to be made


Philip Ball

A collection of many particles all interacting according to simple, local rules can show behaviour that is anything but simple or predictable. Yet such systems constitute most of the tangible Universe, and the theories that describe them continue to represent one of the most useful contributions of physics.
Physics in the twentieth century will probably be remembered for quantum mechanics, relativity and the Standard Model of particle physics. Yet the conceptual framework within which most physicists operate is not necessarily defined by the first of these and makes reference only rarely to the second two. The advances that have taken place in cosmology, high-energy physics and quantum theory are distinguished in being important not only scientifically but also philosophically, and surely that is why they have impinged so forcefully on the consciousness of our culture.

But the central scaffold of modern physics is a less familiar construction — one that does not bear directly on the grand questions that physicists are popularly expected to address but instead defines our current understanding of phenomena at the prosaic energy and length scales characteristic of our everyday experience. Statistical physics, and more specifically the theory of transitions between states of matter, more or less defines what we know about 'everyday' matter and its transformations.

Moreover, it provides the conceptual apparatus for tackling complex collective quantum phenomena of intense topical interest such as Bose–Einstein condensation (in which a collection of particles all occupy the same quantum ground state) and high-temperature superconductivity (that is, superconductivity above about 35 K). Many of the states of condensed matter that promise new technological applications, ranging from block copolymers to magnetic multilayers, can be understood as the consequence of the kind of collective behaviour that statistical physics describes.

There are still central issues in cosmology and high-energy physics whose solution requires an understanding of phase transitions, not least the primordial symmetry-breaking transitions that distinguished the fundamental forces and gave particles their masses by means of the Higgs mechanism. And in its most generalized form, statistical physics is promising to offer insights into phenomena once considered outside the physicist's domain: traffic flow, economics, cell biology and allometric scaling (the relation of biological functions to body mass), to name a few.

That such a versatile discipline as statistical physics should have remained so well hidden that only aficionados recognize its importance is a puzzle for science historians to ponder. (The topic has, for example, in one way or another furnished 16 Nobel prizes in physics and chemistry.) Perhaps it says something about the discipline's humble beginnings, stemming from the work of Rudolf Clausius, James Clerk Maxwell and Ludwig Boltzmann on the kinetic theory of gases. In attempting to derive the gas laws of Robert Boyle and Joseph Louis Gay-Lussac from an analysis of the energy and motion of individual particles, Clausius was putting thermodynamics on a microscopic basis. But from a modern perspective, his programme was deeper still: he was attempting to understand the collective behaviour of interacting, many-body systems. This, it might be argued, is the defining objective of statistical physics in all its guises.

Figure 1 The Ising model at the critical point. Each site on this two-dimensional lattice can adopt one of two states — black or white, corresponding to 'up' or 'down' spins in a ferromagnet. At the critical point, neither state predominates, and fluctuations occur on all length scales. (Courtesy of Alistair Bruce, University of Edinburgh.)

Critical ideas
'Phase transition' is today a debased term — like the classical equivalent of 'quantum leap', it tends to attach itself to any abrupt change in a system's behaviour. Does a single molecule, such as a protein, undergo a phase transition if it abruptly changes conformation? In the strict sense, no. A genuine transition requires that there be some singularity in a thermodynamic potential (such as the Gibbs free energy), which in itself requires that one can characterize the states of the system in a 'thermodynamic limit' of infinite system size. But too much generality may be no bad thing, if it drives home the message that phase transitions occur not only when a liquid freezes or evaporates but also throughout the (once sub-microscopic) Universe as it cools, or in a superfluid as its viscosity vanishes. The point is that phase transitions are global and abrupt — they show matter behaving at its most nonlinear, with effects quite out of proportion to cause.

At least with (dilute) gases one can afford to neglect interparticle attractive forces with some justification. Phase transitions enter into the picture, however, when those forces are included. Johannes Diderik van der Waals, who introduced such forces in a heuristic manner using what we would now call a mean-field theory, found that he could describe the gas–liquid transition. In van der Waals' theory, the particles have a hard repulsive core and an infinitesimally small attraction of infinite range (although this is not the way the Dutchman expressed it).
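In modern notation (a standard textbook rendering rather than van der Waals' own), the equation of state reads

\left(p + \frac{a}{V^2}\right)(V - b) = RT,

where b stands for the hard repulsive core and a for the weak long-ranged attraction. It predicts a critical point at

V_c = 3b, \qquad p_c = \frac{a}{27b^2}, \qquad T_c = \frac{8a}{27Rb}.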
Van der Waals was awarded the Nobel prize in 1910 and is regarded as something of a founding father for statistical physics. So far did his vision penetrate that in 1998 the physicist Ben Widom, in his Boltzmann Medal address, could still ask "what do we know that van der Waals did not know?", and answer "not very much" (ref. 1). In particular, he was well aware not only of the gas–liquid critical point (which his equation predicts) but also of the existence of critical exponents, which describe mathematically how various properties vanish or diverge at the critical point. It is at this unique point in the 'phase space' of temperature, pressure and density that a liquid and gas cease to be distinct and separated by a phase transition: above the critical temperature, there is only one fluid phase.
Thermally driven fluctuations in density of the liquid and gas (caused by the mere randomness of particle motions) become increasingly pronounced as the critical point is approached; and their range becomes infinite exactly at criticality, dragging with them so-called 'response functions' such as the fluid's compressibility.

From the 1960s to the 1980s, nothing obsessed statistical physicists more than critical points. It seems strange, at first glance, that so much attention should be focused on a specific location in the phase diagram; but the reasons are twofold. First, the behaviour of a system at its critical point also determines its behaviour in the broad vicinity too, within the so-called critical region. The fluctuations that overwhelm the system at the critical point remain significant well beyond it; one of the reasons why the (controversial) idea of a high-pressure, low-temperature liquid–liquid critical point in water (ref. 2) is so stimulating is that it might be expected to affect the liquid's behaviour under everyday conditions.

But second, behaviour of a system at a critical point is like a badge of identification: it reveals kinships between different systems. Liquid–gas criticality and the behaviour of some magnets at their Curie point (the temperature above which they lose their ferromagnetism) have numerically equal critical exponents, and both can be modelled by the so-called Ising model, a lattice of two-state spins. Commonality of critical exponents gives rise to the idea of universality — that is, there are generic models in statistical physics that describe a variety of apparently different many-body systems. This means that solving one statistical mechanical problem generally delivers solutions for several others at the same time; it also implies that, fundamentally, many-body behaviour is determined only by broad-brush features such as the range of interparticle forces, the dimensionality, and the nature of the 'order parameter' whose abrupt change from zero to a non-zero value defines the transition.
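The Ising model is also the easiest of these generic models to put on a computer. The sketch below, a textbook Metropolis Monte Carlo simulation in Python rather than anything taken from this article, evolves a small square lattice at a temperature close to criticality (in units where the coupling J and Boltzmann's constant are 1):

import math
import random

def metropolis_ising(L=32, T=2.27, steps=200_000):
    """Evolve an L x L Ising lattice by single-spin Metropolis updates.

    T = 2.27 lies close to the critical temperature of the square
    lattice, where fluctuations occur on all length scales (cf. Fig. 1).
    """
    spins = [[random.choice((-1, 1)) for _ in range(L)] for _ in range(L)]
    for _ in range(steps):
        i, j = random.randrange(L), random.randrange(L)
        # Energy cost of flipping spin (i, j), with periodic boundaries.
        nn = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j] +
              spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
        dE = 2.0 * spins[i][j] * nn
        if dE <= 0 or random.random() < math.exp(-dE / T):
            spins[i][j] *= -1  # accept the flip
    return spins

config = metropolis_ising()
print(sum(sum(row) for row in config))  # net magnetization of the sample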
Actually obtaining numerical values for critical exponents from first-principles theory is, however, another matter. Mean-field models offer analytic solutions, but they are strictly approximations in anything less than four spatial dimensions. Lars Onsager's tour de force in 1944 was the exact solution of the two-dimensional Ising model, providing exact numbers for the critical exponents. But the three-dimensional Ising model continues to rebuff the advances of theoreticians, and may be analytically insoluble.
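For orientation, Onsager's exact results for the square lattice with nearest-neighbour coupling J include the critical temperature

k_B T_c = \frac{2J}{\ln(1 + \sqrt{2})} \approx 2.269\,J

and a magnetization exponent of 1/8, far from the mean-field value of 1/2.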
On the other hand, Kenneth Wilson's renormalization group theory provided in the 1960s and early 1970s a methodology for computing critical exponents numerically (ref. 3). It also pointed to the scale-invariant behaviour of fluctuations at the critical point: the fact that they occur on all length scales (Fig. 1). (Gaussian random noise, in contrast, generates fluctuations of a characteristic average amplitude.)

The principle of renormalization is to capture the fundamental probability distribution of the different states of the system by 'coarse-graining' — a kind of mathematical squinting that eliminates extraneous detail. In a lattice model such as the Ising model, where the particles occupy sites on a regular grid, this involves calculating the average state of blocks of sites of specified size. Thus, whereas in the Ising model each site is assigned a two-state variable ('up' or 'down' spin, say, represented by the black and white squares in Fig. 1), the renormalized system contains a broader spectrum of averaged 'block variables'. The interaction strength between blocks is rescaled accordingly.

Progressive rescaling at different block sizes smoothes out ever-larger fluctuations. At temperatures either side of the critical temperature, this makes the system 'look' ever further from criticality — it begins to resolve itself into one equilibrium state or the other. Exactly at the critical point, however, rescaling creates a patchwork rather like that in Fig. 1 (but with grey squares too) no matter how large the blocks become. The probability distribution of block variables settles down to an invariant form, peaked on 'black' and 'white'. The way in which this 'configuration flow' evolves with changing length scale allows one to determine the precise value of the critical exponents — which are generally different, in one, two and three dimensions, from their mean-field values.
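One concrete coarse-graining rule is easily sketched in Python; the majority rule assumed here is a common choice for illustration, not the only possibility:

import random

def block_spin(spins, b=3):
    """Coarse-grain an L x L array of +/-1 spins into majority-rule blocks."""
    L = len(spins)
    coarse = []
    for bi in range(0, L - L % b, b):
        row = []
        for bj in range(0, L - L % b, b):
            total = sum(spins[i][j]
                        for i in range(bi, bi + b)
                        for j in range(bj, bj + b))
            row.append(1 if total >= 0 else -1)  # ties resolved to 'up' here
        coarse.append(row)
    return coarse

# A random configuration stands in for an equilibrated one in this demo.
# Repeated blocking of an off-critical configuration drives it towards
# uniformity; a critical one keeps its patchwork appearance.
spins = [[random.choice((-1, 1)) for _ in range(81)] for _ in range(81)]
once = block_spin(spins)   # 27 x 27 block variables
twice = block_spin(once)   # 9 x 9, and so on up the length scales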
Quantum transitions
One of the richest veins of statistical mechanics presently being mined is found at its intersection with quantum mechanics. In particular, the many-body behaviour of electrons in condensed matter is extraordinarily rich. Correlated behaviour of electrons, in which they display a degree of collective or coherent dynamics, produces for example superconductivity, the integer and fractional quantum Hall effect (quantization of the Hall resistance, a measure of the voltage generated transverse to a current in a flat conducting sheet by an applied magnetic field), heavy-fermion behaviour (where conduction electrons acquire a very large 'effective mass'), spin density waves and colossal magnetoresistance (CMR: a strong variation in electrical resistance owing to an applied magnetic field). All of these collective phenomena have in recent years been shown to underlie unexpected and potentially useful properties of novel materials: CMR, for instance, could potentially furnish highly sensitive read-out heads for magnetic memories.

Superconductivity and superfluidity were always very evidently phase transitions — abrupt changes from a resistive to a non-resistive state, from a viscous to a non-viscous fluid. It was not until 1938, however, that the connection to statistical physics was made, when Fritz London pointed out (ref. 4) that superfluidity in liquid helium might be the result of a kind of quantum condensation transition, in which the particles of the system become bosonic (that is, having integer spin) and so capable of occupying a single quantum state. It became generally accepted that these phenomena were examples of Bose–Einstein condensation, although the exact connection remains murky.

In superconductivity the fermionic (spin-1/2) electrons become bosons by forming Cooper pairs, a many-body effect that (in the conventional low-temperature superconductors) results from an effective attraction mediated by the electrons' interaction with lattice vibrations (phonons) in the crystal. The full details of that process were determined in the 1950s by John Bardeen, Leon Cooper and Bob Schrieffer (ref. 5). It is significant that this is one of the many phase transitions that can be described in an approximate way by the phenomenological theory developed by Lev Landau and Vitaly Ginzburg in the 1950s. This stemmed from a very general mean-field model of phase transitions proposed earlier by Landau, whose contribution to this area of condensed-matter physics was pivotal. Landau's theory — a kind of generalized and augmented all-purpose van der Waals equation — remains the first port of call for any simplified model of a phase transition.
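In its simplest form, quoted here for orientation, Landau's free energy is an expansion in the order parameter \phi,

F(\phi) = F_0 + a\,(T - T_c)\,\phi^2 + b\,\phi^4 \qquad (a, b > 0),

whose minimization gives \phi = 0 above T_c and \phi = \pm\sqrt{a(T_c - T)/2b} below it: the mean-field exponent of 1/2 once again.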
The observation of Bose–Einstein condensation in atomic matter, seen for the first time in cooled sodium atoms in 1995 (ref. 6), was itself another instance of a quantum phase transition — made possible now by laser cooling techniques.

But is there any systematic way to accommodate quantum effects into the existing framework of statistical mechanics? The intense interest in correlated electron systems has now provided something of the kind. Conventionally, phase transitions are induced by changes in temperature — ice melts when heated, hot iron orders magnetically when cooled. All this is driven by changes in thermal fluctuations. 'Pure' quantum phase transitions, meanwhile, take place at zero temperature, and are induced by altering some other parameter, such as the strength of an applied magnetic field, that affects quantum fluctuations. Classical fluctuations are frozen out at zero kelvin, but quantum fluctuations, which are a consequence of the uncertainty principle, remain. Their effect can be enhanced by altering some variable that alters the particles' (generally the electrons') state of localization, which is the equivalent in this case of altering the temperature.
Quantum phase transitions provide a wider screen on which to scrutinize the still-enigmatic high-temperature superconductivity of layered copper oxide compounds. Superconductivity is now recognized as just one of the manifestations of the rich physics of the correlated electrons in these systems. There are magnetic interactions between the copper ions, which bear spins by virtue of their unpaired electrons. The prototype, lanthanum strontium copper oxide (close to the material studied by Georg Bednorz and Alex Müller in 1986; ref. 7), is now recognized as an example of a diluted quantum antiferromagnet: diluted, that is, by the 'dopant' strontium, which contributes positively charged mobile charge carriers (holes) to the copper-oxide layers. These holes may segregate into stripes, and the intervening phase, an insulating antiferromagnet, undergoes a magnetic quantum phase transition at some critical doping level.

The behaviour of such systems at energy scales corresponding to temperatures above 100 K or so is now well understood; but the transition to a superconducting state at lower temperatures involves additional interactions of the electronic and spin degrees of freedom that have yet to be elucidated. At least one Nobel prize lies in wait for this enterprise.

Beyond equilibrium
Some of the major unresolved questions about the behaviour of both everyday and exotic matter are not so much a matter of being as of becoming. How does change in these many-particle ensembles occur? Traditionally, thermodynamics has concerned itself with equilibrium states. The particles might be in frantic motion, but the macroscopic properties are constant observables of the states that minimize free energy: pressure, density, temperature, magnetization and so forth. But how do systems actually get from state to state (particularly in cases of violent change, such as explosions or fracture)? And what about systems that do not reach equilibrium at all?

First, there is the issue of irreversibility, which is to say, of the Second Law of Thermodynamics. Thermodynamic change has a directionality to it: entropy increases. The question is why. The answer most would give is Boltzmann's, which is purely probabilistic: entropy being related to the number of configurations available to the ensemble, those states will be achieved that have overwhelmingly greater probability. In the words of US physicist Josiah Willard Gibbs, "the impossibility of an uncompensated decrease in entropy seems to be reduced to an improbability".

But questions, even lingering doubts, persist. At the microscopic scale, the interaction between two particles is generally considered to be time-reversible: an encounter can be run in reverse without becoming a physical nonsense. So in what manner does the Second Law break down in going from irreversible macroscopic systems to reversible microscopic ones? And what is the relationship to the chaotic dynamics evident even in few-particle (sometimes even two-particle) systems? Perhaps instead there is some feature of the time-dependent evolution of probability distributions that forces irreversibility irrespective of scale? In other words, might irreversibility be built in at the microscopic level (contrary to the normally accepted tenet of microscopic reversibility)? Furthermore, the Boltzmann position requires that, as Richard Feynman put it, "For some reason, the universe at one time had a very low entropy". But for what reason?

Second, there is the question of whether a thermodynamic (and corresponding microscopic) description of non-equilibrium systems can be developed that mirrors the mature description of equilibrium states. As most processes of interest, from atmospheric circulation to cell metabolism, take place out of equilibrium, the issue is a pressing one. A key question is whether variational or minimization principles exist for selecting the most stable dynamic state, as is the case at equilibrium. Several have been suggested, amongst them Ilya Prigogine's minimal rate of entropy production for systems close to equilibrium; but there is no consensus, and at least some indication (the late Rolf Landauer's 'blowtorch theorem'; ref. 8) that there can in fact be no universal principle of this sort, independent of a system's past history, away from equilibrium. In short, it matters not only what state a system is in, but how it got there.

In fact, for all that one can postulate that entropy production must be positive in non-equilibrium processes (so that the Second Law is not violated if and when equilibrium is reached), it is not even clear how to calculate the production rate because there is no generally accepted definition of entropy away from equilibrium. Boltzmann's S = k ln W (relating entropy S to available microstates W) will not give it to us, nor will any other general law. While that is so, a quantitative non-equilibrium statistical mechanics remains hard to formulate. One possible way forward is to use as the appropriate variables of dynamical steady states the time-invariant probability measures developed in the 1970s by Yakov Sinai, David Ruelle and Rufus Bowen.

But as yet, the theory of non-equilibrium remains largely a heuristic one. It is evident that lack of equilibrium does not imply lack of structure — many if not most of the richest patterns in nature (Fig. 2) are formed out of equilibrium. Microscopic models such as reaction–diffusion schemes do a fair job of capturing some of this behaviour, and 'toy' equations such as the Swift–Hohenberg equation provide a general description of some of the symmetry-breaking processes that occur (ref. 9); but non-equilibrium systems seem particularly prone to the influences of boundary conditions, defects and noise, and remain resistant to too much generalization.

All the same, some useful broad principles have appeared, among them the idea that certain non-equilibrium systems have a kind of imposed criticality said to be self-organized, in the sense that they will constantly return to a precarious critical state following some transient instability such as a landslide (ref. 10). The characteristic statistics of such systems, dubbed 1/f behaviour because of the inverse relationship between the size and frequency (f) of a fluctuation, provides a kind of fingerprint that alerts the observer to the likely appearance of scale-invariant structure and dynamics. Self-organized criticality seems to be a promising candidate for a general mechanism capable of forming the fractal structures so prominent in nature, from mountain ranges to the large-scale structure of the Universe. There is a clear kinship with the fractal 'optimal channel networks' that have been posited (ref. 11) as models of real river drainage networks. These are formed under the assumption that the evolution must minimize the rate of potential-energy dissipation in the water flow. The resulting topology of the drainage basin shows non-random 1/f statistics. To what extent this model captures the essential features of real river systems is still a matter of debate.
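The numerical experiment behind the idea (ref. 10) is easy to reproduce in outline. In the Python sketch below, grains are dropped one by one onto a small table; any site that accumulates four grains topples, shedding one to each neighbour, and the avalanches so triggered span a vast range of sizes:

import random

def sandpile_avalanches(L=20, grains=20_000):
    """Drop single grains on an L x L pile; return the avalanche sizes."""
    height = [[0] * L for _ in range(L)]
    sizes = []
    for _ in range(grains):
        i, j = random.randrange(L), random.randrange(L)
        height[i][j] += 1
        size = 0
        unstable = [(i, j)]
        while unstable:
            x, y = unstable.pop()
            if height[x][y] < 4:
                continue  # already relaxed by an earlier topple
            height[x][y] -= 4
            size += 1
            for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if 0 <= nx < L and 0 <= ny < L:  # edge grains fall off
                    height[nx][ny] += 1
                    unstable.append((nx, ny))
        sizes.append(size)
    return sizes

sizes = sandpile_avalanches()
print(max(sizes))  # occasional avalanches vastly larger than the typical one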
And it remains to be seen whether self-organized criticality itself will prove to be generalizable to forest fires, epidemics, solar flares and the countless other physical systems in which such statistics have been reported. But the canonical example, the sand pile, points to another growth area in studies of the behaviour of matter — granular media. Capable of both solid-like and fluid-like behaviour, dominated by dissipative collisions and essentially independent of temperature, granular media represent a new class of problem for many-body theorists. There is still no general physical framework, comparable to the kinetic model of gases, able to predict the stupefying range of collective and self-organized behaviour exhibited by grains in motion: size segregation in vertical shaking and landslides, formation of ordered patterns (ref. 12) (Fig. 2), liquefaction, hysteretic angles of instability and so forth. Quite aside from the relevance of much of this for industrial powder processing, there is the matter of whether it might cast light on the geomorphology of grainy media distributed by wind and wave.

Traffic flow is now understood to be a special case of granular flow, and exhibits dynamic states that bear striking analogy to the equilibrium states of matter.
In low-density 'free' flow, each vehicle moves more or less independently, like a gas. A traffic jam at high densities is resolutely solid-like, with equidistant particles trapped in near or total immobility. But jams rarely nucleate from free flow; instead, it seems that a dense, congested yet mobile state intervenes, the highway equivalent of a liquid state. Changes from one state to another seem to have the abrupt character of a first-order phase transition (like freezing or melting), relying on the presence of fluctuations to nucleate the change.
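A standard minimal model of this behaviour, the Nagel–Schreckenberg cellular automaton (chosen here as an illustration; the text above does not name one), reproduces free flow and spontaneously nucleating jams with just four rules: accelerate, brake, dawdle at random, move.

import random

def step(road, length=100, vmax=5, p_dawdle=0.3):
    """Advance {position: speed} one time step on a circular road."""
    positions = sorted(road)
    new_road = {}
    for idx, pos in enumerate(positions):
        v = min(road[pos] + 1, vmax)                  # accelerate
        ahead = positions[(idx + 1) % len(positions)]
        gap = (ahead - pos - 1) % length
        v = min(v, gap)                               # brake: never hit the car ahead
        if v > 0 and random.random() < p_dawdle:      # random dawdling
            v -= 1
        new_road[(pos + v) % length] = v              # move
    return new_road

road = {pos: 0 for pos in random.sample(range(100), 30)}  # density 0.3: jammy regime
for _ in range(200):
    road = step(road)
print(sum(road.values()) / len(road))  # mean speed: far below vmax once jams form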
Perhaps the hardest of non-equilibrium problems — turbulence — remains as obstinate as ever. The mathematician Sir Horace Lamb's famous quip — that he was more optimistic about receiving heavenly enlightenment on quantum electrodynamics than on turbulence — proved prophetic in terms of which would yield first, and turbulence still retains something of its reputation as a 'graveyard of theories'. From a statistical physical perspective, the situation is nightmarish: every parcel of fluid acts non-trivially on the others, there is structure at all length scales, dissipation is very strong, and the solutions are wholly time-dependent. Ideas from critical phenomena might prove helpful for characterizing the scaling behaviour of turbulent flows. Yet questions remain, for example, about just how many distinct regimes of thermal turbulence (as a function of Rayleigh number, proportional to the temperature gradient) there are. Each regime is characterized by a specific scaling law relating Rayleigh number to heat transport; recent predictions of an 'ultrahard' regime relevant to atmospheric dynamics have been met with conflicting experimental results (ref. 13).

Even in equilibrium, a strong element of disorder in a system complicates the picture. The glass transition withholds some of its mysteries still, exemplifying a whole range of systems — among them spin glasses and folded proteins — in which the problem is that of finding a global energetic minimum in a rugged 'energy landscape'. How this landscape is explored as a liquid is lowered towards its glass transition is in need of further clarification before a truly thermodynamic description of this transition can be developed. The conceptual tools needed for the job are also being usefully brought to bear on the vexed issue of protein folding. This compares with a glass in the sense that the system of particles (in this case interacting residues on the polypeptide chain) has a great many configurations that correspond to local free-energy minima, but only one — the native fold — that equates with the global minimum. The protein experiences 'frustration' as it folds: an amino-acid residue cannot simultaneously optimize interactions with all its neighbours. Mapping this situation onto the case of frustrated magnetic spin glasses known in condensed-matter physics has helped to explain the wide range of protein folding speeds (a function of the degree of frustration) and has provided the useful concept of a folding 'funnel', the broad basin in the energy landscape that surrounds the deep well of the native fold (ref. 14).

Is physics rigorous?
It can be remarkably hard to prove rigorously the development of genuine long-ranged (crystal-like or ferromagnetic) order in an equilibrium phase transition — particularly if the degrees of freedom are continuous rather than discrete (for example, in Heisenberg rather than Ising models of magnetic states, where the spins can take any orientation). We 'know' from numerical modelling that such systems do adopt long-range order in appropriate circumstances; but we cannot prove that they do so rigorously. Physicist Elliott Lieb's comment on this issue applies equally to many areas of physics: "One might ask why one should bother to prove rigorously what is 'physically obvious' and the answer is that 'Not everything that is obvious is true and the ability to prove something interesting about a model usually requires an additional degree of physical understanding that goes beyond intuition; in other words, we learn something new about physics'." (ref. 15)

Such rigour is, however, surely an impossible dream when one comes to apply the elegance of statistical physics to the orchestrated chaos we call life. Despite the proven value to cell biology of some concepts from the study of phase transitions — such as the entropic effect of fluctuations on interactions of lipid membranes — there remains much scepticism as to whether any biological phenomena can arise from the sort of collective, emergent behaviour of statistical, interacting ensembles rather than the closely controlled protein relays to which cell biologists are accustomed. Yet statistical physics must inevitably provide the baseline even in the cell: proteins may phase-separate and membranes may adopt equilibrium conformations unless actively opposed. The recent interest in thermal ratchets (ref. 16) — Brownian systems that achieve directional motion by virtue of operating in an asymmetric underlying potential — attests to the contributions that microscopic physical models might make to cell biology. These models may or may not, in the end, have much to do with the way that motor proteins work; but they demonstrate that physics offers creative solutions to adaptive biological systems. The same can be said for the idea that stochastic resonance — the (counterintuitive) noise-induced amplification of a signal (ref. 17) — is exploited in biological signal transduction. There now seems to be good evidence that this mechanism has some role in neural processing, and one might even be surprised if it did not turn out to be more general. In both of these cases, we see how noise — inevitable in any environment — can, with the right adaptation, serve a functional purpose.

The considerable redundancy evident in the cell's machinery — so that cells continue happily with disabled genes that were supposedly central to their survival — suggests the need for some kind of nonlinear collective modelling of the interactions among its components. This is the kind of thing that physicists have been doing for years, and in increasingly complex systems. Cells provide perhaps the ultimate challenge, although the modelling will have to be dosed with a strong element of biological good sense.

Figure 2 Complex, ordered patterns form in vertically oscillated thin layers of grains. (From ref. 12.)

Philip Ball is a consultant editor for Nature.
e-mail: p.ball@nature.com

1. Widom, B. Physica A 263, 500 (1999).
2. Mishima, O. & Stanley, H. E. Nature 396, 329 (1998).
3. Bruce, A. & Wallace, D. in The New Physics (ed. Davies, P.) (Cambridge Univ. Press, 1989).
4. London, F. Nature 141, 643 (1938).
5. Bardeen, J., Cooper, L. N. & Schrieffer, J. R. Phys. Rev. 108, 1175 (1957).
6. Anderson, M. H., Ensher, J. R., Matthews, M. R., Wieman, C. E. & Cornell, E. A. Science 269, 198 (1995).
7. Bednorz, J. G. & Müller, K. A. Z. Phys. B 64, 189–193 (1986).
8. Landauer, R. Phys. Rev. A 12, 636 (1975).
9. Cross, M. C. & Hohenberg, P. Rev. Mod. Phys. 65, 851 (1993).
10. Bak, P., Tang, C. & Wiesenfeld, K. Phys. Rev. Lett. 59, 381 (1987).
11. Rodriguez-Iturbe, I. & Rinaldo, A. Fractal River Basins (Cambridge Univ. Press, 1997).
12. Melo, F., Umbanhowar, P. B. & Swinney, H. L. Phys. Rev. Lett. 75, 3838 (1995).
13. Glazier, J. A., Segawa, T., Naert, A. & Sano, M. Nature 398, 307 (1999).
14. Wolynes, P. G. & Eaton, W. A. Physics World 12(9), 39 (1999).
15. Lieb, E. Physica A 263, 491 (1999).
16. Astumian, R. D. Science 276, 917 (1997).
17. Wiesenfeld, K. & Moss, F. Nature 373, 33 (1995).

timescales

Keeping time

How finely can time be measured? The answer might define our capacity to conduct precise tests of general relativity, as demonstrated already in the observed
increase in clock rate with distance from the Earth’s surface, or the measurement of relativistic effects in binary pulsar systems. Spacecraft navigation,
interferometric radio astronomy and Global Positioning System geodesy are all dependent on timekeeping that taxes current technology. And accurate
measurements of the second define other standards, such as the units of distance and voltage.
Measuring time is about monitoring oscillations, whether of the pendulum, the quartz crystal or the caesium atom. By relying on
atoms for our frequency (and hence time) standards, we may at least avoid placing ourselves at the mercy of the machinist who
made the device; but on the other hand, we fall foul of the Uncertainty Principle. Atomic transitions, of which the hyperfine
splitting of energy levels in caesium-133 furnishes us with today’s standard second, happen at a frequency that can
ultimately be well defined only to the degree that the excited state is long-lived. The shorter the lifetime — or the duration
of the measurement — the greater the uncertainty in energy (and frequency) of the transition.
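In symbols, the energy-time uncertainty relation implies a linewidth that narrows as the interrogation time T grows:

\Delta E\,\Delta t \gtrsim \frac{\hbar}{2} \quad\Longrightarrow\quad \Delta\nu \sim \frac{1}{2\pi T}.

For the caesium hyperfine transition at 9,192,631,770 Hz, the frequency that defines the SI second, an interrogation time of one second thus pins the transition down to a fractional width of order 10⁻¹¹ in a single shot; averaging over repeated launches does the rest.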
So to get a more accurate result, one needs a more leisurely attitude to measurement. This is the
principle behind the ‘atom fountain’, a device that first cools atoms within an optical ‘molasses’ of three crossed laser
beams before gently launching them vertically so that they fall back under gravity. The atoms can then be probed
outside the perturbing influence of the laser field.
Towards the top of the fountain, the atoms pass twice through a microwave cavity: once on the way up, once on
the way down, separated by about a second. Invented by Norman Ramsey in 1949, this ‘separated oscillatory-field
method’ is applied to a linear atomic beam in current atomic clocks. But the atom fountain permits a much longer span
between pulses, and so a greater narrowing of the transition. In principle these devices offer a frequency determination
accurate to one part in 10¹⁶, which is equivalent to a variation of about one second in 300 million years. Philip Ball

Circadian clocks
How do organisms respond to their most fundamental timescale — the Earth’s rotation around its own axis? With a
suite of interconnected hormonal and physiological loops that make up the ‘circadian timing system’ or CTS.
Every CTS — whatever the organism — is made up of light receptors, an endogenous ‘biological’ clock and a so-called
‘entrainment’ system to couple one with the other. For although every cell of every biological or circadian clock behaves cyclically,
these intrinsic periods are not necessarily exactly 24 hours long. The clock has to be set.
Much of the pathology and endocrinology of the circadian clock has been elucidated in the past century. In humans, for
instance, many sections of the route from retina to brain to heart-muscle action, metabolism, body temperature and the
organization of sleep–wake cycles have been charted. But there are still major areas of uncertainty, especially in the newest
branch of circadian biology: the identification and study of genes necessary for normal timekeeping.
Clock genes seem to be present in all cells, even though only a subset is capable of acting as self-sustained oscillators.
So do these genes have other, global, non-circadian functions in cell biology? As Russell Foster of Imperial College, London, puts
it: “cellular assays have shown the way forward, now we need good functional analysis using transgenic approaches and
behavioural studies”.
Moreover, only a few animal systems have been studied so far in any detail — the mouse, the fly and the fungus,
Neurospora. This has revealed some remarkable similarities but also some exciting differences. Comparisons of additional
species will surely tell us more about the evolution and adaptive significance of circadian systems and their interactions with
their environments. All of which will hopefully feed back into a fine-tuned understanding of, and ability to treat or even to
manipulate, the one timepiece that we all care about — the human body. Sara Abdulla

The challenge of conservation


Some disciplines move faster than others, but if there is one moving much faster than its practitioners would like,
it is conservation biology. In the next century, conservationists will be less interested in pristine wilderness — for such will
hardly exist — than in the success with which human beings manage the biosphere to the benefit of all its inhabitants, including
humans. The world is currently in an unprecedented situation in which Homo sapiens — just one species out of millions —
sequesters 45 per cent of the planet’s net terrestrial productivity and 55 per cent of its fresh water. Effective biodiversity systems
management will become an imperative if only to maintain the quality of human life, let alone that of other species.
In the coming decades, conservation biology will become a close analysis of the art of compromise, as humanity seeks
to understand the global ecosystem as a participant, not as an observer. “Within a century,” says Peter Raven, director of the
Missouri Botanic Garden, “the world will be a patchwork with varying degrees of biological richness, and the way people
respond locally to challenges will determine the nature of that compromise.”
What will we have lost by 2100? Raven estimates that two-thirds of all species will disappear if current trends
continue, but is nevertheless optimistic. “We can do better,” he says. “No major category of
organisms, or local community of organisms, need disappear completely.”
Ecologists are acutely aware that the maintenance of ecosystem function transcends the
conservation of any particular species, so high-technology initiatives such as ‘ex situ’ conservation
— maintaining banks of germplasm outside the context of a habitat — may become a side issue.
“High technology is unlikely to solve most of the major global environmental issues that we face,”
says David Tilman of the University of Minnesota. “Rather, lifestyle changes, most likely imposed by legislation, will be needed. Such legislation will be socially
sustainable only once citizens understand their long-term dependence on biodiversity and ecosystem services.” And Raven adds that it is “better for now to follow
Aldo Leopold’s dictum: ‘the first rule of intelligent tinkering is to save all the cogs and wheels’”. Henry Gee


Physics at the Planck time


Superstring theory, perhaps the most notorious attempt to unify general relativity with quantum mechanics, was once described as a piece of
twenty-first-century physics that fell into the twentieth century. Certainly, physicists struggling to marry these reluctant partners have long sensed that the
crucial piece of the puzzle has yet to be invented. But a theory of quantum gravity remains more a question of tidiness than anything else — quantum effects on space-time become important only on time and length scales well out of experimental reach. A quantum field theory of electromagnetism has plenty of practical relevance for applied physics, and such a field theory for the nuclear forces generates no shortage of testable predictions. But gravity yields to the quantum rod only at the so-called Planck scale: over distances of about 10⁻³³ cm and times of 10⁻⁴³ s.
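Those numbers follow from the only time and length that can be built from the constants any quantum theory of gravity must marry, namely ħ, G and c:

t_P = \sqrt{\hbar G / c^5} \approx 5.4 \times 10^{-44}\ \mathrm{s}, \qquad l_P = \sqrt{\hbar G / c^3} \approx 1.6 \times 10^{-33}\ \mathrm{cm}.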
A physics to match the Planck timescale is the biggest challenge to physicists in the coming century. This is the
timescale relevant to the graviton, the putative carrier of the gravitational force. And naturally enough, only such a
theory can extend our understanding of the Big Bang inside the first 10⁻⁴³ seconds of existence.
What will a theory of quantum gravity do to space-time? Will it be like John Wheeler’s ‘quantum foam’, furiously
alive with ephemeral black holes and worm holes? Or will it be more like Abhay Ashtekar’s fabric of woven loops? Will
particles become strings, and will the strings be supersymmetric? How many extra dimensions will turn up, curled out
of sight over the Planck distance?
And will we ever be able to test such a theory? One quantum-gravity researcher has confessed that “it seems
highly unlikely that a machine will ever be built with which these minute distances can be studied directly”. For
guidance, it seems that physicists may have to fall back on the old stand-bys: elegance and beauty. Philip Ball

The speed of computers


One version of Moore’s Law, predicted by Intel co-founder Gordon Moore in 1965, has it that computer chips double in speed every
18 months. Nothing moves any faster, of course — the chips simply become more powerful by packing more devices into the same
space. Speed is about miniaturization. Recent research on the ultimate size limits for silicon technology has fuelled
speculation about whether Moore’s Law is headed for a crash. By 2012, at the current rate of shrinkage, transistors will
have become so small that the silicon oxide films will no longer guarantee adequate insulation between conducting regions: they
become switches that cannot be fully turned off. What happens then to our expectation that the next generation of machines will
be better, meaner and faster?
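The arithmetic behind that expectation is simple compounding. The Python sketch below extrapolates from a hypothetical 1999 baseline of ten million transistors per chip (an assumed round number, not a quoted figure):

def transistors(year, base_year=1999, base_count=1e7, doubling_years=1.5):
    """Extrapolate a transistor count assuming doubling every 18 months."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1999, 2005, 2012):
    print(year, f"{transistors(year):.1e}")
# By 2012 the rule predicts a roughly 400-fold increase — the degree of
# shrinkage that collides with the oxide-film limit described above.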
All manner of factors determine how quickly, say, the Nature web page downloads onto your screen, many of them scarcely
connected to the component density on the chips inside the processor. In some parts of the world it can take ten minutes or more.
Yet you would be waiting a lot longer if the data were streaming down old copper telephone cables rather than glass optical fibres.
Fibre-optic networks have a greater carrying capacity that speeds up transmission by one or two orders of magnitude relative to
electrical lines. But that is only a fraction of what should be possible with photonic technology.
And the electronics at either end are soon to be stretched to their limit to cope with the transmission speeds that photonics
offers. Gigabit-per-second distributed networks have already been demonstrated, and laser diodes can in principle blink out 100
Gbits per second; but electronics lose the ability to cope at half that rate. All-optical networks will then be required, and
networks working at 100 Gbits per second are already in development. At some point, the light will need to go on a
chip, in photonic integrated circuits.
But photonics is an embryonic technology, and plagued with obstacles in comparison with mature electronics. No clear rival
to the transistor has yet emerged from this or any other direction, such as single-electron devices and molecular electronics. So
perhaps we will have to adjust our expectations as the information revolution reaches a plateau. Moore predicts that we have
about two generations of computers left with current technologies, and “beyond that, life gets very interesting”. Philip Ball

Lifespan extension
Eat more fibre! Consume less cholesterol! Drizzle more olive oil! Have fewer children later in life! Drink a glass of red wine a
day! And so it goes on. Rarely a day goes by without the emergence of a new ‘theory’ in that most seductive realm of scientific
advances — lifespan extension.
Chromosome structure hints that our cells have genetic ‘fuses’ that mete out their allotted time. Antibiotics, vaccinations
and antiseptics have brought about hikes in life expectancy in the developed world which have raised the average lifespan to
almost double that at the last fin de siècle — nigh-on 80 years and still rising — as this century draws to a close. Several once-intractable cancers, such as Hodgkin's lymphoma and leukaemia, have been tamed and many others look to follow
soon; the hard nut of cognitive decline is on the verge of being cracked with drugs and neuron replacement; and workable
synthetic organs and xenotransplants are around the corner.
But the current obsession with all things genetic, molecular, high-tech and expensive risks missing the point that
both the World Bank and World Health Organization (WHO) make patently clear. That is, for the majority of the
planet the most powerful lifespan-extending technologies of all — good, basic public health and
preventative medicine — are still a far-off dream, even if they are, scientifically speaking,
old hat.
As we usher in the third millennium, the WHO estimates that the citizens of the world’s poorest
countries (who make up two-fifths of the global population) have almost half the life expectancy —
at under 50 years — of those living in the richest nations. And that gap is widening, not narrowing, often from both directions at once. One third of the planet's population does not have secure access to safe water or sufficient food, let alone
drugs, immunizations, or even the simplest health education. Elsewhere, billions of dollars are being spent teasing out the intricacies of gene therapy and memory
enhancement. Addressing this inequity is surely one of the greatest and most pressing challenges facing us all, if not as academics, then as people. Sara Abdulla


Flashes in femtoseconds
For the molecule, time ticks in picoseconds, or even femtoseconds — respectively 10⁻¹² and 10⁻¹⁵ s. On such timescales, a molecule tumbles through space, an
atom vibrates back and forth in a crystal, a chemical bond is forged or broken. Yet the development of pulsed lasers blinking in a femtosecond staccato of flashes has
enabled these fundamental molecular processes to be charted, revealing the rise and decline of ephemeral intermediates under the furiously strobed illumination.
Femtosecond spectroscopy has shown us the dynamics of bond breaking — but what of the structural aspects? When will we be able to map out
the coordinates of individual atoms, as, for example, a protein docks onto the cell surface or as a chlorine radical eliminates an ozone
molecule? When will we see the first atomically resolved diffraction pattern of a true transition state, and watch frame by frame as
vibrations carry atoms into new unions?
This demands a marriage of the techniques of ultrafast spectroscopy with those of X-ray diffraction. The probe
beam becomes a pulsed X-ray source, its flashes brief enough to ‘freeze’ the atomic motions yet bright enough to provide a
discernible diffraction pattern. Current synchrotron sources can be pulsed at around 50 picoseconds, whereas ultrashort
laser pulses can excite bursts of X-rays from suitable targets that could, in principle, be over within 100 femtoseconds.
These pulses must be synchronized with an optical pumping beam, which gets the reaction underway. An
alternative is to diffract electrons rather than X-rays. Diffraction on the ‘molecular’ timescale of femtoseconds is an
infant discipline which promises wonders once perfected, but which is capable right now of only the crudest of
impressionistic sketches: blurred images of lattice dynamics, showing evidence of rapid change but without a single
molecule (let alone an atom) in focus. The static photography of the Braggs has yet to produce its first movie. Philip Ball
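As a numerical footnote to the pulse durations quoted above, this Python sketch counts, under assumed typical vibrational wavenumbers, how many vibrational periods each kind of flash spans; a probe lasting many periods smears out the very motion it is meant to freeze:

```python
# Rough check, under assumed typical vibrational wavenumbers, of how
# many vibrational periods each kind of X-ray flash spans.
C_CM_PER_S = 2.998e10  # speed of light in cm/s

def vibrational_period_s(wavenumber_per_cm):
    """Period of a molecular vibration, from its wavenumber in cm^-1."""
    return 1.0 / (C_CM_PER_S * wavenumber_per_cm)

pulses = {"synchrotron flash (~50 ps)": 50e-12,
          "laser-driven flash (~100 fs)": 100e-15}
modes = {"fast C-H stretch (~3000 cm^-1)": 3000,
         "slow lattice mode (~300 cm^-1)": 300}

for pulse_name, duration in pulses.items():
    for mode_name, wavenumber in modes.items():
        n = duration / vibrational_period_s(wavenumber)
        print(f"{pulse_name} spans ~{n:,.0f} periods of a {mode_name}")
```

A 50-picosecond synchrotron flash spans hundreds to thousands of vibrational periods; a 100-femtosecond laser-driven burst spans roughly one period of a slow mode — short enough, at last, to catch atoms in the act.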

Arrhythmias
‘His heart raced’. ‘Her heart skipped a beat’. ‘My heart beat sluggishly in my breast.’ Literature is peppered with such lines. Why?
Because, as many a dramatic writer has noted, a disturbance in that most basic ‘rhythm of life’ — the normal human heart rate of
between 60 and 100 beats per minute — is a sign that something is amiss.
Normally a heartbeat starts in the right atrium of the heart, in a specialized group of pacemaker cells known as the sinus node.
This node sends an electrical signal, which spreads through the atria and ventricles of the heart, following a precise route that
induces the heart to beat in the most efficient way.
If the sinus node develops an abnormal rate or rhythm, if the conduction pathway of its electrical signals is faulty or if another
part of the heart takes over as the pacemaker, arrhythmias — as the various types of irregular heartbeat are generically known
— can ensue. Arrhythmias may be due to heart disease, ageing, medications, metabolic imbalances or other genetic or infectious
medical problems. And the ineffective blood pumping that they give rise to can cause fatigue, dizziness, light-headedness, fainting
spells and, ultimately, strokes and death.
The cells of the sinus node oscillate by letting the charge difference between the outside and the inside of their
membranes decay cyclically. When the potential falls far enough, the membrane fires spontaneously: its conductance rises and
electrolytes rush in, reversing the potential across the membrane. This reversal propagates to neighbouring cells,
triggering the same sequence of events in each and hence a concerted muscular contraction.
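This firing-and-reset cycle can be caricatured with a minimal leaky-integrator model — a deliberate simplification with invented parameters, not a model of the real ionic currents:

```python
# A deliberately crude caricature of a sinus-node pacemaker cell: a
# leaky integrator whose potential drifts from a reset value toward a
# firing threshold, fires, and resets. All parameters are invented for
# illustration; this is not a model of the real ionic machinery.
V_REST, V_THRESHOLD = -60.0, -40.0  # membrane potential, mV
TAU, DRIVE = 0.8, 40.0              # leak time-constant (s), inward drive (mV/s)
DT, T_END = 0.001, 3.0              # time step and duration (s)

v, t, firing_times = V_REST, 0.0, []
while t < T_END:
    v += DT * ((V_REST - v) / TAU + DRIVE)  # leak back, drift up
    if v >= V_THRESHOLD:                    # spontaneous firing
        firing_times.append(round(t, 3))
        v = V_REST                          # reset after the beat
    t += DT

print("firing times (s):", firing_times)
if len(firing_times) > 1:
    bpm = 60 * (len(firing_times) - 1) / (firing_times[-1] - firing_times[0])
    print(f"rate ~ {bpm:.0f} beats per minute")
```

With these made-up constants the toy cell settles into a rhythm of about 75 beats per minute — squarely in the healthy range the essay opens with.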
But it is currently unclear exactly how cyclically expressed genes and proteins give rise to cyclic decay in
membrane potential in the heart’s muscle cells. So, just as in many other areas of medical biology, the challenge now is
twofold. First, to expand our knowledge of the genetics and cell-signalling processes that give rise to healthy heartbeats. And
second, to integrate this into a far richer picture of the workings, and failings, of the whole heart. Sara Abdulla

Does the past have a future?


Hardly a week goes by without the discovery of some amazing fossil that will change the way we look at the history of life. But
how long can the fossil bonanza last? After all, the Earth has only so many rocks to investigate. Will there come a time
when we can say that the fossil record is substantially complete, in that the discovery of radically new forms, or of
significant range extensions of known forms, becomes improbable? If so, when will that time arrive? And when it does, how will
this influence the practice of palaeontologists?
“I think we are more or less at that point now,” says Andrew Smith of the Natural History Museum in London. “The
available outcrops are pretty heavily searched within the developed nations, and major headway has been made in exploring
places like Mongolia and Madagascar.” His prognosis is, however, tempered by the fact that absolute completeness is an
impossibility, because the vast majority of organisms do not end up as fossils: “my belief is that we will have a
substantially biased fossil record that cannot be bettered no matter where we go in the world”.
Charles Marshall of Harvard University agrees that the Earth’s fossil riches have been fairly well worked out, but says
that one cannot discern a point when the last fossil will have been extracted. “One can’t simply extrapolate from what
has been discovered because radical new finds often result from unanticipated ways of looking, often in stratigraphic
intervals or rock types not typically examined. If the rate of recent discoveries, from fossil embryos to
down-covered dinosaurs, is any indication, there is much to be discovered yet!”
However, contrary to popular iconography, most palaeontologists do not devote most of their
time to making radical discoveries. Such discoveries, says Marshall, will be — as they are now —
“largely the unexpected by-products of mature research programmes designed to establish the
patterns of evolution and elucidate the processes responsible for those patterns.”
“The obvious priority,” says Smith, “will be to establish and understand how sequence stratigraphy and sea-level change
control facies preservation, and thus control apparent biotic diversity and extinction patterns through time. Only then will we begin to get a reliable handle on how
diversity has changed over time.” Henry Gee


Adapting to climate change


Legislating for the present on the basis of predictions for the future can never be other than a dicey business. The suggestion by a British member of parliament in the
nineteenth century that London would be waist-high in horse excrement by the 1950s could have been seen as a call for crippling taxes on Hansom cabs, pushing
cabbies out of trade while still leaving us choking on exhaust fumes. So should we now be legislating against those exhaust fumes, in fear that by the
end of the next century the world will be a flood- and storm-prone hothouse?
This was one of the key issues highlighted by the Kyoto convention on climate change in 1997. It is all very well to set targets
for emissions reductions on the basis of climate models projected over the next century (and then in all probability to fall short of
attaining them anyway), but what of the costs of fierce constraints to today’s populations, particularly those who live within
fragile economies?
Yet clearly we have to do something. There seems to be no serious doubt now that global warming, induced by
emission of greenhouse gases through human activity, is upon us. The global average temperature has risen by
around half a degree Celsius during the twentieth century, and is predicted to increase by a further 1.0–3.5 °C by
2100. Over the same periods, the seas have risen by 10–25 cm and we can expect almost a metre more at the
upper limit. Storms and drought may or may not get more common, but their distribution will probably alter.
Massive famine and flooding cannot be ruled out.
As it happens, the Kyoto targets may make very little impact on all of this, but the basic question remains:
how can we plan for social change on timescales over which forecasts of enabling technologies become unreliable?

One approach that has been advocated is to formulate policy based on assessments of human and social impacts rather
than in terms of national emissions-reduction targets — something that is harder to quantify but closer to people's concerns.
Emissions are not the whole story: what about economic measures to make the societies most vulnerable to climate
change less so? And how far do political and industrial structures need re-engineering before objectives can actually
be attained?
Climate modellers, meanwhile, are trying to incorporate some element of the social and economic contexts into
their forecasts. They grapple with the question of whether it is better, given the uncertainties, to brake hard on emissions now,
do nothing until later on the assumption that we will then be much more technologically capable of it (owing to enhanced energy
efficiency and so forth), or something in between. Inevitably, such modelling is painfully contentious and strongly influenced by
the assumptions that go into it. We are juggling with the unknown; and in the face of massive inertia and sharp political lobbying,
that is a precarious occupation. Philip Ball

The shape of the cosmos


Olaf Stapledon is the great unsung hero of twentieth-century science fiction. His stories are cast on scales of time and
space so vast as to inspire nothing short of vertigo or terror. In Star Maker, his hero, in a ‘hawk-flight of the imagination’,
journeys through the cosmos and eventually meets the Creator in his workshop. The Star Maker, it seems, is still perfecting
his art. Our own cosmos is only provisional. Previous essays in Universe construction litter the room like trash. These essays
come in all shapes and sizes, suggesting that we need not live in a great, expanding hypersphere, infinite in all directions.
Our conception of the shape and size of the Universe is based on timescales. The familiar ‘light-year’ is a unit
based on the time it takes for a beam of light to get from A to B. But there is a reckoning that is purely geometric, or,
rather, trigonometric. This is the ‘parsec’, short for ‘parallax arcsecond’, the distance at which the separation of the Earth
and Sun (1 Astronomical Unit) would subtend an angle of one second of arc. A parsec is equivalent to just over three
light-years.
But the dimensions of the cosmos are reckoned in megaparsecs — millions of parsecs. At the cosmic scale, the
uncertainty over the rate at which the Universe is expanding means that time, distance and geometry are
inextricably bound together: the Hubble flow is meted out in units of kilometres per second per megaparsec. On the very
largest scales, time either loses all meaning or becomes subservient to geometry. Our understanding of time may depend on
knowing the shape of the cosmos — the shape on the Star Maker’s workbench.
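Both reckonings are easy to check with a few lines of arithmetic — a Python sketch in which the parsec falls straight out of the trigonometry described above, and an assumed round value of the Hubble constant (~70 km/s per megaparsec, plausible given the uncertainty just noted) sets a characteristic timescale for the expansion:

```python
import math

# The parsec from its definition: 1 AU subtending one arcsecond.
AU_M = 1.496e11            # metres in one astronomical unit
LIGHT_YEAR_M = 9.461e15    # metres in one light-year
ARCSEC_RAD = math.radians(1.0 / 3600.0)

parsec_m = AU_M / math.tan(ARCSEC_RAD)
print(f"1 parsec = {parsec_m:.3e} m = {parsec_m / LIGHT_YEAR_M:.2f} light-years")

# An assumed H0 of 70 km/s per megaparsec; 1/H0 gives an expansion age.
H0_per_s = 70.0 / (parsec_m * 1e6 / 1000.0)   # convert to s^-1
hubble_time_yr = 1.0 / H0_per_s / 3.156e7
print(f"1/H0 ~ {hubble_time_yr / 1e9:.1f} billion years")
```

The first line confirms the 'just over three light-years' of the text; the second shows why pinning down the Hubble constant matters — it sets the very age of the expansion.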
In 2002, we may know. By then, we should have results from detailed maps of fluctuations in the cosmic microwave
background (CMB) from two spacecraft: NASA’s Microwave Anisotropy Probe and the European Space Agency’s Planck
Surveyor. The results will help constrain values for important cosmological parameters such as the density of matter in
the Universe, and the Hubble constant that provides a scale to its expansion. Accurate knowledge of these parameters should,
in turn, constrain ideas about the geometry of space — whether it is 'flat' (that is, Euclidean), or has positive (spherical)
or negative (hyperbolic) curvature.
The results will also be able to tell us whether space is infinite in all directions, or is geometrically finite.
If it is finite, then some extremely weird things could fall out of the analysis. In a finite Universe, surveys of deep space
could be pulling out multiple images of the same finite set of galaxies — like an observer in a hall of mirrors. One
of those faraway galaxies could be our own, its light having circumnavigated the cosmos to reach us. Study
of the CMB will be able to tell us whether space really is a hall of mirrors, and, if it is, something of the
hall’s architecture. The Universe could be an analogue of a simple sphere. On the other hand, it could
have some more exotic shape, akin to a torus, or another of those shapes littering the Star Maker’s
garage floor. Henry Gee


Keeping time

How finely can time be measured? The answer might define our capacity to conduct precise tests of general relativity, as demonstrated already in the observed
increase in clock rate with distance from the Earth’s surface, or the measurement of relativistic effects in binary pulsar systems. Spacecraft navigation,
interferometric radio astronomy and Global Positioning System geodesy are all dependent on timekeeping that taxes current technology. And accurate
measurements of the second define other standards, such as the units of distance and voltage.
Measuring time is about monitoring oscillations, whether of the pendulum, the quartz crystal or the caesium atom. By relying on
atoms for our frequency (and hence time) standards, we may at least avoid placing ourselves at the mercy of the machinist who
made the device; but on the other hand, we fall foul of the Uncertainty Principle. Atomic transitions — the hyperfine
splitting of energy levels in caesium-133 furnishes today's standard second — happen at a frequency that can
ultimately be well defined only to the degree that the excited state is long-lived. The shorter the lifetime — or the duration
of the measurement — the greater the uncertainty in energy (and frequency) of the transition.
So to get a more accurate result, one needs a more leisurely attitude to measurement. This is the
principle behind the ‘atom fountain’, a device that first cools atoms within an optical ‘molasses’ of three crossed laser
beams before gently launching them vertically so that they fall back under gravity. The atoms can then be probed
outside the perturbing influence of the laser field.
Towards the top of the fountain, the atoms pass twice through a microwave cavity: once on the way up, once on
the way down, separated by about a second. Invented by Norman Ramsey in 1949, this ‘separated oscillatory-field
method’ is applied to a linear atomic beam in current atomic clocks. But the atom fountain permits a much longer span
between pulses, and so a greater narrowing of the transition. In principle these devices offer a frequency determination
accurate to one part in 10¹⁶, which is equivalent to a variation of roughly one second in 300 million years. Philip Ball
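Two sanity checks on those figures, in Python — the drift implied by a fractional accuracy of one part in 10¹⁶, and the usual Ramsey fringe-width estimate (Δν ≈ 1/2T) for an assumed one-second fountain flight:

```python
# Sanity checks: the drift implied by a fractional accuracy of 1e-16,
# and a rough Ramsey fringe width for a one-second fountain flight
# (the flight time is an assumed round number).
CAESIUM_HZ = 9_192_631_770   # the hyperfine transition defining the second
SECONDS_PER_YEAR = 3.156e7

fractional_accuracy = 1e-16
years_per_second_lost = 1.0 / (fractional_accuracy * SECONDS_PER_YEAR)
print(f"one part in 1e16 ~ one second gained or lost in "
      f"{years_per_second_lost:.1e} years")

t_ramsey = 1.0                        # seconds between the two cavity passes
linewidth_hz = 1.0 / (2.0 * t_ramsey)
print(f"fringe width ~ {linewidth_hz} Hz, i.e. "
      f"{linewidth_hz / CAESIUM_HZ:.1e} of the caesium frequency "
      "before averaging")
```

The first print recovers the figure above — about 3 × 10⁸ years per second of error; the second shows the half-hertz fringes, some 5 parts in 10¹¹ of the caesium frequency, that repeated averaging then sharpens further.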

Circadian clocks
How do organisms respond to their most fundamental timescale — the Earth’s rotation around its own axis? With a
suite of interconnected hormonal and physiological loops that make up the ‘circadian timing system’ or CTS.
Every CTS — whatever the organism — is made up of light receptors, an endogenous ‘biological’ clock and a so-called
‘entrainment’ system to couple one with the other. For although every cell of every biological or circadian clock behaves cyclically,
these intrinsic periods are not necessarily exactly 24 hours long. The clock has to be set.
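How such setting works can be sketched with a toy phase-oscillator model, with invented parameters: a clock whose free-running period is 24.5 hours is nudged each instant toward the phase of a 24-hour light–dark cycle.

```python
import math

# A toy sketch of entrainment, with invented parameters: a phase
# oscillator with an intrinsic period of 24.5 h is pulled toward the
# phase of a 24 h light-dark cycle and settles onto the driver's period.
INTRINSIC_PERIOD = 24.5        # hours, the clock left to its own devices
DRIVER_PERIOD = 24.0           # hours, the external light-dark cycle
COUPLING = 0.2                 # strength of the light-driven nudge (1/h)
DT, T_END = 0.01, 24.0 * 40.0  # step and duration, hours

phase, t = 0.0, 0.0
while t < T_END:
    driver_phase = 2.0 * math.pi * t / DRIVER_PERIOD
    # intrinsic drift plus a corrective pull toward the driver's phase
    phase += DT * (2.0 * math.pi / INTRINSIC_PERIOD
                   + COUPLING * math.sin(driver_phase - phase))
    t += DT

observed_period = 2.0 * math.pi * t / phase
print(f"entrained period ~ {observed_period:.2f} h "
      f"(free-running period was {INTRINSIC_PERIOD} h)")
```

After a brief transient the oscillator's observed period locks to 24.00 hours: a sloppy intrinsic clock, gently corrected by light, keeps perfect time with the day.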
Much of the pathology and endocrinology of the circadian clock has been elucidated in the past century. In humans, for
instance, many sections of the route from retina to brain to heart-muscle action, metabolism, body temperature and the
organization of sleep–wake cycles have been charted. But there are still major areas of uncertainty, especially in the newest
branch of circadian biology: the identification and study of genes necessary for normal timekeeping.
Clock genes seem to be present in all cells, even though only a subset of cells can act as self-sustained oscillators.
So do these genes have other, global, non-circadian functions in cell biology? As Russell Foster of Imperial College, London, puts
it: “cellular assays have shown the way forward, now we need good functional analysis using transgenic approaches and
behavioural studies”.
Moreover, only a few animal systems have been studied so far in any detail — the mouse, the fly and the fungus,
Neurospora. These studies have revealed some remarkable similarities but also some exciting differences. Comparisons of additional
species will surely tell us more about the evolution and adaptive significance of circadian systems and their interactions with
their environments. All of which will hopefully feed back into a fine-tuned understanding of, and ability to treat or even to
manipulate, the one timepiece that we all care about — the human body. Sara Abdulla

The challenge of conservation


Some disciplines move faster than others, but if there is one moving much faster than its practitioners would like,
it is conservation biology. In the next century, conservationists will be less interested in pristine wilderness — for such will
hardly exist — than in the success with which human beings manage the biosphere to the benefit of all its inhabitants, including
humans. The world is currently in an unprecedented situation in which Homo sapiens — just one species out of millions —
sequesters 45 per cent of the planet’s net terrestrial productivity and 55 per cent of its fresh water. Effective biodiversity systems
management will become an imperative if only to maintain the quality of human life, let alone that of other species.
In the coming decades, conservation biology will become a close analysis of the art of compromise, as humanity seeks
to understand the global ecosystem as a participant, not as an observer. “Within a century,” says Peter Raven, director of the
Missouri Botanic Garden, “the world will be a patchwork with varying degrees of biological richness, and the way people
respond locally to challenges will determine the nature of that compromise.”
What will we have lost by 2100? Raven estimates that two-thirds of all species will disappear if current trends
continue, but is nevertheless optimistic. “We can do better,” he says. “No major category of
organisms, or local community of organisms, need disappear completely.”
Ecologists are acutely aware that the maintenance of ecosystem function transcends the
conservation of any particular species, so high-technology initiatives such as ‘ex situ’ conservation
— maintaining banks of germplasm outside the context of a habitat — may become a side issue.
“High technology is unlikely to solve most of the major global environmental issues that we face,”
says David Tilman of the University of Minnesota. “Rather, lifestyle changes, most likely imposed by legislation, will be needed. Such legislation will be socially
sustainable only once citizens understand their long-term dependence on biodiversity and ecosystem services.” And Raven adds that it is “better for now to follow
Aldo Leopold’s dictum: ‘the first rule of intelligent tinkering is to save all the cogs and wheels’”. Henry Gee

NATURE | VOL 402 | SUPP | 2 DECEMBER 1999 | www.nature.com © 1999 Macmillan Magazines Ltd C17
timescales

Physics at the Planck time


Superstring theory, perhaps the most notorious attempt to unify general relativity with quantum mechanics, was once described as a piece of
twenty-first-century physics that fell into the twentieth century. Certainly, physicists struggling to marry these reluctant partners have long sensed that the
crucial piece of the puzzle has yet to be invented. But a theory of quantum gravity remains more a question of tidiness than anything else — quantum effects on
space-time become important only on time and length scales well out of experimental reach. A quantum field theory of electromagnetism has plenty of
practical relevance for applied physics, and such a field theory for the nuclear forces generates no shortage of testable predictions. But
gravity yields to the quantum rod only at the so-called Planck scale: over distances of about 10⁻³³ cm and times of 10⁻⁴³ s.
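Those scales follow directly from the fundamental constants — a quick Python check using standard values of ħ, G and c:

```python
import math

# Recovering the Planck scales quoted above from the constants that
# define them: l_P = sqrt(hbar*G/c^3) and t_P = sqrt(hbar*G/c^5).
HBAR = 1.055e-34   # J s
G = 6.674e-11      # m^3 kg^-1 s^-2
C = 2.998e8        # m/s

planck_length_m = math.sqrt(HBAR * G / C**3)
planck_time_s = math.sqrt(HBAR * G / C**5)
print(f"Planck length ~ {planck_length_m * 100:.1e} cm")  # ~1.6e-33 cm
print(f"Planck time   ~ {planck_time_s:.1e} s")           # ~5.4e-44 s
```

The results, about 1.6 × 10⁻³³ cm and 5.4 × 10⁻⁴⁴ s, match the orders of magnitude quoted above.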
A physics to match the Planck timescale is the biggest challenge to physicists in the coming century. This is the
timescale relevant to the graviton, the putative carrier of the gravitational force. And naturally enough, only such a
theory can extend our understanding of the Big Bang inside the first 10⁻⁴³ seconds of existence.
What will a theory of quantum gravity do to space-time? Will it be like John Wheeler’s ‘quantum foam’, furiously
alive with ephemeral black holes and worm holes? Or will it be more like Abhay Ashtekar’s fabric of woven loops? Will
particles become strings, and will the strings be supersymmetric? How many extra dimensions will turn up, curled out
of sight over the Planck distance?
And will we ever be able to test such a theory? One quantum-gravity researcher has confessed that “it seems
highly unlikely that a machine will ever be built with which these minute distances can be studied directly”. For
guidance, it seems that physicists may have to fall back on the old stand-bys: elegance and beauty. Philip Ball
