Statistical Seismology: © Birkhäuser Verlag, Basel, 2005
Statistical Seismology
DAVID VERE-JONES,1 YEHUDA BEN-ZION,2 and RAMÓN ZÚÑIGA3
1 Victoria University of Wellington, P.O. Box 600, Wellington, New Zealand
2 Department of Earth Sciences, University of Southern California, Los Angeles, CA 90089-0740, USA
3 Centro de Geociencias, Campus Juriquilla, UNAM, Apdo. Postal 1-742, Centro, Querétaro, Qro., 76001, México

Introduction
1024 D. Vere-Jones et al. Pure appl. geophys.,
and geodetic data. The many variations on block and slider models provide a similar
function. While such models confirm important qualitative aspects of seismological
data, they are also difficult to fit to specific data sets or to use as the basis for
quantitative predictions.
From the statistical side, an important impulse has been the development of the
theory of stochastic point processes, that is, processes whose realizations can be
represented as a family of delta-functions in space or time or both. Practical models
for space-time point processes date largely from the 1980s. Such models allow at least
some elements of the physical processes to be incorporated into the conditional
intensity function which controls the evolution of the point process. Ogata’s ETAS
model is an important example. In effect, it represents the seismicity as a
nonstationary Poisson process in which the instantaneous rate, or intensity, is
modulated by the past of the process itself, and perhaps of other processes evolving
with it. Such a representation opens up a route both to incorporating at least
rudimentary physical ideas, and to fitting, testing and simulating the models.
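The conditional-intensity idea behind ETAS can be made concrete in a few lines of code. The sketch below is our own illustration, not taken from any paper in this volume: the function names and all parameter values are hypothetical choices, and the model is the simplest temporal ETAS form, a background rate plus Omori-law aftershock contributions weighted by an exponential productivity term in magnitude. The simulator uses Ogata's thinning method, exploiting the fact that between events the ETAS intensity is non-increasing, so the intensity at the current time bounds it from above.

```python
import numpy as np

def etas_intensity(t, event_times, event_mags,
                   mu=0.1, K=0.05, alpha=1.0, c=0.01, p=1.1, m0=3.0):
    """Conditional intensity lambda(t | history) of a temporal ETAS model.

    lambda(t) = mu + sum over events i with t_i < t of
                K * exp(alpha * (m_i - m0)) / (t - t_i + c)**p
    Parameter values are illustrative only, not fitted to any catalogue.
    """
    past = event_times < t
    dt = t - event_times[past]
    triggered = K * np.exp(alpha * (event_mags[past] - m0)) / (dt + c) ** p
    return mu + triggered.sum()

def simulate_etas(t_end, mu=0.1, K=0.05, alpha=1.0, c=0.01, p=1.1,
                  m0=3.0, b=1.0, seed=0):
    """Simulate an ETAS catalogue on [0, t_end] by Ogata's thinning method.

    Between events the intensity only decays, so its value at the current
    time is a valid upper bound for rejection sampling of the next event.
    """
    rng = np.random.default_rng(seed)
    times, mags = [], []
    t = 0.0
    while True:
        # Upper bound: intensity just after the current time.
        lam_bound = etas_intensity(t + 1e-12, np.array(times), np.array(mags),
                                   mu, K, alpha, c, p, m0)
        t = t + rng.exponential(1.0 / lam_bound)
        if t >= t_end:
            break
        lam = etas_intensity(t, np.array(times), np.array(mags),
                             mu, K, alpha, c, p, m0)
        if rng.uniform() < lam / lam_bound:  # accept candidate event
            times.append(t)
            # Gutenberg-Richter magnitudes: exponential above the cutoff m0.
            mags.append(m0 + rng.exponential(1.0 / (b * np.log(10.0))))
    return np.array(times), np.array(mags)
```

In this form the past of the process enters only through the sum over earlier events, which is exactly the sense in which the instantaneous rate is "modulated by the past of the process itself"; fitting would replace the illustrative parameters by maximum-likelihood estimates from a catalogue.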
A powerful stimulus to both physical and statistical sides has been the creation of
enormously improved and extended data sources. These include especially the much
enhanced catalogues of earthquakes and microearthquakes, the data on broadband
seismology now coming into existence, and the more detailed data on surface
deformation becoming available from GPS measurements and other techniques.
High quality and extensive observations demand matching techniques of modeling
and data analysis, to assist in their interpretation and effective use. It would be overly
optimistic, however, to expect that off-the-shelf statistical tools could be directly
applied to such new types of data. Work on these aspects is still at a beginning stage.
Many of these issues come to a head in developing models for earthquake
prediction. Here neither physical nor statistical inputs can be ignored. It is becoming
ever clearer that the end-point of such studies is more likely to be some kind of time-
evolving map of spatially varying earthquake hazard than the explicit prediction of
individual events. The development of effective procedures of this kind may well
require a fusion of statistical and physical ideas in ways which differ radically from
standard practice in either field.
The papers in this volume have been grouped into two main categories: papers
that address modeling and methodological issues, and papers that are primarily
observational studies, but make use of statistical models to help interpret the data or
use them for prediction and other purposes. Efforts to model the observed complex earthquake phenomenology must account for considerable uncertainty and "ignorance." This is typically done using either stochastic dynamics with a few variables or
deterministic dynamics in a highly heterogeneous system with many degrees of freedom. The theoretical papers in this volume provide examples of work from both
categories of models. The paper by Zöller et al. uses a dislocation model to study the
effects of fault heterogeneities on various observables, and to examine the relation
between the degree of heterogeneity and criticality. Damage rheology models can
Vol. 162, 2005 Statistical Seismology 1025
the method to the Japan area. Zhuang et al. analyze data provided by the China
Seismological Bureau on low-frequency electrical signals as potential earthquake
precursors. Such signals have formed part of the day-to-day prediction program in
China for several decades. The accumulated data used in this study are available on
the Web, and readers may decide for themselves whether the data and their analyses
warrant a re-examination of this phenomenon.
In most of the analyses presented, the problems demand a two-way interaction of
seismological and statistical expertise; there are many pitfalls, many unresolved
issues, and the entire field is in a state of rapid development. One of the historical
roles for geophysics has been in pioneering new techniques for statistical analysis and
data processing. All the indications are that this role is in a process of renewal with
statistical seismology. This makes the field at present unusually exciting and
challenging for both geophysicists and statisticians.
Acknowledgements
We would like to thank the participants of the Workshop and the authors of the
papers in this volume for their contributions and cooperation. We would also like to
acknowledge our indebtedness to the National Autonomous University of Mexico
(UNAM) for its support of the Workshop, and to the various granting agencies
whose support facilitated attendance at the Workshop. Finally, we are grateful to the
referees for providing careful reviews of the papers. These include Mark Bebbington,
Nick Beeler, Yuh-Ing Chen, Rudolfo Console, Karen Felzer, Matt Gerstenberger,
Susanna Gross, David Harte, Agnes Helmstetter, Masajiro Imoto, Steve Johnston,
Vladimir Kossobokov, Cinna Lomnitz, Andy Michael, Francesco Mulargia, Yosi
Ogata, David Rhoades, Russell Robinson, Renata Rotondi, Martha Savage, Rick
Schoenberg, Warwick Smith, Ilya Zaliapin, Jiancang Zhuang.