© Birkhäuser Verlag, Basel, 2005

Pure appl. geophys. 162 (2005) 1023–1026


0033-4553/05/071023-4 Pure and Applied Geophysics
DOI 10.1007/s00024-004-2659-2

Statistical Seismology
DAVID VERE-JONES,1 YEHUDA BEN-ZION,2 and RAMÓN ZÚÑIGA3

1 Victoria University of Wellington, P.O. Box 600, Wellington, New Zealand.
2 Department of Earth Sciences, University of Southern California, Los Angeles, CA 90089-0740, USA.
3 Centro de Geociencias, Campus Juriquilla, UNAM, Apdo. Postal 1-742, Centro, Querétaro, Qro., 76001, México.

Introduction

Stochastic models with an increasing component of physical reasoning have been slowly gaining acceptance over the last two decades. The subject of statistical
seismology aims to bridge the gap between physics-based models without statistics,
and statistics-based models without physics. This volume, which is based largely on
papers presented at the 3rd International Workshop on Statistical Seismology, held
in Juriquilla, Mexico in May 2003, may serve to illustrate the range of issues now
coming under the statistical seismology heading. While the papers presented may not
solve the problem of bridging the gap, they indicate routes by which it is being
approached.
Several driving forces can be recognized behind the development of statistical
seismology. One is idealistic in character, and based on attempts to emulate for
seismology the success of classical models in continuum mechanics, statistical
mechanics and turbulence theory. Although progress has been made, seismology has
proved a difficult subject in which to make a breakthrough. Appealing analogies such
as the approach to a phase change can be made, and general models such as self-organized criticality can provide valuable insights into the processes at work. However, converting such broad insights into tractable explicit models that can be fitted to a specific data set and used to provide quantitative predictions is an altogether more difficult task, and one that is only partially resolved.
On the physical side, there is a long history of attempts to model the processes of
rupture and fault propagation. Earthquake models that represent faults as
dislocation surfaces in a surrounding elastic solid capture important aspects of the
process, as is evident from the success of such models in fitting a variety of seismological and geodetic data. The many variations on block and slider models provide a similar
function. While such models confirm important qualitative aspects of seismological
data, they also are difficult to fit to specific data sets, or to use as the basis for
quantitative predictions.
From the statistical side, an important impulse has been the development of the
theory of stochastic point processes, that is, processes whose realizations can be
represented as a family of delta-functions in space or time or both. Practical models
for space-time point processes date largely from the 1980s. Such models allow at least
some elements of the physical processes to be incorporated into the conditional
intensity function which controls the evolution of the point process. Ogata’s ETAS
model is an important example. In effect, it represents the seismicity as a
nonstationary Poisson process in which the instantaneous rate, or intensity, is
modulated by the past of the process itself, and perhaps of other processes evolving
with it. Such a representation opens up a route both to incorporating at least
rudimentary physical ideas, and to fitting, testing and simulating the models.
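For orientation, the conditional intensity of the ETAS model in its common temporal form can be sketched as follows (a generic parameterization; the papers in this volume use their own specific variants):

\[
\lambda(t \mid \mathcal{H}_t) \;=\; \mu \;+\; \sum_{i:\; t_i < t} K \, e^{\alpha (M_i - M_0)} \, (t - t_i + c)^{-p},
\]

where \(\mu\) is a background rate, the sum runs over past events with occurrence times \(t_i\) and magnitudes \(M_i \geq M_0\), and \(K\), \(\alpha\), \(c\) and \(p\) control aftershock productivity and the modified Omori decay. Each past event raises the instantaneous rate, which is precisely the built-in clustering that distinguishes the model from a plain Poisson process.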
A powerful stimulus to both physical and statistical sides has been the creation of
enormously improved and extended data sources. These include especially the much
enhanced catalogues of earthquakes and microearthquakes, the data on broadband
seismology now coming into existence, and the more detailed data on surface
deformation becoming available from GPS measurements and other techniques.
High quality and extensive observations demand matching techniques of modeling
and data analysis, to assist in their interpretation and effective use. It would be overly
optimistic, however, to expect that off-the-shelf statistical tools could be directly
applied to such new types of data. Work on these aspects is still at an early stage.
Many of these issues come to a head in developing models for earthquake
prediction. Here neither physical nor statistical inputs can be ignored. It is becoming
ever clearer that the end-point of such studies is more likely to be some kind of time-
evolving map of spatially varying earthquake hazard than the explicit prediction of
individual events. The development of effective procedures of this kind may well
require a fusion of statistical and physical ideas in ways which differ radically from
standard practice in either field.
The papers in this volume have been grouped into two main categories: papers
that address modeling and methodological issues, and papers that are primarily
observational studies, but make use of statistical models to help interpret the data or
use it for prediction and other purposes. Efforts to model the observed complex
earthquake phenomenology must account for considerable uncertainty and “ignorance.” This is typically done using either stochastic dynamics with a few variables or deterministic dynamics in a highly heterogeneous system with many degrees of freedom. The theoretical papers in this volume provide examples of works from both
categories of models. The paper by Zöller et al. uses a dislocation model to study the
effects of fault heterogeneities on various observables, and to examine the relation
between the degree of heterogeneity and criticality. Damage rheology models can
account for evolving distributed cracking during irreversible brittle deformation.
Shcherbakov et al. combine the Gutenberg-Richter, modified Omori and Båth laws
into a single equation for aftershocks, and derive theoretical relations based on
damage rheology that are compatible with observed features of aftershocks. Libicki
and Ben-Zion use geometric stochastic branching to simulate volumetric fault
structures, and estimate the fractal dimensions of various subsets of the simulated
structures.
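For reference, the three empirical laws entering the combination of Shcherbakov et al. are commonly written in the following standard forms (recalled here for orientation; their specific combined relation is derived in their paper):

\[
\log_{10} N(\geq m) = a - b\,m \quad \text{(Gutenberg-Richter)}, \qquad
n(t) = \frac{K}{(t + c)^{p}} \quad \text{(modified Omori)},
\]
\[
\Delta m = m_{\text{mainshock}} - m_{\text{largest aftershock}} \approx 1.2 \quad \text{(Båth)},
\]

where \(N(\geq m)\) is the number of events of magnitude at least \(m\) and \(n(t)\) is the aftershock rate at time \(t\) after the mainshock.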
Also within this first group are papers of a more theoretical statistical character.
Several papers make important use of the ETAS model, which is coming to play an
increasingly important role as a baseline model because of its built-in clustering, a feature which makes it considerably more realistic than the Poisson model. It is
important to understand when the observed patterns of earthquake occurrence can
be accounted for by such a standard model, and when they depart significantly from
it. Saichev et al. derive the asymptotic distributions of the total number of
aftershocks and the number of generations of aftershocks for seismicity governed by
the ETAS model. Molchan examines the distribution of interevent intervals under a
number of basic cluster models, in an attempt to place in perspective recent claims of
a universal form for this distribution. Marsan and Nalbant discuss various
techniques for quantifying changes in seismicity rates, including use of the ETAS
model, with an emphasis on rate reduction in areas of “stress shadow,” and analyze seismicity rate changes in the eastern California shear zone. Heavy-tailed distributions, which are common in geophysics among other fields, are studied by Zaliapin et
al. who provide approximations for the sums of independent random variables with a
common Pareto distribution and discuss their accuracy and applicability. Harte and
Vere-Jones review aspects of scoring probability forecasts with emphasis on the
entropy or likelihood score. Chen and Huang raise the problem of providing initial
estimates of deaths or economic disruption from earthquakes occurring in regions
where only rudimentary information on buildings and economic activity is available.
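As a toy illustration of likelihood-based scoring of the kind Harte and Vere-Jones discuss (a minimal sketch with synthetic data and a hypothetical function name, not their specific definitions), one can compare the binned log-likelihood of a forecast against that of a reference model:

```python
import numpy as np

def log_likelihood_score(p_forecast, occurred):
    """Binned log-likelihood score for probabilistic occurrence forecasts.

    p_forecast : forecast probability that at least one target event occurs
                 in each space-time bin.
    occurred   : boolean array, True where an event actually occurred.
    """
    p = np.clip(np.asarray(p_forecast, dtype=float), 1e-12, 1.0 - 1e-12)
    x = np.asarray(occurred, dtype=bool)
    return float(np.sum(np.where(x, np.log(p), np.log(1.0 - p))))

# Hypothetical example: a clustering-style forecast versus a uniform reference
# forecast over the same bins, scored on synthetic outcomes.
rng = np.random.default_rng(0)
n_bins = 1000
reference = np.full(n_bins, 0.01)                      # uniform reference probability
forecast = np.clip(reference * rng.lognormal(0.0, 1.0, n_bins), 0.0, 0.5)
events = rng.random(n_bins) < forecast                 # synthetic "observed" outcomes

gain = log_likelihood_score(forecast, events) - log_likelihood_score(reference, events)
print(f"log-likelihood gain over reference: {gain:.2f} "
      f"({gain / max(events.sum(), 1):.3f} per event)")
```

A positive gain per event indicates that the forecast assigns higher probability to what actually happened than the reference does, which is the basic intuition behind entropy or probability-gain scores.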
Papers in the second category cover a wide range of phenomena, with the
underlying aim of finding quantitative links between the observed features and
earthquake occurrence. The recent work of Rhoades and Evison on developing a
forecasting model which builds in the idea of self-similarity is represented in this
volume by their analysis of Japanese earthquake data; their model is one of the first
to incorporate self-similarity within an explicit stochastic model that can be fitted to
data. Iwata and Young return to the problem of seismicity induced by tidal stresses
in an analysis of b-values for acoustic emissions monitored at the Underground
Research Laboratory in Canada. Imoto looks at a potential foreshock model for
large earthquakes off Tohoku in northern Japan, while Matsu’ura and Karakama
present a detailed analysis of the Matsushiro earthquake swarm, linking changes in
the parameters of an Omori-type model to changes in the water-flow regime beneath
the swarm area. Nava et al. develop a model based on the Markov process and
observed earthquake patterns to evaluate the seismic hazard in a region, and apply
the method to the Japan area. Zhuang et al. analyze data provided by the China
Seismological Bureau on low-frequency electrical signals as potential earthquake
precursors. Such signals have formed part of the day-to-day prediction program in
China for several decades. The accumulated data used in this study are available from
the Web, and readers may decide for themselves whether the data and their analyses
warrant a re-examination of this phenomenon.
In most of the analyses presented, the problems demand a two-way interaction of
seismological and statistical expertise; there are many pitfalls, many unresolved
issues, and the entire field is in a state of rapid development. One of the historical roles of geophysics has been to pioneer new techniques for statistical analysis and data processing. All the indications are that this role is being renewed within
statistical seismology. This makes the field at present unusually exciting and
challenging for both geophysicists and statisticians.

Acknowledgements

We would like to thank the participants of the Workshop and the authors of the
papers in this volume for their contributions and cooperation. We would also like to
acknowledge our indebtedness to the National Autonomous University of Mexico
(UNAM) for its support of the Workshop, and to the various granting agencies
whose support facilitated attendance at the Workshop. Finally, we are grateful to the
referees for providing careful reviews of the papers. These include Mark Bebbington,
Nick Beeler, Yuh-Ing Chen, Rodolfo Console, Karen Felzer, Matt Gerstenberger,
Susanna Gross, David Harte, Agnes Helmstetter, Masajiro Imoto, Steve Johnston,
Vladimir Kossobokov, Cinna Lomnitz, Andy Michael, Francesco Mulargia, Yosi
Ogata, David Rhoades, Russell Robinson, Renata Rotondi, Martha Savage, Rick
Schoenberg, Warwick Smith, Ilya Zaliapin, Jiancang Zhuang.

To access this journal online:


http://www.birkhauser.ch
