Predictive Understanding of Disasters: Universality of Precursory Phenomena
We consider the predictability of extreme events, also known as disasters, catastrophes, and critical phenomena. Such events are persistently generated by natural and societal complex systems. Complex systems are not predictable with absolute precision; however, their predictability can be enhanced by coarse graining. Notwithstanding its limited accuracy, such prediction is important for the optimal choice of disaster-preparedness measures by decision makers. We consider prediction of extreme events based on premonitory patterns in the background activity of the system. The methodology for developing prediction algorithms integrates modeling and exploratory data analysis using pattern recognition of rare events. Prediction is formulated as a sequence of alarms, each indicating the time interval, area, and magnitude range of an expected extreme event. The final validation of a prediction algorithm is the prediction of future events. Complex systems exhibit many universal features common to systems of different origins, among them scale invariance, clustering, and long-range correlation. Here we describe premonitory patterns common to many models of complex systems and to a wide variety of observed real extreme events, among them strong earthquakes, economic recessions, surges of unemployment, and homicide surges.
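As a minimal illustration of the alarm-based formulation, the Python sketch below scores a sequence of alarms on the standard error diagram (fraction of missed events versus fraction of time covered by alarms); the function name, alarm intervals, and event times are hypothetical and not taken from the abstract.

```python
import numpy as np

def error_diagram_point(alarm_intervals, event_times, t_total):
    """Score one alarm-based prediction: fraction of missed events (nu)
    vs. fraction of total time occupied by alarms (tau)."""
    alarm_time = sum(b - a for a, b in alarm_intervals)
    tau = alarm_time / t_total
    hits = sum(any(a <= t < b for a, b in alarm_intervals) for t in event_times)
    nu = 1.0 - hits / len(event_times)
    return tau, nu

# Toy example: alarms covering 20% of a 100-unit window, three events.
alarms = [(10, 20), (55, 65)]
events = [12, 40, 60]
tau, nu = error_diagram_point(alarms, events, t_total=100.0)
print(f"alarm fraction tau = {tau:.2f}, miss rate nu = {nu:.2f}")
# A random-guess strategy satisfies nu = 1 - tau; useful prediction
# strategies fall below that diagonal on the (tau, nu) error diagram.
```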
Extreme Seismic Events: From Modeling and Prediction to Preventive Disaster Management
Extreme seismic events (e.g., the 1755 Lisbon, 1906 San Francisco, 2004 Aceh-Sumatra, and 2008 Wenchuan earthquakes) are a manifestation of the complex behavior of the lithosphere, which is structured as a hierarchical system of blocks of different sizes. Driven by mantle convection, these lithospheric blocks are involved in relative movement, resulting in stress localization and earthquakes. Numerical models of block-and-fault dynamics featuring large seismic events and their clustering are discussed, along with applications to the Tibet-Himalayan and Sunda Arc regions. Although the lithosphere behaves as a large non-linear system, some integral empirical regularities emerge, indicating possibilities for earthquake prediction. Large earthquakes are surprising, and society, as a matter of fact, is poorly prepared to deal with them. Protecting human life and property against earthquake disasters requires an uninterrupted chain of research and civil-protection tasks: from understanding the physics of earthquakes, their analysis and monitoring, through interpretation, modeling, seismic hazard assessment, and earthquake prediction, to the delivery of scientific forecasts to local authorities, public awareness, preparedness, and preventive disaster management.
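The stick-slip mechanism that block-and-fault models elaborate can be sketched, under strong simplifying assumptions, as a one-dimensional chain of threshold blocks under uniform loading; all parameters below are illustrative, and this toy is not the Tibet-Himalayan or Sunda Arc model itself.

```python
import numpy as np

rng = np.random.default_rng(0)
n, threshold, alpha = 100, 1.0, 0.2    # blocks, failure stress, coupling
stress = rng.uniform(0, threshold, n)  # heterogeneous initial stress
sizes = []

for step in range(5000):
    # Uniform tectonic loading up to the weakest block's failure point.
    stress += threshold - stress.max()
    failing = np.flatnonzero(stress >= threshold)
    size = 0
    while failing.size:                 # cascade of block failures
        size += failing.size
        for i in failing:
            drop = stress[i]
            stress[i] = 0.0
            for j in (i - 1, i + 1):    # pass a fraction to neighbours
                if 0 <= j < n:
                    stress[j] += alpha * drop
        failing = np.flatnonzero(stress >= threshold)
    sizes.append(size)

print("largest event:", max(sizes), "block failures out of", n, "blocks")
```

Even this toy reproduces the qualitative picture in the abstract: long quiescent loading punctuated by events whose sizes span many scales.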
Earthquake Forecasting, Validation and Verification
Techniques for earthquake forecasting are under development using both seismicity data-mining methods and numerical simulations. The former rely on methods to recognize patterns in data, while the latter rely on dynamical models that attempt to faithfully replicate the actual fault systems. Testing such forecasts is necessary not only to determine forecast quality but also to improve forecasts. A large number of techniques to validate and verify forecasts have been developed for weather and financial applications, many of which are documented in publicly available resources, including, for example, the URL listed below. Typically, the goal is to test for forecast resolution, reliability, and sharpness; a good forecast is characterized by consistency, quality, and value. Most, if not all, of these forecast verification procedures can be readily applied to earthquake forecasts as well. In this talk, we discuss both methods of forecasting, as well as validation and verification using a number of these standard methods. We show how these test methods might be useful both for fault-based forecasting, a group of forecast methods that includes the WGCEP and simulator-based renewal models, and for grid-based forecasting, which includes the Relative Intensity, Pattern Informatics, and smoothed-seismicity methods. We find that applying these standard methods of forecast verification is straightforward. Judgments about the quality of a given forecast method can often depend on the test applied, as well as on the preconceptions and biases of the persons conducting the tests.
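As an illustration of such standard verification tools, the sketch below computes the Brier score with its Murphy decomposition into reliability, resolution, and uncertainty for a gridded probabilistic forecast of binary outcomes; the function and the synthetic data are assumptions, not the talk's actual tests.

```python
import numpy as np

def brier_decomposition(p, o, n_bins=10):
    """Murphy decomposition of the Brier score into reliability,
    resolution, and uncertainty, for forecast probabilities p and
    binary outcomes o, using equal-width probability bins."""
    p, o = np.asarray(p, float), np.asarray(o, float)
    base = o.mean()
    bins = np.clip((p * n_bins).astype(int), 0, n_bins - 1)
    rel = res = 0.0
    for k in range(n_bins):
        mask = bins == k
        if mask.any():
            w = mask.mean()
            rel += w * (p[mask].mean() - o[mask].mean()) ** 2
            res += w * (o[mask].mean() - base) ** 2
    unc = base * (1 - base)
    return rel - res + unc, rel, res, unc   # Brier = REL - RES + UNC

# Synthetic gridded forecast: event probability per cell, plus outcomes
# drawn consistently with those probabilities (well calibrated by design).
rng = np.random.default_rng(1)
p = rng.uniform(0, 0.3, 5000)
o = (rng.uniform(size=5000) < p).astype(float)
bs, rel, res, unc = brier_decomposition(p, o)
print(f"Brier={bs:.4f} reliability={rel:.4f} resolution={res:.4f} uncertainty={unc:.4f}")
```

A well-calibrated forecast shows reliability near zero; skill relative to the base rate comes from the resolution term.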
Pattern Informatics Approach to Earthquake Forecasting in 3D
Natural seismicity is correlated across multiple spatial and temporal scales, but correlations in seismicity prior to a large earthquake are locally subtle (e.g., seismic quiescence) and often prominent at broad scales (e.g., seismic activation), resulting in local and regional seismicity patterns such as a Mogi donut. Recognizing that patterns in the seismicity rate reflect the regional dynamics of the directly unobservable crustal stresses, the Pattern Informatics (PI) approach was introduced by Tiampo et al. (2002) [Europhys. Lett. 60(3), 481-487] and Rundle et al. (2002) [PNAS 99, suppl. 1, 2514-2521]. In this study, we expand the PI approach to forecasting earthquakes into the third, or vertical, dimension and illustrate the resulting improvement in forecasting performance through case studies of both natural and synthetic data. The PI method characterizes rapidly evolving spatio-temporal seismicity patterns as angular drifts of a unit state vector in a high-dimensional correlation space, and systematically identifies anomalous shifts in seismic activity with respect to the regional background. 3D PI analysis is particularly advantageous over 2D analysis in resolving vertically overlapping seismicity anomalies in a highly complex tectonic environment. Case studies will help to illustrate some important properties of the PI forecasting tool. [Submitted to: Concurrency and Computation: Practice and Experience, Wiley, Special Issue: ACES2008.]
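A heavily simplified sketch of the PI idea follows: seismicity-rate changes per cell are normalized over the region, squared (so that both quiescence and activation register), and referenced to the background mean. The cell geometry, epochs, and planted anomaly are illustrative assumptions, and this is not the authors' full 3D implementation.

```python
import numpy as np

def pi_index(rates, t1, t2):
    """Toy Pattern Informatics index. `rates` has shape (n_cells, n_times):
    seismicity rate per spatial cell (2D or 3D cells flattened) per epoch."""
    # Rate change per cell between the two periods [0, t1) and [t1, t2).
    ds = rates[:, t1:t2].mean(axis=1) - rates[:, :t1].mean(axis=1)
    # Normalize over cells: a 'unit state vector' in correlation space.
    ds = (ds - ds.mean()) / ds.std()
    # Squared change, referenced to the spatial mean, flags anomalies
    # of both signs (quiescence and activation).
    dp = ds ** 2
    return dp - dp.mean()

rng = np.random.default_rng(2)
rates = rng.poisson(5, size=(400, 120)).astype(float)
rates[37, 60:] *= 3          # plant an activation anomaly in one cell
hot = pi_index(rates, t1=60, t2=120)
print("most anomalous cell:", int(np.argmax(hot)))   # expect 37
```

In a 3D analysis, the flattened cells would index depth layers as well, which is what allows vertically overlapping anomalies to be separated.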
Record Breaking Earthquakes and Temperatures
A record-breaking event is the largest (or smallest) event to occur during a specified time window in a specified region. In this paper we consider the record-breaking statistics of global earthquakes and of temperatures at a specified observatory. For the record-breaking earthquakes we consider the global CMT catalog for the period 1977 to 2007. We determine the numbers and magnitudes of the record-breaking earthquakes during specified time intervals. The results are in excellent agreement with the theory for random events. For record-breaking temperatures we utilize records from the Mauna Loa Observatory, Hawaii, for the period 1977 to 2006. We choose this site because observations made there established the systematic increase of anthropogenic CO2 in the atmosphere. We have averaged over the 365 days of the year and have determined the average numbers of record-breaking high temperatures and low temperatures, measured both forward and backward in time. We give results as a function of the time of day. We have carried out numerical simulations to quantitatively relate our data to temperature trends. We find a near-uniform warming trend of 0.04 °C yr⁻¹ at night and little change during the day.
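The "theory for random events" invoked here is the classical record statistics of i.i.d. sequences: the n-th value is a record with probability 1/n, so the expected number of records among n values is the harmonic number, independent of the underlying distribution. A quick simulation check (a sketch, with illustrative sequence lengths):

```python
import numpy as np

def count_records(x):
    """Number of record-breaking (running-maximum) events in a sequence."""
    running_max, n_rec = -np.inf, 0
    for v in x:
        if v > running_max:
            running_max, n_rec = v, n_rec + 1
    return n_rec

# For i.i.d. values, E[# records in n values] = 1 + 1/2 + ... + 1/n,
# regardless of the distribution the values are drawn from.
rng = np.random.default_rng(3)
n, trials = 1000, 2000
mean_obs = np.mean([count_records(rng.standard_normal(n)) for _ in range(trials)])
harmonic = np.sum(1.0 / np.arange(1, n + 1))
print(f"simulated {mean_obs:.2f} records vs theoretical {harmonic:.2f}")
```

A persistent warming trend breaks this i.i.d. baseline, inflating forward high-temperature records relative to the harmonic-number expectation, which is what the simulations in the abstract exploit to estimate the trend.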
Record Breaking Events in Self-Organized Critical Systems
Record-breaking events are analyzed in driven nonlinear threshold systems. These systems typically exhibit avalanche-type behavior, where a slow buildup of energy is punctuated by abrupt releases of energy through avalanche events that usually follow scale-invariant statistics. From the nonlinear dynamics of these systems it is possible to extract a sequence of record-breaking events, in which each subsequent record-breaking event is larger in magnitude than the previous one and all intervening events are smaller. The statistics of record-breaking events exhibit rich temporal organization, with the emergence of correlations and power-law tails. In the present work, several cellular automata are analyzed, among them the sandpile model, the Manna model, the Olami-Feder-Christensen model, the forest-fire model, and the slider-block model, to investigate the record-breaking statistics of model avalanches. The results are compared to the statistics of independent identically distributed (i.i.d.) random variables generated according to prescribed distributions; in particular, the Weibull distribution is considered. It is found that the statistics of record-breaking events for the above cellular automata exhibit behavior different from that observed for i.i.d. random variables, which signifies their complex spatio-temporal dynamics.
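The record-extraction step itself can be sketched as follows. The avalanche series here is a stand-in drawn from a heavy-tailed distribution rather than the output of the named cellular automata, and the Weibull parameters are illustrative.

```python
import numpy as np

def record_sequence(events):
    """Indices and magnitudes of successive record-breaking events."""
    idx, mags, best = [], [], -np.inf
    for i, m in enumerate(events):
        if m > best:
            idx.append(i)
            mags.append(m)
            best = m
    return np.array(idx), np.array(mags)

rng = np.random.default_rng(4)
# Stand-in avalanche series; in the study it would come from the
# sandpile, Manna, OFC, forest-fire, or slider-block model.
avalanches = rng.pareto(1.5, 10000) + 1.0
iid_weibull = rng.weibull(1.0, 10000)   # i.i.d. baseline distribution

for name, series in [("avalanche stand-in", avalanches), ("Weibull i.i.d.", iid_weibull)]:
    idx, mags = record_sequence(series)
    print(f"{name}: {len(idx)} records, first waiting times {np.diff(idx)[:5]}")
```

In the actual analysis it is the waiting times between records and the record-magnitude increments, compared against the i.i.d. baseline, that reveal the temporal correlations of the automata.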
Extreme Events and Their Predictability in a Branching Diffusion Model
Studies of the prediction of extreme events, based on real observations and numerical modeling of complex systems, suggest universal patterns of a system's behavior signaling the approach of an extreme event. These patterns include deviation from self-similarity, increase in background activity, clustering, and long-range correlation. In the absence of a closed theory describing critical transitions in complex systems, and with insufficient and noisy observations, the numerical parameters of the patterns have to be data-fitted, creating the risk of self-deception ("With four exponents I can fit the elephant" - J. von Neumann). Here we introduce a model which provides an analytical definition of at least two parameters, intensity and deviation from self-similarity. This drastically reduces the non-uniqueness of parametrization, suggesting a simple universal mechanism for premonitory patterns and a natural framework for their analytical study. The major conceptual parts of the model - direct cascading or fragmentation, spatial dynamics, and external driving - are combined in a classical age-dependent multi-type branching diffusion process with immigration. A complete analytic description of the size- and space-dependent distributions of particles and their correlations is derived using the generating function approach.
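A drastically simplified skeleton of the cascading-with-immigration ingredient, omitting ages, types, and spatial diffusion, can be sketched as a discrete fragmentation chain; all rates below are illustrative assumptions, not the model's actual parameters.

```python
import numpy as np

rng = np.random.default_rng(5)
ranks, steps = 12, 400                  # size ranks (0 = largest), time steps
immigration, mean_offspring = 10, 0.9   # external driving, cascade rate
pop = np.zeros(ranks)                   # particle count per size rank

for t in range(steps):
    new = np.zeros(ranks)
    new[0] = immigration                # driving injects largest particles
    for k in range(ranks - 1):
        # Each rank-k particle fragments into a Poisson number of
        # offspring one rank smaller (direct cascading).
        new[k + 1] += rng.poisson(mean_offspring * pop[k])
    pop = new

# In the steady state the expected population falls off geometrically
# with rank, i.e. as a pure power law in particle size: the self-similar
# background. Premonitory deviation from self-similarity appears when
# the cascade rate (mean_offspring) changes.
print(np.round(pop[:8], 1))
```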
Stochastic Modeling of Long-Term Non-Stationary Processes with Empirical Mode Decomposition
Long-term nonstationary oscillations have been observed in climatological data series such as global surface temperature anomaly (GSTA) and Pacific Decadal Oscillation (PDO) index values. Stochastic simulation models that contain a nonstationary process are useful for predicting the variations of climatic variables and studying their impacts on hydrologic regimes. In this work, we present a stochastic simulation model that captures nonstationary oscillations within a given variable. The model employs the data-adaptive decomposition method developed by Huang et al. (1998), named Empirical Mode Decomposition (EMD). Irregular oscillatory processes in a given variable can be extracted into a finite number of intrinsic mode functions (IMFs) with the EMD approach. A unique data-adaptive algorithm is proposed in this paper in order to simulate the nonstationary oscillatory components extracted by EMD. To validate the model's performance, it is applied to the GSTA data with the last 30 years truncated; the observations of the last 30 years are then compared to the data generated by the model. Finally, the next 50 years of data are generated to predict the evolution of the GSTA series into the future. Results of the validation study confirm the power of the EMD approach and its potential in the study of climate variability.
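A minimal sketch of the decomposition step, assuming the third-party PyEMD package (pip package EMD-signal; an assumption, not necessarily the implementation used by the authors), applied to a synthetic nonstationary series:

```python
import numpy as np
from PyEMD import EMD   # third-party package: pip install EMD-signal

# Synthetic nonstationary series: slow trend + drifting oscillation + noise.
rng = np.random.default_rng(6)
t = np.linspace(0, 10, 1000)
signal = (0.3 * t
          + np.sin(2 * np.pi * t * (1 + 0.1 * t))
          + 0.2 * rng.standard_normal(t.size))

# Decompose into intrinsic mode functions (IMFs); the last row returned
# is typically the residual trend. Each IMF could then be modeled and
# resampled to generate synthetic nonstationary traces, in the spirit
# of the algorithm the abstract proposes.
imfs = EMD().emd(signal, t)
print("number of IMFs (including residual):", imfs.shape[0])
```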