Solving the Woes of Statistical Seismology: The Community On-Line Resource for Statistical Seismicity Analysis (CORSSA)

Statistical methods in seismology

Wiley Interdisciplinary Reviews: Computational Statistics, 2010

On July 16, 1945, the first nuclear weapon test (code named Trinity) began an evolution in warfare that led to the Cold War, in which mutually assured destruction (MAD) prevented nuclear conflict between nations. The generation that fought the Cold War understood the destructive power of a nuclear weapon; many had observed first hand the ruins of Nagasaki and Hiroshima. A modern nuclear weapon has the potential to kill over 1,000,000 people in seconds if detonated in a large metropolitan area. In contrast to the man-made threat of nuclear weapons, an estimated 230,000 people died or are presumed dead as a result of the 2004 Indian Ocean tsunami. Seismology is the core science in monitoring for nuclear weapon tests worldwide, an essential function in global efforts to eliminate nuclear weapons. Equally important, seismology provides the theory and methods to monitor and warn for natural threats such as the Indian Ocean tsunami. Today, many scientists continue research and development efforts to more effectively monitor natural seismic activity and eliminate nuclear weapons from the globe. This article reviews an important aspect of that research: the contributions of mathematical statistics to seismic monitoring, with emphasis on underground nuclear weapon test monitoring.

Statistical Seismology

Pure and Applied Geophysics, 2005

Stochastic models with an increasing component of physical reasoning have slowly been gaining acceptance over the last two decades. The subject of statistical seismology aims to bridge the gap between physics-based models without statistics and statistics-based models without physics. This volume, based largely on papers presented at the 3rd International Workshop on Statistical Seismology, held in Juriquilla, Mexico, in May 2003, may serve to illustrate the range of issues now coming under the statistical seismology heading. While the papers presented may not solve the problem of bridging the gap, they indicate routes by which it is being approached.

Seismology as a dynamic, distributed area of scientific research

Scientometrics, 2003

Seismology has several features that suggest it is a highly internationalised field: the subject matter is global, the tools used to analyse seismic waves depend upon information technologies, and governments are interested in funding cooperative research. We explore whether an emerging field like seismology has a more internationalised structure than the older, related field of geophysics. Using aggregated journal-journal citations, we first show that, within the citing environment, seismology emerged from within ...

A study of seismology as a dynamic, distributed area of scientific research

arXiv preprint arXiv:0912.1371, 2009

Seismology has several features that suggest it is a highly internationalised field: the subject matter is global, the tools used to analyse seismic waves depend upon information technologies, and governments are interested in funding cooperative research. We explore whether an emerging field like seismology has a more internationalised structure than the older, related field of geophysics. Using aggregated journal-journal citations, we first show that, within the citing environment, seismology emerged from within ...

Statistical Seismology: Physical and Stochastic Modelling of Earthquake Occurrence and Forecasting

2007

Point processes and probability theory make it possible to study several aspects of seismological research in depth, such as the definition of models describing the space-time occurrence of seismic events, the simulation of seismic catalogs to better understand the real process, and the definition of predictive strategies for the assessment of seismic risk. A more realistic description of seismic events often requires relaxing the assumption of statistical independence of earthquakes. Second-order properties of point processes, together with models that account for the complexity of the data, may therefore play a relevant role in the study and comprehension of the seismic process and its realizations; accordingly, complex models and features such as self-similarity, long-range dependence, and fractal dimension are reviewed. When models describing earthquake occurrence are fitted, residual analysis is carried out to test the goodness of the fit. Here a diagnostic method is proposed and applied to real data; it is based on a transformed version of second-order statistics, weighting the original point process by the inverse of its conditional intensity, so that features such as clustering and inhibition may easily be interpreted. Indeed, the null hypothesis model is an arbitrary conditional intensity model, rather than a stationary Poisson process. The proposed diagnostic method is applied to seismic data and interpreted after estimating and fitting a particular space-time process (the ETAS, or Epidemic Type Aftershock Sequence, model) to seismic events.
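The weighting idea behind this diagnostic can be illustrated with a short sketch. The code below is a minimal illustration, not the author's implementation: it assumes a simple, hypothetical time-varying intensity `lam`, simulates an inhomogeneous Poisson process by thinning, and computes a weighted (inhomogeneous) K-function in which each pair of events is down-weighted by the product of the model intensities. If the fitted intensity is correct, the weighted statistic should stay close to the Poisson benchmark 2h; systematic excess would suggest residual clustering, and deficit would suggest inhibition. All names and parameter values are invented for the example.

```python
# Minimal sketch (assumptions labeled) of a weighted second-order diagnostic:
# events are weighted by the inverse of the model intensity, and the weighted
# K-function is compared with the Poisson benchmark 2h.
import numpy as np

rng = np.random.default_rng(0)

def lam(t):
    """Hypothetical fitted intensity (events per unit time)."""
    return 1.0 + 0.8 * np.sin(2 * np.pi * t / 50.0)

# Simulate an inhomogeneous Poisson process on [0, T] by thinning.
T, lam_max = 500.0, 1.8
cand = rng.uniform(0.0, T, rng.poisson(lam_max * T))
events = np.sort(cand[rng.uniform(0.0, lam_max, cand.size) < lam(cand)])

def weighted_k(events, intensity, T, lags):
    """Weighted K-function: each ordered pair weighted by 1/(lam_i * lam_j).
    Under a correctly specified model, values should be close to 2*h."""
    w = 1.0 / intensity(events)
    dt = np.abs(events[:, None] - events[None, :])
    np.fill_diagonal(dt, np.inf)          # exclude self-pairs
    ww = w[:, None] * w[None, :]
    return np.array([ww[dt <= h].sum() / T for h in lags])

lags = np.linspace(0.5, 10.0, 20)
for h, k in zip(lags, weighted_k(events, lam, T, lags)):
    print(f"h = {h:5.2f}   K_w(h) = {k:7.2f}   Poisson 2h = {2 * h:5.2f}")
```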

Database of Individual Seismogenic Sources, version 3.1.0

2009

The DISS Working Group has expanded and updated the content of the Database of Individual Seismogenic Sources (DISS) and improved its functions and structure. These improvements and new data have been implemented as the new DISS release, version 3.1.0. The Working Group focused on furthering the characterization of the already existing seismogenic sources, both Individual Sources and Composite Sources, and implemented and completed background information and supporting data, images, and texts. ...

Cross-correlation Coefficients for the Study of Repeating Earthquakes: An Investigation of Two Empirical Assumptions/Conventions in Seismological Interpretation Practice

Pure and Applied Geophysics, 2012

For the identification and analysis of 'repeating earthquakes,' two empirical concepts are in common use. The first is the assumption that the cross-correlation coefficient of the filtered seismograms of closely spaced 'repeaters' depends exponentially on the inter-event separation distance. The second is the convention that a 0.5-5.0-Hz band-pass filter is used in processing regional seismograms. In this article, using a simple layered structure model, we investigated the cross-correlation coefficient of the filtered synthetic seismograms of two closely located events, that is, a 'doublet,' and examined the relation between the cross-correlation coefficient and the inter-event separation distance. Simulation shows that in the 0.5-5.0-Hz frequency band, even for simple synthetic seismograms that do not account for lateral heterogeneity or scattering, the exponential dependence is only a first-order approximation. To check the frequency dependence of the cross-correlation coefficient, we analyzed a group of seismograms of a 'multiplet' in Xiuyan, Liaoning, northeast China, recorded by the Regional Seismographic Network of Liaoning Province. The cross-correlation coefficients were observed to be relatively stable against frequency within the 0.5-5.0-Hz band.
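The quantity studied here, the cross-correlation coefficient of band-pass-filtered seismograms, can be sketched in a few lines. The example below is not the authors' code: the sampling rate, the synthetic 'doublet' records, and all variable names are assumptions made for illustration; only the 0.5-5.0-Hz band comes from the paper.

```python
# Sketch: maximum normalized cross-correlation of two records after a
# 0.5-5.0 Hz band-pass filter (the convention examined in the paper).
import numpy as np
from scipy.signal import butter, filtfilt

fs = 100.0                     # sampling rate in Hz (assumed)
b, a = butter(4, [0.5, 5.0], btype="bandpass", fs=fs)

def cc_max(x, y):
    """Maximum of the normalized cross-correlation over all lags."""
    x = x - x.mean()
    y = y - y.mean()
    cc = np.correlate(x, y, mode="full")
    return cc.max() / np.sqrt((x**2).sum() * (y**2).sum())

# Two hypothetical 'doublet' records: the same wavelet, slightly shifted,
# with independent noise, standing in for two closely located events.
t = np.arange(0.0, 30.0, 1.0 / fs)
wavelet = np.exp(-((t - 10.0) ** 2)) * np.sin(2 * np.pi * 2.0 * t)
rng = np.random.default_rng(1)
rec1 = wavelet + 0.05 * rng.standard_normal(t.size)
rec2 = np.roll(wavelet, 25) + 0.05 * rng.standard_normal(t.size)  # 0.25 s shift

cc = cc_max(filtfilt(b, a, rec1), filtfilt(b, a, rec2))
print(f"band-passed cross-correlation coefficient: {cc:.3f}")
```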

SGRAPH (SeismoGRAPHer): Seismic waveform analysis and integrated tools in seismology

Abdelwahed M.F., 2012

Although numerous seismological programs are currently available, most of them cannot handle a range of data formats and lack embedded seismological tools. SeismoGRAPHer, or simply SGRAPH, is a new system for maintaining and analyzing seismic waveform data in a stand-alone, Windows-based application that manipulates a wide range of data formats. SGRAPH is intended to be sufficient for performing basic waveform analysis and for solving advanced seismological problems. The graphical user interface (GUI) utilities and Windows functionalities, such as dialog boxes, menus, and toolbars, simplify user interaction with the data. SGRAPH supports common data formats, such as SAC, SEED, GSE, ASCII, and Nanometrics Y-format, and provides the ability to solve many seismological problems with built-in inversion tools. Loaded traces are maintained, processed, plotted, and saved in SAC, ASCII, or PS (PostScript) file formats. SGRAPH includes Generalized Ray Theory (GRT), genetic algorithm (GA), least-squares fitting, auto-picking, fast Fourier transform (FFT), and many additional tools. The program provides rapid estimation of earthquake source parameters, location, attenuation, and focal mechanisms, and advanced waveform modeling techniques are provided for crustal structure and focal mechanism estimation. SGRAPH has been employed in the Egyptian National Seismic Network (ENSN) as a tool assisting with routine work and data analysis; more than 30 users have been using previous versions of SGRAPH in their research for more than 3 years. The main features of this application are ease of use, speed, small disk-space requirements, and the absence of third-party components. Because of its architecture, SGRAPH can be interfaced with newly developed methods or applications in seismology. A complete setup file, including the SGRAPH package and the online user guide, is available.
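SGRAPH itself is a closed Windows application, but the basic workflow it automates, reading a SAC trace, band-pass filtering it, and inspecting its spectrum, can be sketched for readers without it. The snippet below uses the open-source ObsPy library (not part of SGRAPH), and the file name "event.sac" and filter settings are placeholders.

```python
# Rough sketch of a read-filter-spectrum workflow of the kind SGRAPH
# provides, written with ObsPy; names and parameters are illustrative.
import numpy as np
from obspy import read

st = read("event.sac")        # SAC is one of the formats SGRAPH supports
tr = st[0].copy()
tr.detrend("linear")
tr.filter("bandpass", freqmin=0.5, freqmax=5.0, corners=4, zerophase=True)

# Amplitude spectrum via FFT, one of the built-in tools SGRAPH offers.
spec = np.abs(np.fft.rfft(tr.data))
freqs = np.fft.rfftfreq(tr.stats.npts, d=tr.stats.delta)
print(f"dominant frequency: {freqs[spec.argmax()]:.2f} Hz")
```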