Characterization of seismic event rates in time and space is necessary for accurate probabilistic hazard assessment. Deep-water injections, such as those related to the disposal of wastewater in energy production, can increase seismic hazard [1]. However, because induced seismicity is a highly non-stationary process, existing methods for characterizing tectonic events [2-3] cannot be applied directly in this analysis. This study presents a general sequential Bayesian inference method for characterizing spatio-temporal seismic activity, broadly relevant for modelling the effects of fluid injections and of other sources of induced seismicity, by recursively updating the event rate based on current observations. We assume that recordings of seismic activity are collected and available for processing. The method relies on discretizing a region into a spatial grid and a time period into time intervals. The seismic event rate is described at points on that spatio-temporal domain. Due to t...
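The recursive rate update described in this abstract can be sketched with a conjugate Gamma-Poisson model on each cell of the spatial grid; the model choice, grid size, and all numbers below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def update_rate(alpha, beta, counts, dt):
    """One recursive Bayesian step: Gamma(alpha, beta) prior on each cell's
    Poisson rate -> Gamma posterior after observing `counts` events over
    an interval of length `dt`."""
    alpha_post = alpha + counts   # shape grows with observed events
    beta_post = beta + dt         # rate grows with elapsed exposure time
    return alpha_post, beta_post

# 3x3 spatial grid with a vague Gamma(1, 1) prior in every cell
alpha = np.ones((3, 3))
beta = np.ones((3, 3))

# Simulated weekly catalogues: injection raises the rate in the centre cell
rng = np.random.default_rng(0)
true_rate = np.full((3, 3), 0.2)
true_rate[1, 1] = 3.0
for week in range(52):
    counts = rng.poisson(true_rate)             # events observed this interval
    alpha, beta = update_rate(alpha, beta, counts, dt=1.0)

posterior_mean = alpha / beta                   # per-cell rate estimate
```

After a year of weekly updates the posterior mean in the centre cell approaches the elevated injection-driven rate, while the surrounding cells stay near the background level, showing how the recursion tracks a non-stationary rate field.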
2000
Two new stochastic models for earthquake occurrence are discussed. Both models focus on the spatio-temporal interactions between earthquakes. The parameters of the models are estimated by Bayesian updating of priors, using empirical data to derive posterior distributions. The first model is a marked point process model in which each earthquake is represented by its magnitude and coordinates in space and time. This model incorporates the occurrence of aftershocks as well as the build-up and subsequent release of strain. The second model is a hierarchical Bayesian space-time model in which the earthquakes are represented by potentials on a grid. The final ambition of the models is to make predictions on the occurrence of earthquakes.
2011
Machine learning consists of a set of computational tools for analysing large multi-dimensional data sets where standard statistical tests are not easily implemented. Many parametric approaches to machine learning involve model selection and at least a two-step process, and with these techniques the underlying structure of the observed data may not be fully realised. Bayesian non-parametric methods, on the other hand, perform inference over a potentially infinite number of parameters, and because model uncertainty is incorporated in a single-step approach, this can lead to more robust estimates. This paper applies this approach to the modelling of geophysical events, a challenging spatio-temporal problem domain. This paper contributes to the ongoing investigation of optimal methods for geophysical event modelling by introducing a numerical computation solution using a Bayesian unsupervised learning algorithm with ear...
Nuclear Engineering and Design, 2019
Since the foundational work of Cornell, many studies have been conducted to evaluate probabilistic seismic hazard (PSHA) at a given site or at a regional scale. In general, the results of such studies are used as inputs for regulatory hazard maps or for risk assessments. Such approaches are nowadays considered well established and are increasingly used worldwide, in addition to deterministic approaches. Nevertheless, some discrepancies have recently been observed between PSHA studies, especially in areas with low to moderate seismicity. The lessons learnt from these results lead to the conclusion that, due to the epistemic uncertainties inherent to such a domain, some deterministic choices have to be made and, depending on expert judgment, may lead to strong differences in the evaluated seismic motion. In this context, the objective of this paper is to present a methodology for taking instrumental and historical observations into consideration in order to reduce epistemic uncertainties in a PSHA. The method developed here is based on a Bayesian inference technique used to quantify the likelihood of the prior estimation and finally update the PSHA. The period of observation under consideration is the completeness period for each set of observations used. The updating process is developed at a regional scale, over a significant number of stations. The potential correlation between points of observation is also discussed and accounted for. Finally, a case of application is proposed on the French metropolitan territory to demonstrate the efficiency of this updating method and to draw perspectives for further applications.
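One simple way to picture this kind of updating is to re-weight competing prior rate estimates (e.g., logic-tree branches) by the Poisson likelihood of the exceedances observed during the completeness period; the candidate rates, prior weights, and observation counts below are invented for illustration and are not the paper's data.

```python
import math

def poisson_pmf(n, lam):
    """Probability of observing n events when the expected count is lam."""
    return math.exp(-lam) * lam ** n / math.factorial(n)

# Hypothetical logic-tree branches: annual rate of ground motions exceeding
# a given level at a station, with expert-assigned prior weights.
rates = [0.02, 0.05, 0.10]
prior = [0.3, 0.4, 0.3]

# Observation: 1 exceedance recorded during a 60-year completeness period.
n_obs, years = 1, 60.0
likelihood = [poisson_pmf(n_obs, r * years) for r in rates]
unnorm = [w * l for w, l in zip(prior, likelihood)]
posterior = [u / sum(unnorm) for u in unnorm]
```

With only one exceedance in 60 years, the posterior shifts weight toward the lowest-rate branch, which is the sense in which observations can reduce epistemic uncertainty in the prior hazard estimate.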
2011
The automated processing of multiple seismic signals to detect and localize seismic events is a central tool in both geophysics and nuclear treaty verification. This paper reports on a project, begun in 2009, to reformulate this problem in a Bayesian framework. A Bayesian seismic monitoring system, NET-VISA, has been built comprising a spatial event prior and generative models of event transmission and detection, as well as an inference algorithm. Applied in the context of the International Monitoring System (IMS), a global sensor network developed for the Comprehensive Nuclear-Test-Ban Treaty (CTBT), NET-VISA achieves a reduction of around 60% in the number of missed events compared to the currently deployed system. It also finds events that are missed even by the human analysts who post-process the IMS output.
Geophysical Research Letters, 2017
2011
The mathematical representation of seismic sources is an important part of probabilistic seismic hazard assessment. It reflects the association of seismicity with the tectonically active geological structures evidenced by seismotectonic studies. Given that most active faults are not characterized well enough, seismic sources are generally defined as areal zones, delimited by finite boundary polygons, within which the geological features of active tectonics and the seismicity are deemed homogeneous (e.g., focal depth, seismicity rate, and maximum magnitude). Besides the lack of data (e.g., a narrow range of recorded magnitudes), this representation raises several problems: 1) the resulting hazard maps are highly sensitive to the location of zone boundaries, while these boundaries are set by expert decision; 2) the zoning cannot represent any variation in faulting mechanism; 3) the seismicity rates are distributed throughout the zones, so the location of the determining information used in their calculation is lost. We propose an exploratory study of an alternative procedure for area source modeling. First, different data (e.g., geomorphology, geology, fault orientations) are combined using automated spatial partitioning (investigating both supervised and unsupervised methods) in order to obtain several information classes, which may be defined as areal source zones. Then, the membership of a given hypocenter in a given "zone", hereafter called a seismicity model, is expressed by a probability computed from the 2D (spatial) probability density function (pdf) of the active tectonic model, used as a prior and updated with specific data from seismicity catalogs (e.g., focal mechanisms) or other new data sources (e.g., geomorphology, subsurface exploration). A hypocenter is thus allowed to contribute to several models, with weights given by the value of the pdf for each model.
The annual rate of occurrence for a given model is calculated as the weighted average of the contributions of the hypocenters contained in that model. Future applications will couple the seismicity models to Ground Motion Prediction Equations. The results will thus provide the full spectrum of variability in the hazard and will highlight poorly constrained zones that deserve further study.
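The weighted-contribution idea can be sketched as a soft assignment of each hypocenter to every seismicity model, in proportion to the model pdf evaluated at that hypocenter; the Gaussian pdfs, event locations, and catalogue length below are hypothetical, not from the study.

```python
import math

def model_rates(events, model_pdfs, catalogue_years):
    """events: list of (x, y) hypocenters; model_pdfs: dict name -> pdf(x, y).
    Each event's weights are normalised so its contributions sum to one, then
    each model's accumulated weight is divided by the catalogue duration."""
    rates = {name: 0.0 for name in model_pdfs}
    for x, y in events:
        dens = {name: pdf(x, y) for name, pdf in model_pdfs.items()}
        total = sum(dens.values())
        for name, d in dens.items():
            rates[name] += d / total        # soft assignment of this event
    return {name: r / catalogue_years for name, r in rates.items()}

def gauss(cx, cy, s):
    """Unnormalised 2D Gaussian bump as a toy spatial pdf for one model."""
    return lambda x, y: math.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * s * s))

# Two toy seismicity models centred at different points, four hypocenters
pdfs = {"zone_A": gauss(0, 0, 1.0), "zone_B": gauss(5, 0, 1.0)}
events = [(0.2, 0.1), (0.5, -0.3), (4.8, 0.2), (2.5, 0.0)]
rates = model_rates(events, pdfs, catalogue_years=100.0)
```

The event at (2.5, 0.0), equidistant from both centres, contributes half to each model, while the others contribute almost entirely to their nearest model; total rate across models is conserved at 4 events per 100 years.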
Natural Hazards, 1997
Seismic hazard analysis is based on data and models, both of which are imprecise and uncertain. In particular, the interpretation of historical information into earthquake parameters, e.g. earthquake size and location, yields ambiguous and imprecise data. Models based on probability distributions have been developed to quantify and represent these uncertainties. Nevertheless, the majority of the procedures applied in seismic hazard assessment take into account neither these uncertainties nor the variance of the results. Therefore, a procedure based on Bayesian statistics was developed to estimate return periods for different ground motion intensities (MSK scale).
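As a minimal sketch of how a Bayesian treatment carries data uncertainty through to a return-period estimate, one can assume a Gamma-Poisson model for intensity exceedances; the paper's actual procedure may differ, and the prior and counts here are illustrative only.

```python
from math import sqrt

def rate_posterior(n_exceed, years, a0=0.5, b0=0.0):
    """Gamma(a0, b0) prior on the annual exceedance rate, updated with
    n_exceed observed exceedances over `years` of (complete) record.
    Returns the posterior mean and standard deviation of the rate."""
    a, b = a0 + n_exceed, b0 + years
    mean = a / b
    std = sqrt(a) / b
    return mean, std

# Hypothetical record: 4 exceedances of a given intensity in 300 years
mean_rate, std_rate = rate_posterior(n_exceed=4, years=300.0)
return_period = 1.0 / mean_rate      # point estimate in years
```

The posterior standard deviation makes explicit the variance of the estimate that deterministic procedures suppress: with only four historical exceedances, the rate (and hence the return period) remains substantially uncertain.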
International Journal of Soil Dynamics and Earthquake Engineering, 1984
This paper first describes the inferential structure of the Bayesian model and uses it to show why the empirical method using only short historical earthquake records cannot obtain a reliable hazard estimate. To improve hazard prediction, newly developed information from geophysical and geological studies should be incorporated with the historical data within the Bayesian framework. The paper then concentrates on methodologies for using the energy flux concept, seismic moment, and geological observation in seismic hazard analysis. A refined Bayesian model, in which some of the geophysical and geological input can be used flexibly and consistently, is suggested. Finally, numerical examples are presented to illustrate the application of the improved method.
Geophysical Journal International, 2007
We formulate the multiple-event seismic location problem as a Bayesian hierarchical statistical model (BAYHLoc). This statistical model has three distinct components: traveltime predictions, arrival-time measurements, and an a priori statistical model for each aspect of the multiple-event problem. The traveltime model is multifaceted, including both phase-specific adjustments to traveltime curves to improve broad-area prediction, as well as path/phase-specific traveltime adjustments. The arrival-time measurement model is decomposed into station, phase, and event components, allowing flexibility and physically interpretable error parameters. The prior model allows all available information to be brought to bear on the multiple event system. Prior knowledge on the probabilistic accuracy of event locations, traveltime predictions, and arrival-time measurements can be specified. Bayesian methodology merges all three components of the hierarchical model into a joint probability formulation. The joint posterior distribution is, in essence, an inference of all parameters given the prior constraints, self-consistency of the data set, and physics of traveltime calculation. We use the Markov Chain Monte Carlo method to draw realizations from the joint posterior distribution. The resulting samples can be used to estimate marginal distributions, for example epicentres and probability regions, as well as parameter correlations. We demonstrate BAYHLoc using the set of Nevada Test Site nuclear explosions, for which hypocentres are known, and phase measurements and traveltimes are well characterized. We find significant improvement in epicentre accuracy using BAYHLoc. Much of the improvement is attributed to the adaptive arrival-time measurement model, which controls data weighting. 
Regardless of the initial traveltime model, the use of an adjustment to the traveltime curves produces a level of epicentre accuracy that is generally achieved only through meticulous analysis and data culling. Further, we find that accurate hypocentres (including depth and origin-time) are achieved when either accurate traveltime curves with tight priors are used, or when prior information on a small subset of events is utilized.
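A toy version of the sampling step can illustrate the idea: draw an epicentre and origin time from the posterior implied by noisy arrival times at a few stations, using a random-walk Metropolis sampler. The station geometry, velocity model, and noise level below are invented, and this sketch is in no way the BAYHLoc implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
stations = np.array([[0.0, 30.0], [40.0, 0.0], [-30.0, -20.0], [25.0, 35.0]])
v, sigma = 6.0, 0.1                      # km/s, arrival-time noise (s)
true_xy, true_t0 = np.array([5.0, -3.0]), 10.0

def arrivals(xy, t0):
    """Predicted arrival times: origin time plus straight-ray traveltime."""
    return t0 + np.linalg.norm(stations - xy, axis=1) / v

obs = arrivals(true_xy, true_t0) + rng.normal(0, sigma, len(stations))

def log_post(xy, t0):
    """Gaussian arrival-time errors, flat prior over the region."""
    r = obs - arrivals(xy, t0)
    return -0.5 * np.sum(r ** 2) / sigma ** 2

# Random-walk Metropolis over (x, y, t0)
state = np.array([0.0, 0.0, 8.0])
lp = log_post(state[:2], state[2])
samples = []
for i in range(20000):
    prop = state + rng.normal(0, [0.5, 0.5, 0.1])
    lp_prop = log_post(prop[:2], prop[2])
    if np.log(rng.random()) < lp_prop - lp:   # Metropolis accept/reject
        state, lp = prop, lp_prop
    if i >= 5000:                             # discard burn-in
        samples.append(state.copy())

samples = np.array(samples)
epicentre_est = samples[:, :2].mean(axis=0)
```

The retained samples approximate the joint posterior, so marginal scatter in the (x, y) columns plays the role of the epicentre probability regions mentioned in the abstract.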