Principles of Bayesian Methods in Data Analysis
Related papers
Methods in Molecular Biology, 2009
Bayesian methods are rapidly becoming popular tools for statistical inference in various fields of science, including biology, engineering, finance, and genetics. A key aspect of the Bayesian inferential method is its logical foundation, which provides a coherent framework for using not only empirical but also scientific information available to a researcher. Prior knowledge arising from scientific background, expert judgment, or previously collected data is used to build a prior distribution, which is then combined with the current data via the likelihood function to characterize the current state of knowledge in the so-called posterior distribution. Bayesian methods allow the use of models of complex physical phenomena that were previously too difficult to estimate (e.g., using asymptotic approximations), and they offer a means of more fully understanding issues central to many practical problems by allowing researchers to build integrated models based on hierarchical conditional distributions that can be estimated even with limited amounts of data. Furthermore, advances in numerical integration methods, particularly those based on Monte Carlo methods, have made it possible to compute optimal Bayes estimators. However, there is a reasonably wide gap between the background of empirically trained scientists and the full weight of Bayesian statistical inference. Hence, one goal of this chapter is to bridge that gap by offering elementary to advanced concepts that emphasize the linkages between standard approaches and full probability modeling via Bayesian methods.
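The prior-times-likelihood update described in this abstract can be sketched with the simplest conjugate case. This is an illustrative example assumed for this listing, not taken from the chapter: a Beta prior on a success probability updated by Binomial data, where the posterior is available in closed form.

```python
# Minimal sketch (assumed example): conjugate Beta-Binomial updating.
# A Beta(a, b) prior on a success probability, combined with k successes
# in n Bernoulli trials via the likelihood, yields Beta(a + k, b + n - k).

def beta_binomial_update(a, b, k, n):
    """Return posterior Beta parameters after observing k successes in n trials."""
    return a + k, b + n - k

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution."""
    return a / (a + b)

# Weakly informative prior Beta(2, 2); observe 7 successes in 10 trials.
a_post, b_post = beta_binomial_update(2, 2, 7, 10)
print(a_post, b_post)             # posterior is Beta(9, 5)
print(beta_mean(a_post, b_post))  # posterior mean 9/14
```

The posterior mean sits between the prior mean (0.5) and the empirical proportion (0.7), which is exactly the blending of prior and data the abstract describes.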
Bayesian theory of systematic measurement deviations
2010
Concerning systematic effects, the GUM recommends correcting for them but unfortunately gives no detailed guidance on how to do so. This publication shows how systematic measurement deviations can be handled correctly on the basis of Bayesian probability theory. After a short overview of useful methods and tools, such as the product rule of probability theory, Bayes' theorem, the principle of maximum entropy, and the marginalisation equation, a method for handling systematic measurement deviations is outlined. Finally, some simple examples of practical interest are given to demonstrate the applicability of the suggested method.
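A hedged sketch of the marginalisation idea, under a simplified Gaussian model assumed here (not the paper's exact derivation): the measurand is Y = X - delta, where X is the indicated value with repeatability standard uncertainty u_x, and the systematic deviation delta has prior knowledge delta ~ Normal(d0, u_d). Marginalising delta then gives a Gaussian posterior for Y whose standard uncertainty combines the two contributions in quadrature.

```python
import math

# Illustrative Gaussian model for a systematic deviation (assumed, not
# the paper's derivation). Marginalising delta ~ Normal(d0, u_d) out of
# the posterior for Y = X - delta adds uncertainties in quadrature.

def correct_systematic(x_bar, u_x, d0, u_d):
    """Best estimate and standard uncertainty of Y = X - delta."""
    y = x_bar - d0
    u_y = math.sqrt(u_x**2 + u_d**2)
    return y, u_y

# Hypothetical numbers: indication 10.27 with u_x = 0.03,
# prior knowledge of the systematic deviation: d0 = 0.12, u_d = 0.04.
y, u_y = correct_systematic(x_bar=10.27, u_x=0.03, d0=0.12, u_d=0.04)
print(y, u_y)  # 10.15 with standard uncertainty 0.05
```

Note that the correction both shifts the estimate and enlarges the uncertainty, which is the Bayesian counterpart of the GUM recommendation to correct for known systematic effects.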
Bayesian Analysis of Experimental Data
Pakistan Journal of Statistics and Operation Research, 2013
The analysis of experimental data from a Bayesian point of view is considered, and appropriate methodology is developed for application to designed experiments. A Normal-Gamma distribution is used as the prior. The developed methodology is applied to real experimental data taken from long-term fertilizer experiments.
Bayesian approach to uncertainty evaluation: is it always working?
19th International Congress of Metrology (CIM2019), 2019
Since the GUM was published, measurement uncertainty has been defined in terms of the standard deviation of the probability distribution of the values that can reasonably be attributed to the measurand, and it has been evaluated using statistical or probabilistic methods. A debate has always been alive among metrologists on whether a frequentist or a Bayesian approach should be followed to evaluate uncertainty. The Bayesian approach, based on available a priori knowledge about the measurand, seems to prevail nowadays. This paper starts from the consideration that the Bayesian approach is based on the well-known Bayes theorem which, like all mathematical theorems, is valid only to the extent that the assumptions made to prove it are valid. The main question, when following the Bayesian approach, is hence whether these assumptions are satisfied in practical cases, especially when the a priori information is combined with the information coming from the measurement...
Using Bayesian Analysis to Quantify Uncertainty in Radiometer Measurements
Journal of Verification, Validation and Uncertainty Quantification, 2021
We apply Bayesian inference to instrument calibration and experimental-data uncertainty analysis for the specific application of measuring radiative intensity with a narrow-angle radiometer. We develop a physics-based instrument model that describes temporally varying radiative intensity, the indirectly measured quantity of interest, as a function of scenario and model parameters. We identify a set of five uncertain parameters, find their probability distributions (the posterior or inverse problem) given the calibration data by applying Bayes' Theorem, and employ a local linearization to marginalize the nuisance parameters resulting from errors-in-variables. We then apply the instrument model to a new scenario that is the intended use of the instrument, a 1.5 MW coal-fired furnace. Unlike standard error propagation, this Bayesian method infers values for the five uncertain parameters by sampling from the posterior distribution and then computing the intensity with quantifiable u...
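The phrase "sampling from the posterior distribution" can be illustrated with the standard workhorse for such problems. The following is a generic sketch, not the authors' instrument model: random-walk Metropolis sampling of a single parameter theta given Gaussian observations and a weak Normal prior, the kind of posterior sampling that replaces linear error propagation in the approach described above.

```python
import math
import random

# Generic illustration (assumed, not the paper's radiometer model):
# random-walk Metropolis sampling from the posterior of theta, where
# y_i ~ Normal(theta, 1) and theta has a Normal(0, 10) prior.

def log_post(theta, data, sigma=1.0, prior_sd=10.0):
    """Unnormalized log posterior: Gaussian likelihood times Gaussian prior."""
    ll = sum(-0.5 * ((y - theta) / sigma) ** 2 for y in data)
    lp = -0.5 * (theta / prior_sd) ** 2
    return ll + lp

def metropolis(data, n_steps=20000, step=0.5, seed=0):
    rng = random.Random(seed)
    theta, samples = 0.0, []
    for _ in range(n_steps):
        prop = theta + rng.gauss(0.0, step)
        # Accept with probability min(1, posterior ratio).
        if math.log(rng.random()) < log_post(prop, data) - log_post(theta, data):
            theta = prop
        samples.append(theta)
    return samples[n_steps // 2:]  # discard the first half as burn-in

data = [2.1, 1.8, 2.4, 2.0, 1.9]
draws = metropolis(data)
post_mean = sum(draws) / len(draws)
print(post_mean)  # close to the sample mean of the data, ~2.04
```

Each posterior draw of the parameters can then be pushed through the forward model (here trivial; in the paper, the physics-based instrument model) to obtain a distribution, rather than a single propagated variance, for the quantity of interest.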
Statistics-based Bayesian Modeling Framework for Uncertainty Quantification and Propagation
2021
A new Bayesian modeling framework is proposed to account for the uncertainty in the model parameters arising from model and measurement errors, as well as experimental, operational, environmental and manufacturing variabilities. Uncertainty is embedded in the model parameters using a single level hierarchy where the uncertainties are quantified by Normal distributions with the mean and the covariance treated as hyperparameters. Unlike existing hierarchical Bayesian modeling frameworks, the likelihood function for each observed quantity is built based on the Kullback–Leibler divergence used to quantify the discrepancy between the probability density functions (PDFs) of the model predictions and measurements. The likelihood function is constructed assuming that this discrepancy for each measured quantity follows a truncated normal distribution. For Gaussian PDFs of measurements and response predictions, the posterior PDF of the model parameters depends on the lower two moments of th...
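The building block named in this abstract has a simple closed form in the case the abstract highlights. The sketch below, simplified to the univariate case and assumed for illustration, computes the Kullback-Leibler divergence between two Gaussian PDFs, the quantity the framework uses to measure the discrepancy between predicted and measured distributions.

```python
import math

# Illustrative helper (assumed, univariate case only): closed-form
# Kullback-Leibler divergence KL(p || q) between two Gaussian PDFs
# p = Normal(mu_p, sd_p) and q = Normal(mu_q, sd_q).

def kl_gaussian(mu_p, sd_p, mu_q, sd_q):
    """KL(p || q) for univariate Normals, in nats."""
    return (math.log(sd_q / sd_p)
            + (sd_p**2 + (mu_p - mu_q)**2) / (2 * sd_q**2)
            - 0.5)

print(kl_gaussian(0.0, 1.0, 0.0, 1.0))  # identical PDFs -> 0.0
print(kl_gaussian(1.0, 1.0, 0.0, 1.0))  # unit mean shift -> 0.5
```

The divergence is zero only when prediction and measurement distributions coincide, and it grows with both mean shift and variance mismatch, which is what makes it usable as a discrepancy measure inside a likelihood.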
Ingenieria y Universidad, 2022
Objective: To propose a methodological procedure that serves as a guide for applying techniques for measurement uncertainty evaluation, such as the GUM, the MMC (Monte Carlo method), and Bayes, and to develop an application in a non-trivial case study. Materials and methods: A set of steps is proposed for validating measurement uncertainty evaluation with the GUM, MMC, and Bayesian techniques; these were applied to evaluate the uncertainty of an indirect measurement process that determines the level of a fluid by measuring the hydrostatic pressure it generates, at rest, on the bottom of a container. The results obtained with each technique were compared. Results and discussion: The use of the GUM was found to be valid for the case under study, and the results obtained by applying the Bayesian approach and the MMC technique provided highly useful complementary information, such as the probability density function (PDF) of the measurand, which enables a better description of the phenomenon. Likewise, the posterior PDF obtained with Bayes yielded estimates closer to the true values of the measurand, although the ranges of possible values were broader than those offered by the MMC and the GUM. Conclusions: In the context of the case under study, the Bayesian approach gives more realistic results than the GUM and the MMC; in addition, Bayes offers the conceptual advantage of updating the results of the uncertainty evaluation in the presence of new evidence.
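The case study's indirect measurement lends itself to a compact comparison of the two non-Bayesian techniques named here. The sketch below is assumed for illustration (the input values and uncertainties are hypothetical, not the paper's data): fluid level from hydrostatic pressure, h = p / (rho * g), evaluated both by Monte Carlo simulation (the MMC idea) and by GUM-style first-order propagation.

```python
import math
import random

# Hypothetical inputs for the level measurement h = p / (rho * g).
G = 9.81  # m/s^2, standard gravity

def monte_carlo_level(p_mean, p_sd, rho_mean, rho_sd, n=100000, seed=1):
    """MMC-style evaluation: propagate input PDFs by simulation."""
    rng = random.Random(seed)
    hs = [rng.gauss(p_mean, p_sd) / (rng.gauss(rho_mean, rho_sd) * G)
          for _ in range(n)]
    mean = sum(hs) / n
    sd = math.sqrt(sum((h - mean) ** 2 for h in hs) / (n - 1))
    return mean, sd

def gum_level(p_mean, p_sd, rho_mean, rho_sd):
    """GUM-style first-order propagation of relative uncertainties."""
    h = p_mean / (rho_mean * G)
    u = h * math.sqrt((p_sd / p_mean) ** 2 + (rho_sd / rho_mean) ** 2)
    return h, u

mc = monte_carlo_level(9810.0, 50.0, 1000.0, 5.0)
gum = gum_level(9810.0, 50.0, 1000.0, 5.0)
print(mc, gum)  # both near h = 1.0 m with similar standard uncertainty
```

For this nearly linear model the two techniques agree closely, which matches the paper's finding that the GUM remains valid for the case under study; the added value of the MMC and Bayesian evaluations is the full PDF of the measurand rather than a different point estimate.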
Objective Bayesian analysis of spatial data with measurement error
Canadian Journal of Statistics, 2007
The author shows how geostatistical data that contain measurement errors can be analyzed objectively by a Bayesian approach using Gaussian random fields. He proposes a reference prior and two versions of Jeffreys' prior for the model parameters. He studies the propriety and the existence of moments for the resulting posteriors. He also establishes the existence of the mean and variance of the predictive distributions based on these default priors. His reference prior derives from a representation of the integrated likelihood that is particularly convenient for computation and analysis. He further shows that these default priors are not very sensitive to some aspects of the design and model, and that they have good frequentist properties. Finally, he uses a data set of carbon:nitrogen ratios from an agricultural field to illustrate his approach.