Application of Bayesian reasoning and the Maximum Entropy Method to some reconstruction problems

The prior-derived F constraints in the maximum-entropy method

Acta Crystallographica Section A Foundations of Crystallography, 2005

The method of prior-derived F constraints (PDC) enhances the quality of reconstructions of electron densities from X-ray diffraction data by the maximum-entropy method (MEM). The method addresses artifacts arising from the inaccurate extrapolation of non-measured data by the MEM. While these artifacts are unavoidable when a flat prior is used, they can be effectively suppressed if prior information about the structure is available in the form of a procrystal prior electron density. The missing, usually high-angle, structure factors can then be substituted by structure factors derived from the procrystal prior. This approach eliminates the occurrence of spurious peaks in the difference electron densities in the vicinity of the atomic positions. The method is illustrated with a simple one-dimensional example. Its use is then demonstrated on simulated data of oxalic acid dihydrate and on experimental data of sodium nitrite.
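A minimal one-dimensional sketch of the idea in Python (hypothetical grid, atom positions, prior width and measurement cutoff; the MEM iteration itself is omitted, so this only illustrates the effect of filling unmeasured reflections from a procrystal prior rather than leaving them at zero):

import numpy as np

# hypothetical 1D "procrystal" prior: Gaussian atoms in a periodic unit cell
n = 256
x = np.arange(n) / n

def gaussian_density(centres, width):
    rho = np.zeros(n)
    for c in centres:
        d = np.minimum(np.abs(x - c), 1.0 - np.abs(x - c))   # periodic distance
        rho += np.exp(-0.5 * (d / width) ** 2)
    return rho

rho_true  = gaussian_density([0.30, 0.72], width=0.015)   # "true" electron density
rho_prior = gaussian_density([0.30, 0.72], width=0.025)   # procrystal prior (broader atoms)

F_true  = np.fft.fft(rho_true)     # all structure factors of the true density
F_prior = np.fft.fft(rho_prior)    # structure factors derived from the prior

h = np.fft.fftfreq(n, d=1.0 / n)   # reflection indices
measured = np.abs(h) <= 20         # pretend only low-order reflections were measured

# flat extrapolation: unmeasured structure factors simply set to zero
rho_flat = np.fft.ifft(np.where(measured, F_true, 0.0)).real

# prior-derived F constraints: unmeasured structure factors taken from the prior
rho_pdc = np.fft.ifft(np.where(measured, F_true, F_prior)).real

# the prior-filled reconstruction shows much weaker termination ripples near atoms
print("max |error|, zero fill :", np.abs(rho_flat - rho_true).max())
print("max |error|, prior fill:", np.abs(rho_pdc - rho_true).max())

In this toy setting the zero-filled reconstruction shows pronounced series-termination ripples around the atomic sites, while the prior-filled one does not, which is the kind of artifact suppression the PDC method aims at.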

Bayesian Approach for X-Ray and Neutron Scattering Spectroscopy

Inelastic X-Ray Scattering and X-Ray Powder Diffraction Applications, 2020

The rapidly improving performance of inelastic scattering instruments has prompted tremendous advances in our knowledge of the high-frequency dynamics of disordered systems, while also imposing new demands on data analysis and interpretation. This ongoing effort is likely to soon reach an impasse unless new protocols are developed for data modeling. The need stems from the increasingly detailed information sought in typical line shape measurements, which often touches or crosses the boundaries imposed by limited experimental accuracy. In this scenario, the risk of biased and over-parametrized data modeling poses a concrete threat to further advances in the field. Aware of the severity of the problem, we illustrate here the new prospects that Bayesian inference methods bring to this area. With reference to recent literature results, we demonstrate the superior ability of these methods to provide probabilistic, evidence-based modeling of experimental data. Most importantly, this approach enables hypothesis tests between competing line shape models and is intrinsically equipped with antidotes against the risk of over-parametrization, since it naturally enforces Occam's principle of maximum parsimony, which favors simple models over overly complex ones.
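As a rough illustration of evidence-based model comparison of the kind advocated here, the following sketch compares a one-Lorentzian and a two-Lorentzian description of a synthetic line shape by brute-force marginalization over the widths (everything is hypothetical: the noise level, the uniform priors, and the fixed centers and weights; real analyses use far more elaborate models and samplers):

import numpy as np

rng = np.random.default_rng(0)

# hypothetical line shape measurement: a single Lorentzian plus Gaussian noise
w = np.linspace(-10.0, 10.0, 201)                  # energy-transfer axis
def lorentzian(w, gamma):
    return (gamma / np.pi) / (w ** 2 + gamma ** 2)

sigma = 0.003                                      # assumed experimental noise level
data = lorentzian(w, 2.0) + rng.normal(0.0, sigma, w.size)

def log_like(model):
    r = (data - model) / sigma
    return -0.5 * np.sum(r ** 2) - 0.5 * w.size * np.log(2.0 * np.pi * sigma ** 2)

gammas = np.linspace(0.1, 5.0, 200)                # uniform prior range for each width

# model A: one Lorentzian (one free width)
lA = np.array([log_like(lorentzian(w, g)) for g in gammas])
log_evid_A = lA.max() + np.log(np.trapz(np.exp(lA - lA.max()), gammas) / (5.0 - 0.1))

# model B: equal-weight sum of two Lorentzians (two free widths)
lB = np.empty((gammas.size, gammas.size))
for i, g1 in enumerate(gammas):
    for j, g2 in enumerate(gammas):
        lB[i, j] = log_like(0.5 * (lorentzian(w, g1) + lorentzian(w, g2)))
inner = np.trapz(np.exp(lB - lB.max()), gammas, axis=1)
log_evid_B = lB.max() + np.log(np.trapz(inner, gammas) / (5.0 - 0.1) ** 2)

print("log evidence, one Lorentzian :", log_evid_A)
print("log evidence, two Lorentzians:", log_evid_B)

Because the data were generated from a single component, the extra parameter of model B buys essentially no improvement in fit, and the Occam factor built into the evidence typically penalizes it.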

The use of maximum entropy and Bayesian techniques in nuclear microprobe applications

Nuclear Instruments and Methods in Physics Research Section B: Beam Interactions with Materials and Atoms, 1997

This paper presents an overview of the Bayesian Formalism (BF) and Maximum Entropy (ME), as well as applications of the theory to various ion-beam and nuclear microprobe (NMP) related work. The Bayesian formalism is an efficient and theoretically sound technique of information recovery, with special applications in ill-posed inverse problems such as detector function deconvolution. Results are presented showing the promise of Bayesian Statistics and Maximum Entropy in PIXE spectrum deconvolution, deconvolution of the beam profile from one-dimensional scans, and the recovery of depth profiles in RBS.
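A compact sketch of what such a detector-function deconvolution can look like is given below; the response, line positions and noise are hypothetical, and a Richardson-Lucy multiplicative update (the fixed-point iteration for a Poisson likelihood, close in spirit to MaxEnt deconvolution but not the authors' algorithm) is used as a stand-in:

import numpy as np

rng = np.random.default_rng(1)

# hypothetical PIXE-like spectrum: a few sharp X-ray lines broadened by the detector
n = 512
channels = np.arange(n)
true = np.zeros(n)
true[[120, 200, 205, 350]] = [4000.0, 2500.0, 1800.0, 900.0]   # line intensities

# detector response: Gaussian broadening written as a convolution matrix A
fwhm = 12.0
sigma = fwhm / 2.3548
kernel = np.exp(-0.5 * ((channels - n // 2) / sigma) ** 2)
kernel /= kernel.sum()
A = np.array([np.roll(kernel, c - n // 2) for c in channels]).T  # column c = response to a count in channel c

data = rng.poisson(A @ true + 5.0)            # measured, Poisson-noisy spectrum (plus flat background)

# Richardson-Lucy: multiplicative fixed-point update for the Poisson likelihood
f = np.full(n, data.sum() / n)                # flat (maximally non-committal) starting estimate
norm = A.T @ np.ones(n)
for _ in range(200):
    pred = A @ f + 1e-12
    f *= (A.T @ (data / pred)) / norm

print("channels of recovered lines:", np.flatnonzero(f > 0.1 * f.max()))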

Maximum entropy and Bayesian data analysis: Entropic prior distributions

Physical Review E, 2004

The problem of assigning probability distributions which objectively reflect the prior information available about experiments is one of the major stumbling blocks in the use of Bayesian methods of data analysis. In this paper the method of Maximum (relative) Entropy (ME) is used to translate the information contained in the known form of the likelihood into a prior distribution for Bayesian inference. The argument is inspired and guided by intuition gained from the successful use of ME methods in statistical mechanics. For experiments that cannot be repeated the resulting "entropic prior" is formally identical with the Einstein fluctuation formula. For repeatable experiments, however, the expected value of the entropy of the likelihood turns out to be relevant information that must be included in the analysis. The important case of a Gaussian likelihood is treated in detail.
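Schematically, and in our own notation rather than necessarily that of the paper, the construction can be summarized as follows: the information carried by the likelihood p(x|θ) is turned into a prior through its entropy relative to a reference measure m(x),

S(θ) = −∫ dx p(x|θ) log[ p(x|θ) / m(x) ],    π(θ) ∝ exp[ S(θ) ],

which has the same structure as the Einstein fluctuation formula P ∝ exp(S); Bayes' theorem then updates this prior with the observed data in the usual way, p(θ|d) ∝ p(d|θ) π(θ).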

Bayesian Reconstruction Of Electron Energy Distributions

Low-pressure plasmas are nowadays widely used for technical applications of plasma-surface interactions, such as plasma etching, material deposition and sputtering. For a thorough understanding of the individual processes in plasma processing, the electron energy distribution (EED) function in the bulk plasma is of great importance. The EED determines the rates of all electron-induced reactions such as ionization, excitation or dissociation of molecules. The ubiquitous assumption of a Maxwellian EED becomes progressively worse for hot and low-density plasmas. Measurements of the EED with probes penetrating the plasma have deteriorating effects on both the plasma and the probe, so measurements without plasma contact are of great interest. One such non-destructive measurement is the detection of radiation emitted by the plasma.
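At its core this is a linear inverse problem: each measured line intensity is an integral of the EED weighted by the corresponding excitation rate coefficient. The sketch below sets up such a problem with entirely hypothetical cross-section shapes, line set and noise level, and uses a smoothness-regularized non-negative least-squares fit merely as a stand-in for the full Bayesian reconstruction:

import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(2)

# hypothetical forward model: each observed emission line k has an excitation
# cross section; its intensity is a weighted integral of the EED f(E)
E = np.linspace(0.0, 50.0, 200)                      # electron energy grid (eV)
thresholds = np.linspace(5.0, 35.0, 12)              # 12 lines with different thresholds
K = np.array([np.clip(E - Eth, 0.0, None) * np.sqrt(E) for Eth in thresholds])
K *= (E[1] - E[0])                                   # quadrature weights

# "true" non-Maxwellian EED: Maxwellian bulk plus a hot tail
f_true = E * np.exp(-E / 3.0) + 0.02 * E * np.exp(-E / 15.0)
f_true /= np.trapz(f_true, E)

intensities = K @ f_true
intensities *= 1.0 + 0.03 * rng.normal(size=intensities.size)   # 3% measurement noise

# smoothness-regularized non-negative least squares (stand-in for the Bayesian step);
# with only a dozen lines the problem is severely underdetermined, so the roughness
# penalty is what keeps the solution reasonable
lam = 1.0                                            # hand-tuned smoothing weight
D = np.diff(np.eye(E.size), 2, axis=0)               # second-difference operator
A_aug = np.vstack([K, lam * D])
b_aug = np.concatenate([intensities, np.zeros(D.shape[0])])
f_hat, _ = nnls(A_aug, b_aug)

print("reconstructed mean energy (eV):", np.trapz(E * f_hat, E) / np.trapz(f_hat, E))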

Lessons about likelihood functions from nuclear physics

2007

Least-squares data analysis is based on the assumption that the normal (Gaussian) distribution appropriately characterizes the likelihood, that is, the conditional probability of each measurement d, given a measured quantity y, p(d|y). On the other hand, there is ample evidence in nuclear physics of significant disagreements among measurements, which are inconsistent with the normal distribution, given their stated uncertainties. In this study the histories of 99 measurements of the lifetimes of five elementary particles are examined to determine what can be inferred about the distribution of their values relative to their stated uncertainties. Taken as a whole, the variations in the data are somewhat larger than their quoted uncertainties would indicate. These data strongly support using a Student t distribution for the likelihood function instead of a normal. The most probable value for the order of the t distribution is 2.6 ± 0.9. It is shown that analyses based on long-tailed t-...
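A small numerical illustration of the practical difference (with invented measurements, quoted uncertainties and an outlier; only the likelihood choice follows the paper's recommendation of a t distribution with order near 2.6):

import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# hypothetical set of discrepant measurements of one quantity, each with a
# quoted 1-sigma uncertainty that somewhat understates the real scatter
true_value = 10.0
sigma_quoted = np.full(12, 0.5)
values = true_value + 1.8 * sigma_quoted * rng.standard_normal(12)
values[0] += 4.0                                     # one gross outlier

grid = np.linspace(8.0, 12.0, 2001)                  # candidate values of the measurand

# Gaussian likelihood: the usual least-squares assumption
logL_gauss = np.array([
    stats.norm.logpdf(values, loc=mu, scale=sigma_quoted).sum() for mu in grid
])

# Student t likelihood with nu = 2.6, the order favoured by the paper's analysis
nu = 2.6
logL_t = np.array([
    stats.t.logpdf((values - mu) / sigma_quoted, df=nu).sum() for mu in grid
])

for name, logL in [("Gaussian", logL_gauss), ("Student t (nu=2.6)", logL_t)]:
    post = np.exp(logL - logL.max())
    post /= np.trapz(post, grid)
    print(f"{name:20s} posterior mean = {np.trapz(grid * post, grid):.3f}")

The Gaussian posterior mean is dragged noticeably toward the outlier, while the long-tailed t likelihood effectively discounts it.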

Regularization, maximum entropy and probabilistic methods in mass spectrometry data processing problems

2002

This paper is a synthetic overview of regularization, maximum entropy and probabilistic methods for inverse problems such as the deconvolution and Fourier synthesis problems that arise in mass spectrometry. First we present a unified description of such problems and discuss why simple naïve methods cannot give satisfactory results. Then we briefly present the main classical deterministic regularization methods, maximum entropy-based methods and the probabilistic Bayesian estimation framework for such problems. The main idea is to show how all these different frameworks converge to the optimization of a compound criterion with a data-fit part and an a priori part. We will see, however, that the Bayesian inference framework naturally provides more tools for inferring the uncertainty of the computed solutions, for estimating the hyperparameters, and for handling myopic or blind inversion problems. Finally, based on Bayesian inference, we present a few advanced methods particularly designed for some mass spectrometry data processing problems. Simulation results mainly illustrate the effect of the prior laws, or equivalently the regularization functionals, on the results one can obtain in typical deconvolution or Fourier synthesis problems arising in different mass spectrometry techniques.
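A minimal sketch of such a compound criterion in the quadratic case, with a hypothetical blur kernel, peak positions and noise, and a simple first-difference roughness penalty as the a priori part (many of the methods reviewed in the paper use entropic or other non-quadratic priors instead):

import numpy as np

rng = np.random.default_rng(4)

# hypothetical mass-spectrometry-style deconvolution: sharp peaks blurred by a
# known instrument function H, recovered by minimising the compound criterion
#   J(x) = ||y - H x||^2 + lam * ||D x||^2   (data-fit part + a priori part)
n = 300
x_true = np.zeros(n)
x_true[[60, 140, 150, 220]] = [1.0, 0.7, 0.5, 0.3]

t = np.arange(n)
kernel = np.exp(-0.5 * ((t - n // 2) / 4.0) ** 2)
kernel /= kernel.sum()
H = np.array([np.roll(kernel, i - n // 2) for i in range(n)]).T

y = H @ x_true + 0.01 * rng.standard_normal(n)

D = np.diff(np.eye(n), 1, axis=0)                    # first-difference roughness operator
lam = 0.05                                           # regularisation weight (hand-tuned)
x_hat = np.linalg.solve(H.T @ H + lam * D.T @ D, H.T @ y)

# the a priori part is what makes this otherwise ill-conditioned inversion stable;
# lam trades resolution against noise amplification
print("residual norm        :", np.linalg.norm(y - H @ x_hat))
print("largest peak channels:", np.sort(np.argsort(x_hat)[-4:]))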

Application of the Approximate Bayesian Computation Algorithm to Gamma-Ray Spectroscopy

Algorithms

Radioisotope identification (RIID) algorithms for gamma-ray spectroscopy aim to infer what isotopes are present and in what amounts in test items. RIID algorithms either use all energy channels in the analysis region or only energy channels in and near identified peaks. Because many RIID algorithms rely on locating peaks and estimating each peak’s net area, peak location and peak area estimation algorithms continue to be developed for gamma-ray spectroscopy. This paper shows that approximate Bayesian computation (ABC) can be effective for peak location and area estimation. Algorithms to locate peaks can be applied to raw or smoothed data, and among several smoothing options, the iterative bias reduction algorithm (IBR) is recommended; the use of IBR with ABC is shown to potentially reduce uncertainty in peak location estimation. Extracted peak locations and areas can then be used as summary statistics in a new ABC-based RIID. ABC allows for easy experimentation with candidate summar...
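The following is a bare-bones sketch of ABC rejection sampling for the location and area of a single Gaussian peak on a flat background; the summary statistics (region total and centroid), tolerances, priors and peak model are all hypothetical choices for illustration, not the pipeline described in the paper:

import numpy as np

rng = np.random.default_rng(5)

# hypothetical single-peak gamma-ray region: Gaussian peak on a flat background
channels = np.arange(100)
def spectrum(loc, area, bg, sigma=3.0):
    peak = area * np.exp(-0.5 * ((channels - loc) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return peak + bg

observed = rng.poisson(spectrum(loc=47.3, area=800.0, bg=20.0))

# summary statistics: total counts and count-weighted centroid of the region
def summary(counts):
    total = counts.sum()
    return np.array([total, (channels * counts).sum() / total])

s_obs = summary(observed)

# ABC rejection sampling: draw (loc, area) from the prior, simulate a spectrum
# (background treated as known for simplicity), keep draws whose summaries land
# within the tolerances of the observed ones
n_draws, eps = 50_000, np.array([60.0, 0.3])         # tolerances (hypothetical)
loc_prior = rng.uniform(30.0, 70.0, n_draws)
area_prior = rng.uniform(100.0, 2000.0, n_draws)

kept_loc, kept_area = [], []
for loc, area in zip(loc_prior, area_prior):
    sim = rng.poisson(spectrum(loc, area, bg=20.0))
    if np.all(np.abs(summary(sim) - s_obs) < eps):
        kept_loc.append(loc)
        kept_area.append(area)

print("accepted draws      :", len(kept_loc))
print("posterior peak loc  :", np.mean(kept_loc), "+/-", np.std(kept_loc))
print("posterior peak area :", np.mean(kept_area), "+/-", np.std(kept_area))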

Bayesian Multiscale Deconvolution Applied to Gamma-ray Spectroscopy

2003

A common task in gamma-ray astronomy is to extract spectral information, such as model constraints and incident photon spectrum estimates, given the measured energy deposited in a detector and the detector response. This is the classic problem of spectral “deconvolution” or spectral inversion [2]. The methods of forward folding (i.e. parameter fitting) and maximum entropy “deconvolution” (i.e. estimating independent input photon rates for each individual energy bin) have been used successfully for gamma-ray solar flares (e.g. [5]). Nowak and Kolaczyk [4] have developed a fast, robust technique using a Bayesian multiscale framework that addresses many of these problems and offers additional algorithmic advantages. We briefly mention this new approach and demonstrate its use with time-resolved solar flare gamma-ray spectroscopy.
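For concreteness, a stripped-down forward-folding example is sketched below: a hypothetical power-law photon spectrum is folded through an equally hypothetical detector response matrix and the model parameters are recovered by maximizing the Poisson likelihood of the observed counts; this illustrates the "parameter fitting" route mentioned above, not the Bayesian multiscale method of Nowak and Kolaczyk:

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)

# hypothetical setup: a power-law photon spectrum folded through a detector
# response matrix (DRM) that redistributes photon energy into count channels
n_ph, n_ch = 80, 80
E_ph = np.logspace(2, 4, n_ph)                       # photon energies (keV)

# crude DRM: photopeak plus a flat low-energy tail (purely illustrative)
R = np.zeros((n_ch, n_ph))
for j in range(n_ph):
    R[j, j] = 0.7                                    # photopeak efficiency
    R[:j, j] = 0.3 / max(j, 1)                       # tail spread over lower channels

def photon_model(params):
    log_amp, index = params
    return np.exp(log_amp) * (E_ph / 1000.0) ** (-index)

exposure = 50.0
counts = rng.poisson(exposure * R @ photon_model([np.log(20.0), 2.3]))

# forward folding = fit the model parameters to the counts through the response,
# here by maximising the Poisson likelihood
def neg_log_like(params):
    mu = exposure * R @ photon_model(params) + 1e-12
    return np.sum(mu - counts * np.log(mu))

fit = minimize(neg_log_like, x0=[np.log(10.0), 2.0], method="Nelder-Mead")
print("fitted amplitude, index:", np.exp(fit.x[0]), fit.x[1])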