Benchmarking Bayesian methods for spectroscopy
Abstract
<p class="p1"><span class="s1"><strong>Introduction:</strong></span><span class="s2"> Spectroscopic data are a rich and powerful means of studying surfaces. Moreover, planetary surfaces are often made of intimately mixed components [1, 2], so modelling a reflectance spectrum means fitting a relatively large number of inter-related parameters: at least one proportion and one grain size for each component in the case of a granular mixture, plus structural parameters (roughness, porosity). In such large parameter spaces, multiple solutions may well give an equally satisfactory fit [3]. When setting up an inversion strategy, it is therefore crucial to evaluate the ability of a method to retrieve multiple solutions. </span></p> <p class="p2">&#160;</p> <p class="p1"><span class="s1"><strong>Data:</strong></span><span class="s2"> This work was done in the context of revisiting the Galileo NIMS dataset of Europa. We compared the solutions of different methods for a radiative transfer model based on Hapke modelling [1], applied to an observation of a bright region of Europa from NIMS cube 14e006ci [4] with 4 components: crystalline ice, hexahydrite, magnetite and sulfuric acid octahydrate. The uncertainty on the data is assumed to be Gaussian, with a standard deviation of 10% of the reflectance and a minimum of 0.01 in reflectance.</span></p> <p class="p1"><span class="s1"><strong>Method:</strong></span><span class="s2"> The parameter space is of dimension 9, with 8 independent parameters: 4 abundances, 4 grain sizes and the surface roughness. We noted early in our work that two minimisation algorithms, while reproducing the data equally well, would give different parameter values, meaning the solution is not unique. 
To explore the set of possible solutions, four methods were compared: (i) a home-made Markov Chain Monte-Carlo (MCMC) method with a Metropolis-Hastings sampler [5]; (ii) a home-made MCMC method with an improved Metropolis-Hastings sampler [this work]; (iii) an open-source multi-chain MCMC algorithm with a "snooker" sampler [6]; (iv) multiple minimizations using the "L-BFGS-B" bound-constrained algorithm with random initialisations [7]. </span></p> <p class="p1"><span class="s2">To compare the efficiency of the different methods, we limited the number of direct model evaluations to 1.5&#215;10</span><span class="s3"><sup>6</sup></span><span class="s2">. This number was chosen for 2 reasons: (i) each algorithm tested seemed to have reached "convergence", in that increasing the number of iterations did not significantly change the result, and (ii) it gave an identical and affordable computational cost across the methods. For the MCMC methods, this means we set the number of iterations to 1.5&#215;10</span><span class="s3"><sup>6</sup></span><span class="s2">. For the multiple-minimizations method, each minimization required approximately 1500 model evaluations to reach a result, so we performed 1000 minimisations with 1000 different random initialisations. For each model evaluation, a likelihood is computed, making the comparison between this method and the Bayesian ones possible. </span></p> <p class="p2"><img src="" alt="" /></p> <p class="p1"><em><span class="s1">Figure 1: Corner plot and best fit for the home-made MCMC method with Metropolis-Hastings sampler, with a converged chain of 1.5&#215;10</span><span class="s2"><sup>6</sup></span><span class="s1"> iterations. The sampling is done with 3 cases: agnostic uniform distribution over the full prior space, in a far neighbourhood, and in a close neighbourhood. The corner plot shows the pairwise posterior distributions and the marginal posterior distribution for each parameter. 
Parameters are, from left to right: roughness, 4 abundances, 4 grain sizes; and from top to bottom: 4 abundances and 4 grain sizes. The acceptance rate is 0.217.</span></em></p> <p class="p1">&#160;</p> <p class="p1"><span class="s1"><strong><img src="" alt="" /></strong></span></p> <p class="p1"><em><span class="s1">Figure 2: Same as Fig. 1, but here the sampling is done with 4 cases: agnostic uniform distribution over the full prior space, in a far neighbourhood, in a close neighbourhood, and modification of a single parameter. The acceptance rate is 0.288. </span></em></p> <p class="p1"><em><span class="s1"><img src="" alt="" /></span></em></p> <p class="p1"><em><span class="s1">Figure 3: Same as Fig. 1 but for 1000…
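The home-made samplers themselves are not reproduced in this abstract. As an illustration only, a minimal random-walk Metropolis-Hastings sampler over a bounded (box) prior, using the Gaussian noise model stated above (10% relative standard deviation, floored at 0.01 in reflectance), could look like the following sketch; the `model` argument is a hypothetical stand-in for the Hapke forward model, and the step size and proposal shape are illustrative assumptions, not the authors' settings.

```python
import numpy as np

def log_likelihood(params, model, data):
    # Gaussian log-likelihood with 10% relative sigma, floored at 0.01 (as in the abstract)
    pred = model(params)
    sigma = np.maximum(0.1 * np.abs(data), 0.01)
    return -0.5 * np.sum(((data - pred) / sigma) ** 2 + np.log(2 * np.pi * sigma ** 2))

def metropolis_hastings(model, data, bounds, n_iter=10_000, step=0.05, seed=0):
    """Random-walk Metropolis-Hastings over a uniform box prior (illustrative sketch).

    bounds: array of shape (n_params, 2) with (low, high) per parameter.
    Returns the chain (n_iter + 1 samples) and the acceptance rate.
    """
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    x = rng.uniform(lo, hi)                      # random initialisation inside the prior box
    logp = log_likelihood(x, model, data)
    chain, accepted = [x], 0
    for _ in range(n_iter):
        # Gaussian proposal scaled to each parameter's prior range
        prop = x + step * (hi - lo) * rng.standard_normal(x.size)
        if np.all((prop >= lo) & (prop <= hi)):  # uniform prior: reject out-of-bounds moves
            logp_prop = log_likelihood(prop, model, data)
            if np.log(rng.uniform()) < logp_prop - logp:
                x, logp = prop, logp_prop
                accepted += 1
        chain.append(x)
    return np.array(chain), accepted / n_iter
```

The acceptance rates quoted in the figure captions (0.217 and 0.288) are diagnostics of this kind of sampler: they count how often a proposed move is accepted over the whole chain.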
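Strategy (iv), multiple minimizations with random initialisations, can likewise be sketched with SciPy's L-BFGS-B implementation. Keeping the misfit of every run makes it possible to compare against the Bayesian methods and to spot distinct solutions with similar likelihood, i.e. the non-uniqueness discussed above. Again, the `model` argument is a hypothetical stand-in for the real forward model, and the number of starts is illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(params, model, data):
    # Negative Gaussian log-likelihood (10% relative sigma, 0.01 floor), to be minimised
    pred = model(params)
    sigma = np.maximum(0.1 * np.abs(data), 0.01)
    return 0.5 * np.sum(((data - pred) / sigma) ** 2)

def multi_start_lbfgsb(model, data, bounds, n_starts=100, seed=0):
    """Run L-BFGS-B from random starting points inside the bounds.

    bounds: sequence of (low, high) pairs, one per parameter.
    Returns all (solution, misfit) pairs sorted by misfit; clusters of
    distinct solutions with similar misfit reveal non-unique inversions.
    """
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds).T
    solutions = []
    for _ in range(n_starts):
        x0 = rng.uniform(lo, hi)                 # random initialisation
        res = minimize(neg_log_likelihood, x0, args=(model, data),
                       method="L-BFGS-B", bounds=bounds)
        solutions.append((res.x, res.fun))
    return sorted(solutions, key=lambda s: s[1])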