Adjustment of nuclear data libraries using integral benchmarks

Nuclear data uncertainty propagation to integral responses using SANDY

Annals of Nuclear Energy, 2017

SANDY is a nuclear data sampling code compatible with nuclear data files in ENDF-6 format. Exploiting the basic theory of stochastic sampling, SANDY generates random nuclear data samples that reproduce the covariance information stored in the ENDF-6 files. The random data are written to perturbed ENDF-6 or PENDF files, which can serve as inputs to nuclear codes to produce perturbed responses. Statistical analysis of the sampled responses then yields the sample mean and variance. SANDY can be used not only to study the response sample variance, but also to estimate global sensitivity indices for correlated and uncorrelated input parameters using a variance-based decomposition method. SANDY was tested on the generation of random ENDF-6 files with perturbed cross sections and resonance parameters. These files were used for the uncertainty quantification of integral responses, such as the one-group integrated cross section and the resonance integral of several isotopes. The response variance was then apportioned among the inputs, and the parameters with the largest impact were identified.
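The core idea behind this kind of covariance-reproducing sampling can be sketched in a few lines. The snippet below is a minimal illustration, not SANDY itself: the 3-group cross section, its covariance matrix, and the flux weights are all made-up placeholder numbers.

```python
import numpy as np

# Hypothetical 3-group cross section (barns) and covariance matrix;
# the values are illustrative, not taken from any evaluation.
mean = np.array([1.20, 0.85, 0.40])
cov = np.array([[4e-4, 1e-4, 0.0],
                [1e-4, 9e-4, 2e-4],
                [0.0,  2e-4, 1e-4]])

rng = np.random.default_rng(42)
samples = rng.multivariate_normal(mean, cov, size=10000)

# Toy "integral response": a flux-weighted one-group collapse.
flux = np.array([0.5, 0.3, 0.2])
response = samples @ flux

# Sample statistics of the perturbed responses.
print(response.mean(), response.std(ddof=1))
```

The sample covariance of `samples` converges to `cov` as the sample size grows, which is the property exploited when the perturbed files are written out; each row of `samples` corresponds to one perturbed ENDF-6/PENDF file.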

Stochastic vs. Sensitivity-Based Integral Parameter and Nuclear Data Adjustments

EPJ Plus, 2018

Developments in data assimilation theory make it possible to adjust integral parameters and cross sections with stochastic sampling. This work investigates how two stochastic methods, MOCABA and BMC, perform relative to the sensitivity-based Generalized Linear Least Squares (GLLS) methodology. Stochastic data assimilation can treat integral parameters that behave non-linearly with respect to nuclear data perturbations, an advantage over GLLS. In addition, BMC is compatible with integral parameters and nuclear data that have non-Gaussian distributions. MOCABA and BMC are compared to GLLS on a simple test case: JEZEBEL-Pu239 simulated with Serpent2. The three methods show good agreement between the mean values and uncertainties of their posterior calculated values and nuclear data; the observed discrepancies are not statistically significant at a sample size of 10,000. At equivalent sample sizes, the BMC posterior calculated values and nuclear data have larger uncertainties than MOCABA's.
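To make the comparison concrete, the sketch below contrasts a GLLS update with a BMC-style importance weighting on a toy one-experiment, two-parameter problem. Every number (prior, sensitivities, measured and calculated values) is an illustrative placeholder, not JEZEBEL data, and the linear surrogate for k-eff is an assumption made so the two methods are exactly comparable.

```python
import numpy as np

x = np.array([1.0, 1.0])             # prior nuclear data (normalized)
M = np.diag([0.04, 0.01])            # prior covariance
S = np.array([[0.8, 0.3]])           # sensitivity of k-eff to the data
E, C = np.array([1.002]), np.array([0.990])  # measured / calculated k-eff
V = np.array([[1e-4]])               # experimental covariance

# GLLS: Kalman-like gain, then posterior mean and covariance.
K = M @ S.T @ np.linalg.inv(S @ M @ S.T + V)
x_post = x + (K @ (E - C)).ravel()
M_post = M - K @ S @ M

# BMC alternative: weight prior samples by exp(-chi^2 / 2).
samples = np.random.default_rng(0).multivariate_normal(x, M, 5000)
k_calc = C + (samples - x) @ S.ravel()         # linear surrogate model
chi2 = (E - k_calc) ** 2 / V[0, 0]
w = np.exp(-0.5 * chi2)
w /= w.sum()
x_bmc = w @ samples                            # weighted posterior mean

print(x_post, x_bmc, np.diag(M_post))
```

In this linear, Gaussian setting the BMC weighted mean converges to the GLLS posterior; the stochastic methods only pay off when the response is non-linear or the distributions are non-Gaussian, which is exactly the regime the paper discusses.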

Influence of nuclear data parameters on integral experiment assimilation using Cook’s distance

EPJ Web of Conferences, 2019

Nuclear data used in the design of nuclear applications (e.g., reactor core design) are improved by means of integral experiments. To exploit past critical experimental data in reactor design work, a typical nuclear data adjustment procedure is based on Bayesian theory (a least-squares technique or Monte Carlo). In this approach, the nuclear data parameters are optimized by incorporating the experimental information through Bayesian inference. The selection of integral experiments is based on the availability of well-documented specifications and experimental data. Data points with large uncertainties or large residuals (outliers) may affect the accuracy of the adjustment. Hence, in the adjustment process it is important to study the influence of the experiments, as well as of the prior nuclear data, on the adjusted results. In this work, the influence of each individual reaction (related to nuclear data) is analyzed using the concept of Cook's distance. Fir...
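Cook's distance measures how much a fit changes when one data point is removed, which is why it flags influential experiments or outliers in an adjustment. The sketch below computes it for a toy weighted least-squares problem; the design matrix stands in for a sensitivity matrix, and all numbers (including the planted outlier) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 3))                 # toy sensitivity matrix
beta_true = np.array([0.5, -0.2, 0.1])
y = X @ beta_true + rng.normal(scale=0.05, size=20)
y[4] += 1.0                                  # plant one outlier

H = X @ np.linalg.inv(X.T @ X) @ X.T         # hat (leverage) matrix
h = np.diag(H)                               # leverage of each point
resid = y - H @ y                            # residuals of the fit
p = X.shape[1]
s2 = resid @ resid / (len(y) - p)            # residual variance estimate

# Cook's distance: large residual AND large leverage => influential.
cook = resid**2 / (p * s2) * h / (1 - h)**2

print(np.argmax(cook))                       # flags the planted outlier
```

In the adjustment context, a reaction or experiment with a large Cook's distance is one whose removal would noticeably shift the posterior, so it deserves scrutiny before being trusted in the assimilation.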

Re-estimation of Nuclear Data and JEFF-3.1.1 Uncertainty Calculations

2012

This paper describes a method to define relevant targeted integral measurements that allow the improvement of nuclear data evaluations and the determination of corresponding reliable covariances. 235U and 56Fe examples are presented for the improvement of JEFF-3 data. Uses of these covariances are shown for sensitivity and representativity studies, uncertainty calculations, and the transposition of experimental results to industrial applications.
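The representativity factor underlying these studies is the covariance-weighted correlation between the sensitivity vector of an experiment E and that of a target application A. The sketch below evaluates it for made-up sensitivities and a made-up covariance matrix, purely to show the mechanics.

```python
import numpy as np

# Illustrative placeholders: nuclear data covariance M and the
# sensitivity vectors of an experiment (S_E) and an application (S_A).
M = np.diag([0.04, 0.01, 0.02])
S_E = np.array([0.9, 0.2, 0.1])
S_A = np.array([0.7, 0.5, 0.2])

# r = S_E^T M S_A / sqrt((S_E^T M S_E)(S_A^T M S_A))
r = (S_E @ M @ S_A) / np.sqrt((S_E @ M @ S_E) * (S_A @ M @ S_A))

# For a perfectly accurate experiment, assimilating it reduces the
# application's nuclear-data variance by the factor (1 - r^2).
var_A = S_A @ M @ S_A
var_A_post = var_A * (1 - r**2)
print(r, var_A, var_A_post)
```

A representativity close to 1 means the experiment probes the same nuclear data, in the same proportions, as the application, so transposing its result yields a large uncertainty reduction.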

Analysis of Integral Benchmark Experiments with Latest Evaluated Nuclear Data

Journal of Nuclear Science and Technology, 2002

A detailed analysis of integral benchmark experiments using the latest evaluated nuclear data files (FENDL-2.0, ENDF/B-VI, JEF-2.2, JENDL-3.2, BROND-2.1 and CENDL-2.0) is performed to calculate the tritium breeding ratio in Li, Pb-Li, Pb-Li-C, Be-Li and Be-Li-C spheres using a Monte Carlo technique. The tritium breeding mechanism in each of these spherical shell configurations is analysed and explained by comparing the tritium breeding reaction cross sections of 7Li and 6Li across the evaluated nuclear data files.

Methodology investigations on uncertainties propagation in nuclear data evaluation

Nuclear Engineering and Design, 2012

In this paper, we discuss distinct ways of calculating nuclear cross-section variance-covariance matrices derived from the spherical optical model, namely the deterministic sensitivity method and the Monte Carlo method. We also discuss the selection of uncertain parameters using both local and global sensitivity methods (the latter based on spectral methods with a polynomial decomposition) to determine the importance of the input parameters with respect to the final uncertainty. Finally, we evaluate the differences between the uncertainties generated by the least-squares and Bayesian methods, both of which require experimental data. Discussing the simulation results for the cross-section calculations of the major neutron reactions on 89Y, we focus in particular on the mathematical rigour and quality of the obtained uncertainties.
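The two propagation routes the paper compares can be sketched side by side: a deterministic "sandwich rule" C = J Cp J^T built from a finite-difference Jacobian, versus direct Monte Carlo sampling of the parameters. The toy two-parameter model below merely stands in for an optical-model cross-section calculation; it and the parameter covariance are assumptions for illustration.

```python
import numpy as np

def model(p):
    # Illustrative nonlinear 2-parameter, 2-output model; NOT a real
    # optical-model calculation.
    return np.array([p[0] * np.exp(-p[1]), p[0] / (1.0 + p[1])])

p0 = np.array([2.0, 0.5])                      # nominal parameters
Cp = np.array([[0.01, 0.002],
               [0.002, 0.0025]])               # parameter covariance

# Deterministic route: finite-difference Jacobian, then C = J Cp J^T.
eps = 1e-6
J = np.column_stack([(model(p0 + eps * np.eye(2)[i]) - model(p0)) / eps
                     for i in range(2)])
C_det = J @ Cp @ J.T

# Monte Carlo route: sample parameters, evaluate, take sample covariance.
rng = np.random.default_rng(0)
ps = rng.multivariate_normal(p0, Cp, size=20000)
C_mc = np.cov(np.array([model(p) for p in ps]).T)

print(np.sqrt(np.diag(C_det)), np.sqrt(np.diag(C_mc)))
```

For mildly nonlinear models and small parameter uncertainties the two routes agree closely; the Monte Carlo estimate starts to differ where linearization breaks down, which is one of the methodological questions the paper examines.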

International Integral Experiments Databases in Support of Nuclear Data and Code Validation

Journal of Nuclear Science and Technology, 2002

The OECD/NEA Nuclear Science Committee (NSC) has identified the need to establish international databases containing all the important experiments that are available for sharing among specialists, and has set up or sponsored specific activities to achieve this. The aim is to preserve the experiments in an agreed standard format in computer-accessible form; to use them for international activities involving validation of current and new calculational schemes, including computer codes and nuclear data libraries; to assess uncertainties, confidence bounds and safety margins; and to record measurement methods and techniques. The databases established or in preparation for nuclear data validation cover the following areas:

- SINBAD: a radiation shielding experiments database encompassing reactor shielding, fusion blanket neutronics, and accelerator shielding.
- ICSBEP: the International Criticality Safety Benchmark Evaluation Project Handbook, with more than 2500 critical configurations covering different combinations of materials and spectral indices.
- IRPhEP: the International Reactor Physics Experiment Evaluation Project.

Each project is described below, including results achieved, work in progress, and planned activities.