Nuclear data assimilation, scientific basis and current status

Influence of nuclear data parameters on integral experiment assimilation using Cook’s distance

EPJ Web of Conferences, 2019

Nuclear data used in the design of various nuclear applications (e.g., reactor core design) are improved using integral experiments. To apply past critical-experiment data to reactor design work, a typical nuclear data adjustment procedure is based on Bayesian theory (least-squares or Monte Carlo techniques). In this method, the nuclear data parameters are optimized by including the experimental information through Bayesian inference. The selection of integral experiments is based on the availability of well-documented specifications and experimental data. Data points with large uncertainties or large residuals (outliers) may affect the accuracy of the adjustment. Hence, in the adjustment process, it is very important to study the influence of the experiments, as well as of the prior nuclear data, on the adjusted results. In this work, the influence of each individual reaction (related to nuclear data) is analyzed using the concept of Cook’s distance. Fir...
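The abstract does not reproduce the formula, but for orientation the sketch below shows the textbook Cook's distance for an ordinary least-squares fit, D_i = (e_i^2 / (p s^2)) * h_ii / (1 - h_ii)^2; the paper applies an analogous influence measure per reaction. All names and data here are illustrative, not taken from the paper's tooling.

```python
# Illustrative sketch (not the paper's code): classical Cook's distance for an
# ordinary least-squares fit y ~ X b, gauging how strongly each data point
# pulls the adjusted parameters.
import numpy as np

def cooks_distance(X, y):
    """Return Cook's distance D_i for each observation of an OLS fit."""
    n, p = X.shape
    H = X @ np.linalg.inv(X.T @ X) @ X.T          # hat (projection) matrix
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # fitted parameters
    resid = y - X @ beta                          # residuals e_i
    h = np.diag(H)                                # leverages h_ii
    s2 = resid @ resid / (n - p)                  # residual variance estimate
    return resid**2 / (p * s2) * h / (1.0 - h)**2

# Toy usage: one deliberately outlying point dominates the distances.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(20), rng.normal(size=20)])
y = X @ np.array([1.0, 2.0]) + 0.1 * rng.normal(size=20)
y[5] += 3.0                                       # inject an outlier
print(cooks_distance(X, y).round(3))
```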

CONRAD – a code for nuclear data modeling and evaluation

EPJ Nuclear Sciences & Technologies, 2021

The CONRAD code is an object-oriented software tool developed at CEA since 2005. It aims at providing nuclear reaction model calculations, data assimilation procedures based on Bayesian inference, and a proper framework to treat all uncertainties involved in the nuclear data evaluation process: experimental uncertainties (statistical and systematic) as well as model parameter uncertainties. This paper presents the status of CONRAD-V1 developments concerning the theoretical and evaluation aspects. Each development is illustrated with examples, and the calculations were validated by comparison with existing codes (SAMMY, REFIT, ECIS, TALYS) or with experiment. At the end of the paper, a general perspective for CONRAD (concerning the evaluation and theoretical modules) and current developments are presented.
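As a rough illustration of the kind of Bayesian assimilation step such a code performs, here is a minimal linearized (generalized least-squares) parameter update combining prior parameter covariances with experimental covariances; the function name, arguments, and shapes are assumptions for this sketch and do not reflect CONRAD's actual interfaces.

```python
# Minimal sketch of a linearized Bayesian (GLS) update of model parameters.
# Names and shapes are illustrative only.
import numpy as np

def gls_update(x_prior, M_prior, S, y_exp, y_calc, V_exp):
    """
    x_prior : prior parameter values (n,)
    M_prior : prior parameter covariance (n, n)
    S       : sensitivities dy/dx evaluated at x_prior (m, n)
    y_exp   : measured values (m,)
    y_calc  : values computed with x_prior (m,)
    V_exp   : experimental covariance, statistical + systematic (m, m)
    """
    K = S @ M_prior @ S.T + V_exp                 # innovation covariance
    G = M_prior @ S.T @ np.linalg.inv(K)          # gain matrix
    x_post = x_prior + G @ (y_exp - y_calc)       # updated parameters
    M_post = M_prior - G @ S @ M_prior            # updated covariance
    return x_post, M_post
```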

Basic Data, Computer Codes and Integral Experiments: The Tools for Modelling in Nuclear Technology

When studying applications in nuclear technology we need to understand and be able to predict the behaviour of systems manufactured by human enterprise. First, the underlying basic physical and chemical phenomena need to be understood. We then have to predict the results arising from the interplay of a large number of different basic events, i.e. the macroscopic effects. To build confidence in our modelling capability, we then need to compare these results against measurements carried out on such systems. The different levels of modelling require the solution of different types of equations using different types of parameters. The tools required for carrying out a complete validated analysis are:

Nuclear data adjustment using Bayesian inference, diagnostics for model fit and influence of model parameters

EPJ Web of Conferences, 2020

The mathematical models used for nuclear data evaluations contain a large number of theoretical parameters that are usually uncertain. These parameters can be calibrated (or improved) using information collected from integral and differential experiments. Bayesian inference is used to incorporate the measurements in the data assimilation. The Bayesian approximation is based on least-squares or Monte Carlo approaches, and in this process the model parameters are optimized. In the adjustment process, it is essential to analyze the influence of the model parameters on the adjusted data. In this work, statistical indicators such as Cook’s distance; the Akaike, Bayesian and deviance information criteria; and the effective degrees of freedom are developed within the CONRAD platform. These indicators are then applied to a 155Gd test case to evaluate and compare the influence of the resonance parameters.
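For orientation, the following sketch gives the standard Gaussian-likelihood forms of two of the criteria mentioned (AIC and BIC), written in terms of the fit chi-square; this is the textbook definition, not the CONRAD implementation, and the deviance information criterion and effective degrees of freedom would additionally require posterior samples.

```python
# Hedged sketch of common model-fit diagnostics for a Gaussian-likelihood fit,
# where -2 ln L equals chi^2 up to a data-dependent constant.
import numpy as np

def chi2(y_exp, y_calc, sigma):
    """Chi-square of calculated values against measurements with 1-sigma errors."""
    return np.sum(((y_exp - y_calc) / sigma) ** 2)

def aic(chi2_val, k):
    """Akaike information criterion (up to a constant): chi^2 + 2k."""
    return chi2_val + 2 * k

def bic(chi2_val, k, n):
    """Bayesian information criterion (up to the same constant): chi^2 + k ln n."""
    return chi2_val + k * np.log(n)

# When comparing two resonance-parameter sets, the lower criterion is preferred;
# BIC penalises extra parameters more strongly than AIC.
```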

Mapper – a Novel Capability to Support Nuclear Model Validation and Mapping of Biases and Uncertainties

EPJ Web of Conferences, 2021

This paper overviews the initial results of a new project at the Oak Ridge National Laboratory, supported via an internal seed funding program, to develop a novel computational capability for model validation: MAPPER. MAPPER will eliminate the need for empirical criteria such as the similarity indices often employed to identify applicable experiments for given application conditions. To achieve this, MAPPER uses an information-theoretic approach based on the Kullback-Leibler (KL) divergence principle to combine responses of available or planned experiments with application responses of interest. This is accomplished with a training set of samples generated using randomized experiment execution and application of high-fidelity analysis models. These samples are condensed using reduced order modeling techniques in the form of a joint probability distribution function (PDF) connecting each application response of interest with a new effective experimental response. MAPPER’s initial obj...
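As background for the KL-divergence step, the snippet below evaluates the closed-form Kullback-Leibler divergence between two multivariate normal distributions; MAPPER's actual construction of the joint experiment/application PDF from reduced-order samples is more elaborate, and everything here is purely illustrative.

```python
# Closed-form KL divergence D_KL( N(mu0, cov0) || N(mu1, cov1) ) between two
# multivariate Gaussians, the basic information-theoretic quantity referenced.
import numpy as np

def kl_gaussian(mu0, cov0, mu1, cov1):
    """Return the KL divergence between two multivariate normal distributions."""
    k = mu0.size
    inv1 = np.linalg.inv(cov1)
    diff = mu1 - mu0
    term_trace = np.trace(inv1 @ cov0)                     # tr(cov1^-1 cov0)
    term_mahal = diff @ inv1 @ diff                        # Mahalanobis term
    term_logdet = np.log(np.linalg.det(cov1) / np.linalg.det(cov0))
    return 0.5 * (term_trace + term_mahal - k + term_logdet)
```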

Stochastic vs. Sensitivity-Based Integral Parameter and Nuclear Data Adjustments

EPJ Plus, 2018

Developments in data assimilation theory allow integral parameters and cross sections to be adjusted with stochastic sampling. This work investigates how two stochastic methods, MOCABA and BMC, perform relative to a sensitivity-based methodology called GLLS. Stochastic data assimilation can treat integral parameters that behave non-linearly with respect to nuclear data perturbations, which would be an advantage over GLLS. Additionally, BMC is compatible with integral parameters and nuclear data that have non-Gaussian distributions. In this work, MOCABA and BMC are compared to GLLS for a simple test case: JEZEBEL-Pu239 simulated with Serpent2. The three methods show good agreement between the mean values and uncertainties of their posterior calculated values and nuclear data. The observed discrepancies are not statistically significant with a sample size of 10,000. BMC posterior calculated values and nuclear data have larger uncertainties than MOCABA's at equivalent sample sizes.
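A minimal sketch of the BMC weighting idea referenced above, assuming a Gaussian experimental likelihood: each sampled nuclear data set is weighted by exp(-chi^2/2) against the measurements, and the posterior moments are the weighted sample moments. Names and shapes are illustrative, not taken from MOCABA, BMC, or Serpent2.

```python
# Illustrative Bayesian Monte Carlo (BMC) weighting of prior samples.
import numpy as np

def bmc_posterior(samples, y_calc, y_exp, cov_exp):
    """
    samples : sampled nuclear data or integral parameters, shape (N, n)
    y_calc  : calculated responses per sample, shape (N, m)
    y_exp   : measured responses, shape (m,)
    cov_exp : experimental covariance, shape (m, m)
    """
    inv_cov = np.linalg.inv(cov_exp)
    d = y_calc - y_exp
    chi2 = np.einsum('ij,jk,ik->i', d, inv_cov, d)      # chi^2 per sample
    w = np.exp(-0.5 * (chi2 - chi2.min()))              # unnormalised weights
    w /= w.sum()
    mean = w @ samples                                  # posterior mean
    centered = samples - mean
    cov = centered.T @ (centered * w[:, None])          # posterior covariance
    return mean, cov
```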

NUCLEAR DATA STATISTICAL TREATMENT

Several methods have been developed to evaluate the best fit for nuclear data parameters. These methods rely on a sequence of logical steps that should be followed to obtain accurate and reliable results, subdivided into: (1) the physical model used, (2) the data types, (3) the statistical methods, and (4) the problems encountered. This paper discusses the statistical methods used to evaluate the best fit for the nuclear data.
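As one concrete (and purely illustrative) instance of such a statistical method, the example below performs a weighted least-squares (chi-square minimization) fit of a toy resonance-like model to data with uncertainties; the model, data, and parameter names are invented for this sketch and are not from the paper.

```python
# Weighted least-squares fit of a toy single-level (Lorentzian) shape,
# standing in for a physical nuclear data model.
import numpy as np
from scipy.optimize import curve_fit

def model(E, amplitude, width, E0):
    """Toy Lorentzian resonance centred at E0 with full width `width`."""
    return amplitude * (width / 2) ** 2 / ((E - E0) ** 2 + (width / 2) ** 2)

E = np.linspace(0.0, 10.0, 50)
sigma = 0.1 * np.ones_like(E)                       # assumed 1-sigma errors
rng = np.random.default_rng(1)
y = model(E, 3.0, 1.5, 5.0) + rng.normal(scale=sigma)

popt, pcov = curve_fit(model, E, y, p0=[2.0, 1.0, 4.5],
                       sigma=sigma, absolute_sigma=True)
print("best-fit parameters:", popt.round(3))
print("1-sigma uncertainties:", np.sqrt(np.diag(pcov)).round(3))
```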

New Work on Updating and Extending the Nuclear Data Standards

Journal of ASTM International, 2012

An IAEA Data Development Project was initiated to provide a mechanism for allowing new experimental data and improvements in evaluation procedures to be incorporated into new evaluations of the standards. The work on this project is ongoing. In the past, very long periods sometimes elapsed between evaluations of the standards; through the use of this Project, such long gaps should not occur. Work on the cross section standards through this Project has included an update of the experimental data to be used in the cross section standards evaluations, a study of the uncertainties obtained in the international standards evaluation, and improvements in the smoothing procedure for capture cross sections. It was decided that this Project should have a broader range of activities than just the cross section standards and thus encompass standards-related activities. The following are being investigated: improvements in the gold cross section at energies below where it is considered a standard, and work on certain cross sections which are not as well known as the cross section standards but could be very useful as reference cross sections relative to which certain types of cross section measurements can be made. This work includes prompt gamma-ray production in fast neutron-induced reactions, the 252Cf spontaneous fission neutron spectrum, and the 235U thermal neutron fission spectrum. Most of the data investigated through this Project are used in dosimetry applications.

International Integral Experiments Databases in Support of Nuclear Data and Code Validation

Journal of Nuclear Science and Technology, 2002

The OECD/NEA Nuclear Science Committee (NSC) has identified the need to establish international databases containing all the important experiments that are available for sharing among specialists, and has set up or sponsored specific activities to achieve this. The aim is to preserve the experiments in an agreed standard, computer-accessible format; to use them for international activities involving the validation of current and new calculational schemes, including computer codes and nuclear data libraries, and for assessing uncertainties, confidence bounds and safety margins; and to record measurement methods and techniques. The databases established so far, or in preparation, related to nuclear data validation cover the following areas:
- SINBAD: a radiation shielding experiments database encompassing reactor shielding, fusion blanket neutronics, and accelerator shielding.
- ICSBEP: the International Criticality Safety Benchmark Experiments Project Handbook, with more than 2500 critical configurations with different combinations of materials and spectral indices.
- IRPhEP: the International Reactor Physics Experimental Benchmarks Evaluation Project.
The different projects are described in the following, including results achieved, work in progress, and planned activities.