Energy landscapes for a machine learning application to series data
Related papers
Perspective: Energy Landscapes for Machine Learning
Machine learning techniques are being increasingly used as flexible non-linear fitting and prediction tools in the physical sciences. Fitting functions that exhibit multiple solutions as local minima can be analysed in terms of the corresponding machine learning landscape. Methods to explore and visualise molecular potential energy landscapes can be applied to these machine learning landscapes to gain new insight into the solution space involved in training and the nature of the corresponding predictions. In particular, we can define quantities analogous to molecular structure, thermodynamics, and kinetics, and relate these emergent properties to the structure of the underlying landscape. This Perspective aims to describe these analogies with examples from recent applications, and suggest avenues for new interdisciplinary research.
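The landscape-exploration methods the Perspective refers to can be tried on any cheap surrogate function. As a minimal sketch (the tilted double-well function and all parameters below are our own illustrative choices, not a system from the paper), SciPy's basin-hopping, a standard tool for exploring multi-funnel landscapes, alternates random perturbations with local minimisation:

```python
import numpy as np
from scipy.optimize import basinhopping

# Toy one-dimensional "landscape": a double well with two local minima;
# the tilt term (+0.3*x) makes the left basin the global one.
def landscape(x):
    x = np.atleast_1d(x)[0]
    return (x**2 - 1.0)**2 + 0.3 * x

# Start in the *wrong* (right-hand) basin; basin-hopping should still
# locate the global minimum near x = -1.
result = basinhopping(landscape, x0=1.0, niter=100, stepsize=1.0, seed=0)
print(result.x[0], result.fun)
```

The same hop-and-relax strategy underlies the molecular landscape surveys the Perspective draws its analogies from.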
Energy Landscapes of Atomic Clusters as Black Box Optimization Benchmarks
Evolutionary Computation, 2012
We present the energy minimization of atomic clusters as a promising problem class for continuous black-box optimization benchmarks. Finding the arrangement of atoms that minimizes a given potential energy is a specific instance of the more general class of geometry optimization or packing problems, which are generally NP-complete. Atomic clusters are a well-studied subject in physics and chemistry. From the large set of available cluster optimization problems, we propose two specific instances: Cohn-Kumar clusters and Lennard-Jones clusters. The potential energies of these clusters are governed by distance-dependent pair-wise interaction potentials. The resulting collection of landscapes comprises smooth and rugged single-funnel topologies, as well as tunable double-funnel topologies. In addition, all problems possess a feature that is not covered by the synthetic functions in current black-box optimization test suites: isospectral symmetry. This property implies that any atomic arrangement is uniquely defined by the pair-wise distance spectrum, rather than the absolute atomic positions. We hence suggest that the presented problem instances should be included in black-box optimization benchmark suites.
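The Lennard-Jones instances described above are straightforward to reproduce: in reduced units (epsilon = sigma = 1) the total energy is a sum of pairwise terms 4(r^-12 - r^-6), minimised for a pair at separation 2^(1/6). A minimal sketch (the starting geometry and choice of local optimiser are our own illustrative assumptions):

```python
import numpy as np
from scipy.optimize import minimize

def lj_energy(flat_coords):
    """Total Lennard-Jones energy in reduced units (epsilon = sigma = 1)."""
    pos = flat_coords.reshape(-1, 3)
    energy = 0.0
    for i in range(len(pos)):
        for j in range(i + 1, len(pos)):
            r = np.linalg.norm(pos[i] - pos[j])
            energy += 4.0 * (r**-12 - r**-6)
    return energy

# A dimer at the known optimum separation 2**(1/6) has energy -1.
dimer = np.array([0, 0, 0, 2**(1/6), 0, 0], dtype=float)
print(lj_energy(dimer))  # ~ -1.0

# Relax a slightly perturbed 3-atom cluster; the trimer global minimum is
# -3.0 (an equilateral triangle with side 2**(1/6)).
start = np.array([0, 0, 0, 1.0, 0.1, 0, 0.5, 0.9, 0.1], dtype=float)
res = minimize(lj_energy, start, method='L-BFGS-B')
print(res.fun)
```

A single local minimisation like this is only one "basin" of the benchmark; the funnel topologies discussed in the abstract emerge when many such minima are connected.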
The Journal of Chemical Physics
Machine learning of atomic-scale properties is revolutionizing molecular modelling, making it possible to evaluate inter-atomic potentials with first-principles accuracy at a fraction of the cost. The accuracy, speed and reliability of machine-learning potentials, however, depend strongly on the way atomic configurations are represented, i.e. the choice of descriptors used as input for the machine learning method. The raw Cartesian coordinates are typically transformed into "fingerprints", or "symmetry functions", that are designed to encode, in addition to the structure, important properties of the potential-energy surface such as its invariances with respect to rotation, translation and permutation of like atoms. Here we discuss automatic protocols to select a number of fingerprints out of a large pool of candidates, based on the correlations that are intrinsic to the training data. This procedure can greatly simplify the construction of neural network potentials that strike the best balance between accuracy and computational efficiency, and has the potential to accelerate by orders of magnitude the evaluation of Gaussian Approximation Potentials based on the Smooth Overlap of Atomic Positions kernel. We present applications to the construction of neural network potentials for water and for an Al-Mg-Si alloy, and to the prediction of the formation energies of small organic molecules using Gaussian process regression.
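The idea of pruning a descriptor pool using correlations in the training data can be illustrated with a deliberately simplified stand-in: greedy removal of near-duplicate columns by Pearson correlation. This is not necessarily the authors' exact protocol, just a sketch of the principle on synthetic data (the threshold and the synthetic feature pool are our own assumptions):

```python
import numpy as np

def select_fingerprints(X, corr_threshold=0.95):
    """Greedily keep feature columns whose absolute Pearson correlation
    with every already-selected column stays below the threshold."""
    selected = []
    for j in range(X.shape[1]):
        col = X[:, j]
        if np.std(col) == 0:
            continue  # constant fingerprints carry no information
        redundant = any(
            abs(np.corrcoef(col, X[:, k])[0, 1]) >= corr_threshold
            for k in selected
        )
        if not redundant:
            selected.append(j)
    return selected

rng = np.random.default_rng(0)
base = rng.normal(size=(200, 3))
# Pool: 3 informative columns plus 2 near-duplicates of column 0.
pool = np.column_stack([base,
                        base[:, 0] * 2.0 + 1e-3 * rng.normal(size=200),
                        base[:, 0] + 1e-3 * rng.normal(size=200)])
print(select_fingerprints(pool))  # → [0, 1, 2]
```

The two rescaled copies of column 0 are dropped; only the three independent descriptors survive.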
Solving Minimum Energy Structures with Neural Networks
2017
In this paper, we train a neural network on atomic configurations to predict energy as a function of atom positions, then use this neural network to perform optimization and solve for the minimum-energy atomic configuration. This problem is of interest because it could potentially provide a boost in both accuracy and speed over traditional numerical methods for solving for the structure of molecules. Previous papers have shown that neural networks trained on the results of numerical simulations can reproduce those results to high accuracy. We construct one such neural network and experiment with new methods of optimizing its parameters, then use it as a function to optimize in order to find the minimum-energy configuration for systems of a few homonuclear atoms. The results are promising, with our neural network's accuracy beating that of the baseline neural network for the problem, and the minimization results showing an improvement over the data the neural network was trained on.
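The train-then-minimise workflow described above can be sketched end to end on a pair potential. Here scikit-learn's MLPRegressor is a stand-in for the paper's network, and the training data are Lennard-Jones pair energies of our own choosing; the network, grid, and optimiser settings are all illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from sklearn.neural_network import MLPRegressor

# Training data: Lennard-Jones pair energies (reduced units) on a grid.
r_train = np.linspace(0.95, 2.0, 300).reshape(-1, 1)
e_train = (4.0 * (r_train**-12 - r_train**-6)).ravel()

# A small MLP stands in for the paper's network; lbfgs suits tiny datasets.
model = MLPRegressor(hidden_layer_sizes=(50, 50), activation='tanh',
                     solver='lbfgs', max_iter=5000, random_state=0)
model.fit(r_train, e_train)

# Minimise the *learned* energy surface instead of the true potential.
res = minimize_scalar(lambda r: model.predict([[r]])[0],
                      bounds=(1.0, 1.6), method='bounded')
print(res.x)  # should land near the true minimum 2**(1/6) ≈ 1.122
```

If the surrogate is accurate, minimising it recovers the physical equilibrium geometry without further calls to the expensive reference method, which is the speed argument the abstract makes.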
International Journal of Quantum Chemistry, 2003
The fitting of ab initio electronic energies of polyatomic molecules for different nuclear configurations is an active field in quantum chemistry and is an important step in the study of chemical reaction dynamics and in the determination of rovibrational spectra. The choice of a good fitting function and the decision as to which geometries are relevant for the problem remain largely a matter of intuition, as a large number of ab initio points of good quality usually involves prohibitively large amounts of CPU time. More recently, the use of neural networks has drawn some attention for fitting potential energy surfaces (PES). Neural networks are generic function approximators capable of any required accuracy and are therefore well suited for fitting many-dimensional PES. In this work we present a comparative study of the ground-state PES of the H3+ molecule obtained by fitting state-of-the-art ab initio points. The PES is obtained using both a neural network and a polynomial function in Morse-type symmetry-adapted coordinates. The quality of the surfaces is assessed by computing the associated rovibrational spectra. The resulting energies are compared with known experimental results.
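The appeal of Morse-type coordinates for polynomial fits is easy to demonstrate: a Morse potential V(r) = D(1 - e^{-a(r - re)})^2 becomes exactly quadratic in the coordinate y = e^{-a(r - re)}, so a degree-2 polynomial fit in y reproduces it to machine precision. A short sketch (the parameter values are illustrative, not the H3+ values used in the paper):

```python
import numpy as np

# Morse potential with illustrative parameters (not fitted to any molecule).
D, a, re = 4.5, 1.9, 0.74
r = np.linspace(0.4, 3.0, 100)
V = D * (1.0 - np.exp(-a * (r - re)))**2

# In the Morse coordinate y = exp(-a*(r - re)) the potential is exactly
# quadratic: V = D - 2*D*y + D*y**2, so a degree-2 fit is exact.
y = np.exp(-a * (r - re))
coeffs = np.polyfit(y, V, deg=2)
residual = np.max(np.abs(np.polyval(coeffs, y) - V))
print(coeffs, residual)  # leading coefficient recovers D; residual ~ 0
```

Symmetry-adapted combinations of such coordinates extend the same idea to polyatomic surfaces, which is the polynomial baseline the abstract compares against the neural network.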
2019
Simulation of molecular and crystal systems enables insight into interesting chemical properties that benefit processes ranging from drug discovery to material synthesis. However, these simulations can be computationally expensive and time consuming even with the approximations of Density Functional Theory (DFT). We propose the Valence Interaction Message Passing Neural Network (VIMPNN) to approximate DFT's ground-state energy calculations. VIMPNN integrates prior physics knowledge, such as the existence of different interatomic bonds, to estimate more accurate energies. Furthermore, while many previous machine learning methods consider only stable systems, our proposed method is demonstrated on unstable systems at different atomic distances. VIMPNN predictions can be used to determine the stable configurations of systems, i.e. the stable distances between atoms, a necessary step for the future simulation of crystal growth, for example. Our method is extensively evaluated on an augmented ver...
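The message-passing family that VIMPNN belongs to has a simple core update: each atom aggregates its neighbours' feature vectors over the bond graph, combines them with its own features, and applies a nonlinearity. A generic single step is sketched below; the shapes, weights, and toy graph are our own assumptions and this is not the VIMPNN architecture itself:

```python
import numpy as np

def message_passing_step(h, adj, w_self, w_neigh):
    """One generic message-passing update: each atom combines its own
    features with the sum of its neighbours' features, then applies tanh."""
    messages = adj @ h           # sum incoming neighbour features
    return np.tanh(h @ w_self + messages @ w_neigh)

rng = np.random.default_rng(0)
n_atoms, n_feat = 4, 8
h = rng.normal(size=(n_atoms, n_feat))          # per-atom feature vectors
adj = np.array([[0, 1, 0, 0],                   # a toy 4-atom chain
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
w_self = rng.normal(size=(n_feat, n_feat)) * 0.1
w_neigh = rng.normal(size=(n_feat, n_feat)) * 0.1

h_new = message_passing_step(h, adj, w_self, w_neigh)
print(h_new.shape)  # (4, 8): same per-atom feature layout, updated values
```

Stacking several such steps and pooling the per-atom features into a scalar gives an energy prediction; physics priors like bond types enter through edge-dependent weights.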
Machine learning of ab-initio energy landscapes for crystal structure predictions
Computational Materials Science, 2018
We present a machine learning approach to calculate unrelaxed and relaxed formation energies of compounds relative to the ground state crystal structure of the pure components in the context of structure predictions in binary systems. Typical methods for structure predictions such as genetic algorithms often rely on density-functional theory codes to perform such calculations at a relatively high computational cost. In this work, we explore two commonly used kernel-based learning algorithms, kernel ridge regression and support vector regression. The efficiency of machine learning approaches relies on suitable data representations that encode the relevant physical information about the crystal structures. We select partial radial distribution functions to represent this structural information. We apply the machine learning approaches to the binary Li-Ge system and show that these methods provide small root-mean-square prediction errors of about 20 meV/atom across the composition and structure space. Furthermore, we demonstrate that the model can be trained to predict the formation energies of the relaxed structures with the same accuracy when given unrelaxed structures as input. The high accuracy for the prediction of the relaxed energies of unrelaxed structures suggests that the machine-learning method can eliminate unlikely candidate structures from a genetic algorithm search, thus reducing the computational cost required for the explorations of energy landscapes and improving the performance of genetic algorithms for structure predictions.
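Kernel ridge regression, one of the two learners used above, has a short closed form: alpha = (K + lambda*I)^{-1} y, with predictions as kernel-weighted sums over the training set. A self-contained numpy sketch on synthetic descriptors (the Gaussian kernel width, regularisation, and synthetic data are our own assumptions, not the paper's partial-RDF setup):

```python
import numpy as np

def gaussian_kernel(A, B, gamma=0.5):
    """Gaussian (RBF) kernel matrix between rows of A and rows of B."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

rng = np.random.default_rng(1)
X_train = rng.normal(size=(40, 5))       # stand-in structural descriptors
y_train = np.sin(X_train[:, 0]) + 0.1 * X_train[:, 1]  # synthetic energies

lam = 1e-6                               # ridge regularisation strength
K = gaussian_kernel(X_train, X_train)
alpha = np.linalg.solve(K + lam * np.eye(len(K)), y_train)

def predict(X_new):
    """KRR prediction: kernel-weighted sum over the training set."""
    return gaussian_kernel(X_new, X_train) @ alpha

train_rmse = np.sqrt(np.mean((predict(X_train) - y_train)**2))
print(train_rmse)  # near zero for small lambda
```

In the paper's setting X_train would hold partial radial distribution functions and y_train DFT formation energies; the kernel and regulariser would be tuned by cross-validation rather than fixed as here.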
Accelerated Prediction of Atomically Precise Cluster Structures Using On-the-fly Machine Learning
2021
The chemical and structural properties of atomically precise nanoclusters are of great interest in numerous applications, but predicting the stable structures of clusters can be computationally expensive. In this work, we present a procedure for rapidly predicting low-energy structures of nanoclusters by combining a genetic algorithm with interatomic potentials actively learned on-the-fly. Applying this approach to aluminum clusters with 21 to 55 atoms, we have identified structures with lower energy than any reported in the literature for 25 out of the 35 sizes. Our benchmarks indicate that the active learning procedure accelerated the average search speed by more than an order of magnitude relative to genetic algorithm searches using only density functional calculations. This work demonstrates a feasible way to systematically discover stable structures for large nanoclusters and provides insights into the transferability of machine-learned interatomic potentials for nanoclusters.
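The genetic-algorithm half of the search loop described above can be sketched in a few lines. Here a mutation-only, elitist GA minimises a Lennard-Jones cluster energy directly; the true potential stands in for both the DFT reference and the actively learned surrogate, and all population parameters are illustrative assumptions:

```python
import numpy as np

def lj_energy(pos):
    """Lennard-Jones energy (reduced units) of an (n, 3) coordinate array."""
    e = 0.0
    for i in range(len(pos)):
        for j in range(i + 1, len(pos)):
            r = np.linalg.norm(pos[i] - pos[j])
            r = max(r, 0.8)  # clip to avoid huge values for clashing atoms
            e += 4.0 * (r**-12 - r**-6)
    return e

rng = np.random.default_rng(0)
n_atoms, pop_size, n_gen = 5, 20, 40

# Initial population: random 5-atom clusters in a small box.
pop = [rng.uniform(-1.5, 1.5, size=(n_atoms, 3)) for _ in range(pop_size)]
best_start = min(lj_energy(p) for p in pop)

for gen in range(n_gen):
    pop.sort(key=lj_energy)
    survivors = pop[:pop_size // 2]          # elitism: keep the best half
    # Offspring: mutate survivors with small Gaussian displacements.
    children = [p + rng.normal(scale=0.05, size=p.shape) for p in survivors]
    pop = survivors + children

best_end = min(lj_energy(p) for p in pop)
print(best_start, best_end)  # elitism guarantees no increase in best energy
```

The on-the-fly element of the paper would replace most lj_energy calls with a learned potential, retraining it whenever a new structure falls outside the model's confidence region.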