A fourth-generation high-dimensional neural network potential with accurate electrostatics including non-local charge transfer
Related papers
Accurate Fourth-Generation Machine Learning Potentials by Electrostatic Embedding
Journal of Chemical Theory and Computation, 2023
In recent years, significant progress has been made in the development of machine learning potentials (MLPs) for atomistic simulations with applications in many fields from chemistry to materials science. While most current MLPs are based on environment-dependent atomic energies, the limitations of this locality approximation can be overcome, e.g., in fourth-generation MLPs, which incorporate long-range electrostatic interactions based on an equilibrated global charge distribution. Apart from the considered interactions, the quality of MLPs crucially depends on the information available about the system, i.e., the descriptors. In this work we show that including, in addition to structural information, the electrostatic potential arising from the charge distribution in the atomic environments significantly improves the quality and transferability of the potentials.
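The descriptor described above can be illustrated with a minimal sketch: given point charges assigned to atoms in a local environment, the electrostatic potential felt by each atom from its neighbors within a cutoff is computed and supplied as an extra input feature alongside the structural descriptors. The function name, cutoff value, and bare Coulomb kernel here are illustrative assumptions, not the paper's actual implementation (which would include appropriate screening and units).

```python
import numpy as np

def esp_descriptor(positions, charges, cutoff=6.0):
    """Per-atom electrostatic potential from neighboring point charges.

    Illustrative sketch: V_i = sum_{j != i, r_ij < cutoff} q_j / r_ij,
    used as an additional input feature for the neural network.
    """
    n = len(positions)
    V = np.zeros(n)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            r = np.linalg.norm(positions[i] - positions[j])
            if r < cutoff:
                V[i] += charges[j] / r
    return V
```

For a diatomic with charges +1 and -1 separated by one length unit, each atom sees the potential of the opposite charge, so the feature already encodes the local charge ordering.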
Standard machine-learning approaches to the construction of interatomic potentials determine the central quantity, the total energy, directly from an analysis of the short-range chemical environment of each atom in a system. This prevents, for instance, an accurate description of the energetics of systems where long-range charge transfer is important, as well as of ionized systems. We therefore propose not to target the total energy directly with machine learning methods, but an intermediate physical quantity, namely the charge density, which in turn allows the total energy to be determined. By allowing the electronic charge to distribute itself optimally over the system, we can describe not only neutral but also ionized systems with unprecedented accuracy. We demonstrate the power of our approach for both neutral and ionized NaCl clusters, where charge redistribution plays a decisive role in the energetics. We obtain chemical accuracy, i.e., errors of less than a millihartree per atom compared to the reference density functional results. The introduction of physically motivated quantities, determined from the short-range atomic environment via a neural network, also increases the stability of the machine learning process and the transferability of the potential.
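Letting "the electronic charge distribute itself in an optimal way over the system" is the charge-equilibration idea: atomic charges minimize a second-order charge-dependent energy under a total-charge constraint, which reduces to a linear solve. The sketch below is a generic Qeq-style solver under assumed parameter names (environment-dependent electronegativities `chi`, hardnesses `J`); the actual CENT-style models use neural networks to predict these quantities and a screened Coulomb kernel.

```python
import numpy as np

def charge_equilibration(chi, J, R, Q_total=0.0):
    """Minimize E(q) = sum_i chi_i q_i + 0.5 J_i q_i^2
                     + 0.5 sum_{i!=j} q_i q_j / r_ij
    subject to sum_i q_i = Q_total, via a Lagrange multiplier.
    Stationarity gives a linear system A x = b."""
    n = len(chi)
    A = np.zeros((n + 1, n + 1))
    for i in range(n):
        A[i, i] = J[i]                       # atomic hardness on the diagonal
        for j in range(n):
            if i != j:                        # bare Coulomb kernel off-diagonal
                A[i, j] = 1.0 / np.linalg.norm(R[i] - R[j])
    A[:n, n] = 1.0                            # charge-conservation constraint
    A[n, :n] = 1.0
    b = np.concatenate([-np.asarray(chi, dtype=float), [Q_total]])
    sol = np.linalg.solve(A, b)
    return sol[:n]                            # charges; sol[n] is the multiplier

# Example: a diatomic with differing electronegativities.
chi = [1.0, -1.0]                             # hypothetical per-atom values
J = [2.0, 2.0]
R = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
q = charge_equilibration(chi, J, R)
```

The more electronegative atom acquires negative charge, and the constraint guarantees exact charge conservation for neutral or ionized systems alike, since ionization only changes `Q_total`.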
ChemRxiv, 2021
Transferable high dimensional neural network potentials (HDNNP) have shown great promise as an avenue to increase the accuracy and domain of applicability of existing atomistic force fields for organic systems relevant to life science. We have previously reported such a potential (Schrodinger-ANI) that has broad coverage of druglike molecules. We extend that work here to cover ionic and zwitterionic druglike molecules expected to be relevant to drug discovery research activities. We report a novel HDNNP architecture, which we call QRNN, that predicts atomic charges and uses these charges as descriptors in an energy model which delivers conformational energies within chemical accuracy when measured against the reference theory it is trained to. Further, we find that delta learning based on a semi-empirical level of theory approximately halves the errors. We test the models on torsion energy profiles, relative conformational energies, geometric parameters and relative tautomer errors.
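The Δ-learning scheme mentioned above can be sketched in a few lines: the model is trained on the residual between the expensive reference theory and a cheap semi-empirical baseline, so the network only has to learn a small, smooth correction. The energies below are made-up placeholder numbers, not data from the paper.

```python
import numpy as np

# Hypothetical conformer energies (hartree) from a cheap semi-empirical
# baseline and the expensive reference theory.
e_baseline  = np.array([-1.10, -1.05, -0.98])
e_reference = np.array([-1.12, -1.08, -1.00])

# Delta-learning target: the residual the network is fitted to.
# It is much smaller in magnitude than the total energy itself.
delta = e_reference - e_baseline

# At inference time: E_pred(x) = E_baseline(x) + model(x).
# A perfectly trained correction (model(x) == delta) recovers the reference.
e_pred = e_baseline + delta
```

Because the baseline already captures most of the physics, the learning problem shrinks to the residual, which is the usual rationale for the roughly halved errors reported.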
Transferable Molecular Charge Assignment Using Deep Neural Networks
arXiv: Chemical Physics, 2018
We use HIP-NN, a neural network architecture that excels at predicting molecular energies, to predict atomic charges. The charge predictions are accurate over a wide range of molecules (both small and large) and for a diverse set of charge assignment schemes. To demonstrate the power of charge prediction on non-equilibrium geometries, we use HIP-NN to generate IR spectra from dynamical trajectories on a variety of molecules. The results are in good agreement with reference IR spectra produced by traditional theoretical methods. Critically, for this application, HIP-NN charge predictions are about 10^4 times faster than direct DFT charge calculations. Thus, ML provides a pathway to greatly increase the range of feasible simulations while retaining quantum-level accuracy. In summary, our results provide further evidence that machine learning can replicate high-level quantum calculations at a tiny fraction of the computational cost.
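Generating an IR spectrum from predicted charges works, in outline, by building the molecular dipole mu(t) = sum_i q_i(t) r_i(t) along the trajectory and Fourier-transforming its autocorrelation. The sketch below uses the Wiener-Khinchin shortcut (power spectrum of the dipole signal) and omits the quantum correction factors and windowing a production calculation would apply; the function name and interface are illustrative assumptions.

```python
import numpy as np

def ir_spectrum(dipole_series, dt):
    """IR absorption line shape from a dipole time series.

    dipole_series: array of shape (n_steps, 3), e.g. built from
    ML-predicted atomic charges as mu(t) = sum_i q_i(t) r_i(t).
    dt: timestep between samples.
    Returns (frequencies, intensities)."""
    mu = dipole_series - dipole_series.mean(axis=0)   # remove static dipole
    # Wiener-Khinchin: the power spectrum |FFT(mu)|^2 equals the Fourier
    # transform of the dipole autocorrelation function; sum over x, y, z.
    power = np.abs(np.fft.rfft(mu, axis=0)) ** 2
    intensity = power.sum(axis=1)
    freqs = np.fft.rfftfreq(len(mu), d=dt)
    return freqs, intensity
```

A dipole oscillating at a single frequency produces a single peak at that frequency, which is a quick sanity check for the pipeline.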
The Rise of Neural Networks for Materials and Chemical Dynamics
The Journal of Physical Chemistry Letters
Machine learning (ML) is quickly becoming a premier tool for modeling chemical processes and materials. ML-based force fields, trained on large data sets of high-quality electronic structure calculations, are particularly attractive due to their unique combination of computational efficiency and physical accuracy. This Perspective summarizes some recent advances in the development of neural network-based interatomic potentials. Designing high-quality training data sets is crucial to overall model accuracy. One strategy is active learning, in which new data are automatically collected for atomic configurations that produce large ML uncertainties. Another strategy is to use the highest levels of quantum theory possible. Transfer learning allows training to a data set of mixed fidelity. A model initially trained to a large data set of density functional theory calculations can be significantly improved by retraining to a relatively small data set of expensive coupled cluster theory calculations. These advances are exemplified by applications to molecules and materials.
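The active-learning step described here is commonly realized as query-by-committee: an ensemble of independently trained models predicts each configuration, and configurations with large ensemble disagreement are flagged for new reference calculations. This is a generic sketch of that selection rule, with an assumed threshold parameter, not the specific criterion of any one code.

```python
import numpy as np

def select_for_labeling(ensemble_predictions, threshold):
    """Query-by-committee active learning.

    ensemble_predictions: array of shape (n_models, n_configs) of per-model
    energy predictions. Returns indices of configurations whose ensemble
    standard deviation exceeds the threshold; these are sent to the
    reference electronic structure code for labeling."""
    uncertainty = np.std(ensemble_predictions, axis=0)
    return np.where(uncertainty > threshold)[0]
```

Configurations on which the committee agrees are assumed well covered by the training set, so only the disagreement region costs new reference calculations.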
Teaching a neural network to attach and detach electrons from molecules
Nature Communications, 2021
Interatomic potentials derived with machine learning algorithms such as deep neural networks (DNNs) achieve the accuracy of high-fidelity quantum mechanical (QM) methods in areas traditionally dominated by empirical force fields and allow massive simulations. Most DNN potentials were parametrized for neutral molecules or closed-shell ions due to architectural limitations. In this work, we propose an improved machine learning framework for simulating open-shell anions and cations. We introduce the AIMNet-NSE (Neural Spin Equilibration) architecture, which can predict molecular energies for an arbitrary combination of molecular charge and spin multiplicity with errors of about 2–3 kcal/mol and spin-charges with errors of ~0.01e for small and medium-sized organic molecules, compared to the reference QM simulations. The AIMNet-NSE model makes it possible to fully bypass QM calculations and derive the ionization potential, electron affinity, and conceptual Density Functional Theory q...
A systematic approach to generating accurate neural network potentials: the case of carbon
npj Computational Materials
Availability of affordable and widely applicable interatomic potentials is the key needed to unlock the riches of modern materials modeling. Artificial neural network-based approaches for generating potentials are promising; however, neural network training requires large amounts of data, sampled adequately from an often unknown potential energy surface. Here we propose a self-consistent approach that is based on crystal structure prediction formalism and is guided by unsupervised data analysis, to construct an accurate, inexpensive, and transferable artificial neural network potential. Using this approach, we construct an interatomic potential for carbon and demonstrate its ability to reproduce first principles results on elastic and vibrational properties for diamond, graphite, and graphene, as well as energy ordering and structural properties of a wide range of crystalline and amorphous phases.
Prioritizing only the precision of energy prediction frequently proves inadequate for meeting multifaceted requirements. More attention is warranted on assessing the rationality of potential energy curves predicted by machine learning-based force fields (MLFFs), alongside evaluating the practical utility of these MLFFs. This study introduces SWANI, an optimized Neural Network Potential (NNP) stemming from the ANI framework. Through the incorporation of supplementary physical constraints, SWANI aligns more closely with chemical expectations, yielding rational potential energy profiles. It also exhibits superior predictive precision compared to the ANI model. Additionally, a comprehensive comparison is conducted between SWANI and a prominent Graph Neural Network (GNN)-based model. The findings indicate that SWANI outperforms the latter, particularly for molecules larger than those in the training set. This outcome underscores SWANI's exceptional capacity for g...
Physical Review B, 2017
We investigate the accuracy and transferability of a recently developed high-dimensional neural network (NN) method for calcium fluoride, fitted to a database of ab initio density functional theory (DFT) calculations based on the Perdew-Burke-Ernzerhof (PBE) exchange correlation functional. We call the method charge equilibration via neural network technique (CENT). Although the fitting database contains only clusters (i.e., nonperiodic structures), the NN scheme accurately describes a variety of bulk properties. In contrast to other available empirical methods, the CENT potential has a much simpler functional form; nevertheless, it correctly reproduces the PBE energetics of various crystalline phases both at ambient and high pressure. Surface energies and structures as well as dynamical properties derived from phonon calculations are also in good agreement with PBE results. Overall, the difference between the values obtained by the CENT potential and the PBE reference values is less than or equal to the difference between the PBE values and those of the local density approximation (LDA) and the Born-Mayer-Huggins (BMH) potential.