Neural networks to approach potential energy surfaces: Application to a molecular dynamics simulation

Approach to potential energy surfaces by neural networks. A review of recent work

International Journal of Quantum Chemistry, 2010

In recent years, Neural Networks (NNs) have turned out to be a suitable approach for mapping accurate Potential Energy Surfaces (PES) from ab initio/DFT energy data sets. PES are crucial for studying reactive and nonreactive chemical systems by Monte Carlo (MC) or Molecular Dynamics (MD) simulations. Here we present a review of (a) the main achievements, from the literature, on the use of NNs to obtain PES and (b) our recent work, analyzing and discussing models to map PES and adding a few details not reported in our previous publications. Two different models are considered. First, NNs trained to reproduce PES represented by the Lennard–Jones (LJ) potential function. Second, the mapping of multidimensional PES to simulate, by MD or MC, the adsorption and self-assembly of solvated organic molecules on noble-metal electrodes, focusing on the ethanol/Au(111) interface. In both cases, it is shown that NNs can be trained to map PES with accuracy similar to that of analytical representations. The results are particularly relevant in the second case, in which simulations by MC or MD require an extensive screening of the interaction sites at the interface, making the development of analytical functions a nontrivial task as the complexity of the systems increases. © 2009 Wiley Periodicals, Inc. Int J Quantum Chem, 2010
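As a minimal illustration of the first model above (an NN reproducing a Lennard–Jones PES), the sketch below fits a small feed-forward network to LJ energies sampled along the interatomic distance. The network size, the LJ parameters in reduced units, and the use of scikit-learn are assumptions made for illustration only, not the setup of the original work.

# Minimal sketch: fit a small feed-forward NN to Lennard-Jones pair energies.
# All parameters (epsilon, sigma, layer sizes) are illustrative, not from the paper.
import numpy as np
from sklearn.neural_network import MLPRegressor

epsilon, sigma = 1.0, 1.0                                  # LJ parameters in reduced units
def lj(r):
    return 4.0 * epsilon * ((sigma / r)**12 - (sigma / r)**6)

r_train = np.linspace(0.95, 3.0, 400).reshape(-1, 1)       # distances from the repulsive wall to the tail
net = MLPRegressor(hidden_layer_sizes=(20, 20), activation="tanh",
                   solver="lbfgs", max_iter=5000, tol=1e-10)
net.fit(r_train, lj(r_train).ravel())                      # train on (distance, energy) pairs

r_test = np.linspace(1.0, 2.9, 100).reshape(-1, 1)
rmse = np.sqrt(np.mean((net.predict(r_test) - lj(r_test).ravel())**2))
print(f"RMSE of the NN against the analytical LJ curve: {rmse:.2e} (reduced units)")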

Enhancing Molecular Energy Predictions with Physically Constrained Modifications to the Neural Network Potential

Prioritizing the accuracy of energy prediction alone is often not enough to satisfy the many requirements placed on machine learning-based force fields (MLFFs). More attention should be paid to whether the potential energy curves predicted by MLFFs are physically reasonable, and to the practical utility of these models. This study introduces SWANI, an optimized Neural Network Potential (NNP) derived from the ANI framework. By incorporating supplementary physical constraints, SWANI aligns more closely with chemical expectations and yields physically reasonable potential energy profiles. It also exhibits better predictive accuracy than the ANI model. In addition, a comprehensive comparison is conducted between SWANI and a prominent Graph Neural Network (GNN)-based model. The findings indicate that SWANI outperforms the latter, particularly for molecules larger than those in the training set. This outcome underscores SWANI's exceptional capacity for g...
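One generic way to build a physical constraint into an NN potential, sketched below, is to add a fixed, always-repulsive short-range baseline so that the predicted pair energy rises steeply at small separations regardless of what the network outputs. This is only an illustration of the general idea of constraining an NNP; it is not the actual set of constraints used in SWANI, and the class name and parameters are hypothetical.

# Generic sketch of a physically constrained pair potential: learned term plus a
# fixed exponential repulsion.  Not the SWANI architecture; parameters are illustrative.
import torch
import torch.nn as nn

class ConstrainedPairPotential(nn.Module):
    def __init__(self, a=50.0, alpha=3.0):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
        self.a, self.alpha = a, alpha                 # illustrative repulsion parameters

    def forward(self, r):
        # learned contribution plus an always-positive short-range wall
        return self.net(r) + self.a * torch.exp(-self.alpha * r)

model = ConstrainedPairPotential()
r = torch.linspace(0.5, 4.0, 50).unsqueeze(-1)        # distances in arbitrary units
energies = model(r)                                   # (50, 1) constrained pair energies
print(energies.shape)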

The use of neural networks for fitting potential energy surfaces: A comparative case study for the H3+ molecule

International Journal of Quantum Chemistry, 2003

The fitting of ab initio electronic energies of polyatomic molecules for different nuclear configurations is an active field in quantum chemistry and is an important step in the study of chemical reaction dynamics and in the determination of rovibrational spectra. The choice of a good fitting function, and the decision as to which geometries are relevant for the problem, remains largely a matter of intuition, as a large number of good-quality ab initio points usually involves prohibitively large amounts of CPU time. More recently, the use of neural networks has drawn some attention for fitting potential energy surfaces (PES). Neural networks are generic function approximators capable of any required accuracy and are therefore well suited for fitting many-dimensional PES. In this work we present a comparative study for the ground-state PES of the H3+ molecule obtained by fitting state-of-the-art ab initio points. The PES is obtained using both a neural network and a polynomial function in Morse-type symmetry-adapted coordinates. The quality of the surfaces is assessed by computing the associated rovibrational spectra. The resulting energies are compared with known experimental results.
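The polynomial representation mentioned above can be sketched as follows: each internuclear distance is mapped to a Morse-type variable, the three variables are combined into symmetry-adapted coordinates, and a low-order polynomial in those coordinates is fitted by linear least squares. The Morse parameters, polynomial degree, and placeholder energies below are illustrative assumptions; a production fit would use the actual ab initio points and fully D3h-invariant polynomials.

# Sketch: polynomial fit in Morse-type symmetry-adapted coordinates for a three-distance system.
import numpy as np
from itertools import combinations_with_replacement

rng = np.random.default_rng(0)

def morse(r, a=1.3, r0=1.65):
    # Morse-type variable per internuclear distance; a and r0 are illustrative values
    return 1.0 - np.exp(-a * (r - r0))

def sym_coords(dists):
    # simple symmetry-adapted combinations of three equivalent distances
    y = morse(dists)
    return np.array([(y[0] + y[1] + y[2]) / np.sqrt(3.0),
                     (2*y[0] - y[1] - y[2]) / np.sqrt(6.0),
                     (y[1] - y[2]) / np.sqrt(2.0)])

R = rng.uniform(1.2, 2.5, size=(300, 3))              # placeholder geometries (three distances each)
E = np.exp(-R).sum(axis=1)                            # placeholder energies, NOT real ab initio data
S = np.array([sym_coords(r) for r in R])

# design matrix of all monomials in (s1, s2, s3) up to total degree 3, then a linear fit
monomials = [m for d in range(4) for m in combinations_with_replacement(range(3), d)]
X = np.column_stack([np.prod(S[:, list(m)], axis=1) for m in monomials])
coeffs, *_ = np.linalg.lstsq(X, E, rcond=None)
print(f"{len(coeffs)} coefficients fitted; RMS residual = {np.sqrt(np.mean((X @ coeffs - E)**2)):.2e}")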

Neural Network Potentials for Reactive Chemistry: CASPT2 Quality Potential Energy Surfaces for Bond Breaking

Neural network potentials are developed that accurately describe bond making and breaking for use in molecular simulations. We report a neural network potential that describes the potential energy surface for carbon-carbon bond dissociation with less than 1 kcal/mol error compared to complete active space second-order perturbation theory (CASPT2), and maintains this accuracy both along the minimum energy path and in molecular dynamics calculations up to 2000 K. We utilize a transfer learning algorithm to develop neural network potentials that generate potential energy surfaces; this method aims to use the minimum amount of CASPT2 data on small systems to train neural network potentials while maintaining excellent transferability to larger systems. First, we generate homolytic carbon-carbon bond dissociation data for small alkanes with density functional theory (DFT) energies to train the potentials to accurately predict bond dissociation at the DFT level. Then, using transfer learning, we retrain...
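The transfer-learning step described above can be sketched in a few lines: pretrain an energy model on plentiful lower-level (DFT-like) data, then freeze the feature-extracting layers and retrain only the output layer on a small higher-level (CASPT2-like) set. The descriptors, network sizes, and synthetic data below are placeholders for illustration, not the actual potentials or data of the paper.

# Sketch of transfer learning for an NN potential: pretrain on cheap labels, fine-tune the head on few expensive labels.
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Sequential(
    nn.Linear(8, 64), nn.Tanh(),      # 8 dummy descriptors per configuration
    nn.Linear(64, 64), nn.Tanh(),
    nn.Linear(64, 1),
)

def fit(model, X, y, params, epochs=500, lr=1e-3):
    opt = torch.optim.Adam(params, lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(X).squeeze(-1), y)
        loss.backward()
        opt.step()
    return loss.item()

# Stage 1: large synthetic "DFT-level" data set (placeholder values, not real data)
X_dft, y_dft = torch.randn(2000, 8), torch.randn(2000)
fit(model, X_dft, y_dft, model.parameters())

# Stage 2: freeze the lower layers and retrain only the last layer on a small "CASPT2-level" set
for p in model[:-1].parameters():
    p.requires_grad = False
X_cas, y_cas = torch.randn(100, 8), torch.randn(100)
fit(model, X_cas, y_cas, model[-1].parameters(), epochs=300)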

Neural networks as a tool for compact representation of ab initio molecular potential energy surfaces

Journal of Molecular Graphics, 1996

Ab initio quantum chemical calculations of molecular properties such as, e.g., torsional potential energies require massive computational effort even for moderately sized molecules if basis sets of reasonable quality are employed. Using ab initio data on conformational properties of the cofactor (6R,1'R,2'S)-5,6,7,8-tetrahydrobiopterin, we demonstrate that error backpropagation networks can be established that efficiently approximate complicated functional relationships such as torsional potential energy surfaces of a flexible molecule. Our pilot simulations suggest that properly trained neural networks might provide an extremely compact storage medium for quantum chemically obtained information. Moreover, they are outstandingly comfortable tools when it comes to making use of the stored information. One possible application is demonstrated, namely, the generation of relaxed torsional energy surfaces.

Neural network-based approaches for building high dimensional and quantum dynamics-friendly potential energy surfaces

International Journal of Quantum Chemistry, 2014

Development and applications of neural network (NN)-based approaches for representing potential energy surfaces (PES) of bound and reactive molecular systems are reviewed. Specifically, it is shown that when the density of ab initio points is low, NN-based potentials with multibody or multimode structure are advantageous for representing high-dimensional PESs. Importantly, with an appropriate choice of the neuron activation function, PESs in the sum-of-products form are naturally obtained, thus addressing a bottleneck problem in quantum dynamics. The use of NN committees is also analyzed, and it is shown that while they are able to reduce the fitting error, the reduction is limited by the nonrandom nature of the fitting error. The approaches described here are expected to be directly applicable in other areas of science and engineering where a functional form needs to be constructed in an unbiased way from sparse data.
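The sum-of-products remark above has a simple algebraic basis: with an exponential activation, a one-hidden-layer network output E(x) = sum_k c_k exp(b_k + sum_j w_kj x_j) factorizes exactly into sum_k c_k e^(b_k) prod_j exp(w_kj x_j), i.e. a sum of products of one-dimensional functions of the individual coordinates, which is the form quantum dynamics methods such as MCTDH can use directly. The short numerical check below uses random weights and illustrates only this identity.

# Numerical check: an exponential-activation network is exactly a sum of products.
import numpy as np

rng = np.random.default_rng(1)
n_modes, n_neurons = 4, 6
W = rng.normal(size=(n_neurons, n_modes))   # input weights
b = rng.normal(size=n_neurons)              # biases
c = rng.normal(size=n_neurons)              # output weights
x = rng.normal(size=n_modes)                # one nuclear configuration

nn_form  = np.sum(c * np.exp(b + W @ x))                            # network evaluation
sop_form = np.sum(c * np.exp(b) * np.prod(np.exp(W * x), axis=1))   # sum-of-products evaluation
print(np.isclose(nn_form, sop_form))        # True: the two forms are identical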

The Rise of Neural Networks for Materials and Chemical Dynamics

The Journal of Physical Chemistry Letters

Machine learning (ML) is quickly becoming a premier tool for modeling chemical processes and materials. ML-based force fields, trained on large data sets of high-quality electronic structure calculations, are particularly attractive due to their unique combination of computational efficiency and physical accuracy. This Perspective summarizes some recent advances in the development of neural network-based interatomic potentials. Designing high-quality training data sets is crucial to overall model accuracy. One strategy is active learning, in which new data are automatically collected for atomic configurations that produce large ML uncertainties. Another strategy is to use the highest levels of quantum theory possible. Transfer learning allows training on data sets of mixed fidelity: a model initially trained on a large data set of density functional theory calculations can be significantly improved by retraining on a relatively small data set of expensive coupled cluster theory calculations. These advances are exemplified by applications to molecules and materials.
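The active-learning strategy summarized above can be sketched as a query-by-committee loop: train a small ensemble of NN potentials, flag the candidate configurations where the ensemble disagrees most, label only those with the reference method, and retrain. The one-dimensional toy surface, committee size, and selection batch below are illustrative assumptions, not the workflow of any specific code discussed in the Perspective.

# Sketch of query-by-committee active learning for an NN potential on a toy 1D surface.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
reference = lambda x: np.sin(3.0 * x) + 0.5 * x**2     # stand-in for an expensive QM calculation

X_train = rng.uniform(-2, 2, size=(10, 1))             # small initial data set
y_train = reference(X_train).ravel()

for cycle in range(3):
    # committee of NNs differing only in their random initialization
    committee = [MLPRegressor((32, 32), activation="tanh", solver="lbfgs",
                              max_iter=3000, random_state=seed).fit(X_train, y_train)
                 for seed in range(4)]
    X_pool = rng.uniform(-2, 2, size=(200, 1))         # candidate configurations, e.g. from an MD run
    preds = np.stack([m.predict(X_pool) for m in committee])
    uncertainty = preds.std(axis=0)                    # committee disagreement as the ML uncertainty
    pick = np.argsort(uncertainty)[-5:]                # label only the most uncertain candidates
    X_train = np.vstack([X_train, X_pool[pick]])
    y_train = np.concatenate([y_train, reference(X_pool[pick]).ravel()])
    print(f"cycle {cycle}: max committee std = {uncertainty.max():.3f}, data set size = {len(X_train)}")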

Aladyn - Adaptive Neural Network Molecular Dynamics Simulation Code: Computational Materials Mini-Application

2018

This report provides an overview and a description of the commands of the Computational Materials mini-application, Aladyn. Aladyn is a simple molecular dynamics code written in FORTRAN 2008, designed to demonstrate the use of adaptive neural networks (ANNs) in atomistic simulations. The role of the ANNs is to reproduce the very complex energy landscape resulting from the atomic interactions in materials with the accuracy of quantum mechanics-based energy calculations. The ANN is trained on a large set of atomic structures calculated using the density functional theory (DFT) method. The Aladyn code is being released to serve as a training testbed for students and professors in academia to explore possible optimization algorithms for parallel computing on multicore central processing unit (CPU) computers or on computers utilizing many-core architectures based on graphics processing units (GPUs). The effort is related to the High Performance Computing Incubator (HPCI) project at NASA Langley ...

Bond Energies from a Diatomics-in-Molecules Neural Network

arXiv: Chemical Physics, 2017

Neural networks are being used to make new types of empirical chemical models that are as inexpensive as force fields but with accuracy close to the ab initio methods used to build them. Besides modeling potential energy surfaces, neural nets can provide qualitative insights and make qualitative chemical trends quantitatively predictable. In this work we present a neural network that predicts the energies of molecules as a sum of bond energies. The network learns the total energies of the popular GDB9 dataset to a competitive MAE of 0.94 kcal/mol. The method is naturally linearly scaling and applicable to molecules of nanoscopic size. More importantly, it gives chemical insight into the relative strengths of bonds as a function of their molecular environment, despite only being trained on total energy information. We show that the network makes predictions of relative bond strengths in good agreement with measured trends and human predictions. We show that DIM-NN learns the same heuristic t...
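The bond-energy decomposition described above can be sketched as a network that maps a descriptor of each bond's local environment to a bond energy, with the molecular energy taken as the sum over bonds; training on total energies alone still lets gradients shape the individual bond terms. The descriptor size, network, and placeholder data below are illustrative and not the DIM-NN architecture itself.

# Sketch: total molecular energy as a learned sum of per-bond energies, trained on total energies only.
import torch
import torch.nn as nn

torch.manual_seed(0)
bond_net = nn.Sequential(                 # maps a per-bond descriptor to a bond energy
    nn.Linear(10, 32), nn.Tanh(),
    nn.Linear(32, 1),
)

def molecular_energy(bond_descriptors):
    # bond_descriptors: (n_bonds, 10) tensor, one row per bond in the molecule
    return bond_net(bond_descriptors).sum()

molecules = [torch.randn(n_bonds, 10) for n_bonds in (3, 5, 8)]   # placeholder bond descriptors
targets = torch.randn(3)                                          # placeholder total energies
opt = torch.optim.Adam(bond_net.parameters(), lr=1e-3)
for _ in range(200):
    opt.zero_grad()
    loss = sum((molecular_energy(m) - t)**2 for m, t in zip(molecules, targets))
    loss.backward()
    opt.step()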

Transferable Neural Network Potential Energy Surfaces for Closed-Shell Organic Molecules: Extension to Ions

ChemRxiv, 2021

Transferable high dimensional neural network potentials (HDNNPs) have shown great promise as an avenue to increase the accuracy and domain of applicability of existing atomistic force fields for organic systems relevant to the life sciences. We have previously reported such a potential (Schrodinger-ANI) that has broad coverage of druglike molecules. We extend that work here to cover ionic and zwitterionic druglike molecules expected to be relevant to drug discovery research activities. We report a novel HDNNP architecture, which we call QRNN, that predicts atomic charges and uses these charges as descriptors in an energy model which delivers conformational energies within chemical accuracy when measured against the reference theory to which it is trained. Further, we find that delta learning based on a semi-empirical level of theory approximately halves the errors. We test the models on torsion energy profiles, relative conformational energies, geometric parameters, and relative tautomer errors.
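The delta-learning step mentioned above can be sketched as follows: rather than predicting the high-level energy directly, the model is trained on the difference between the reference theory and a cheap baseline (here a stand-in for a semi-empirical energy), and the final prediction adds the learned correction back onto the baseline. The descriptors, baseline, and reference values below are synthetic placeholders, not the QRNN model or its data.

# Sketch of delta learning: predict only the correction on top of a cheap baseline energy.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
X = rng.normal(size=(500, 6))                             # placeholder molecular descriptors
E_baseline = X @ rng.normal(size=6)                       # stand-in for semi-empirical energies
E_reference = E_baseline + 0.1 * np.sin(X).sum(axis=1)    # stand-in for high-level reference energies

delta_net = MLPRegressor((64, 64), activation="tanh", solver="lbfgs", max_iter=5000)
delta_net.fit(X, E_reference - E_baseline)                # learn only the correction term

E_pred = E_baseline + delta_net.predict(X)                # prediction = baseline + learned delta
print(f"MAE of delta-learned energies: {np.mean(np.abs(E_pred - E_reference)):.4f}")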