Artificial neural network potentials

Artificial neural network potentials (ANNPs). Neural networks (NNs) are increasingly used in a wide range of applications. Here we are concerned with a narrower application: their use in fitting [1] [2] an atomic or molecular potential energy surface. In particular, the output layer, or node, provides the energy as a function of the coordinates, which form the input layer.
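As a rough illustration of this fitting step, the sketch below trains a small feedforward network so that its output reproduces reference energies of a one-dimensional toy potential, here using JAX. This is a minimal sketch, not taken from the references below: the toy target potential, network size, learning rate and number of steps are all arbitrary, illustrative choices.

  # Minimal sketch (illustrative only): fit a small feedforward network
  # to reference energies of a one-dimensional toy potential.
  import jax
  import jax.numpy as jnp

  def init_params(key, n_in=1, n_hidden=10):
      # Small random weights; sizes are arbitrary choices for this toy example
      k1, k2 = jax.random.split(key)
      return {
          "W1": 0.5 * jax.random.normal(k1, (n_hidden, n_in)),
          "b1": jnp.zeros(n_hidden),
          "w2": 0.5 * jax.random.normal(k2, (n_hidden,)),
          "b2": jnp.array(0.0),
      }

  def nn_energy(params, x):
      """Input layer: the coordinate(s) x; output neuron: the energy."""
      hidden = jnp.tanh(params["W1"] @ x + params["b1"])
      return params["w2"] @ hidden + params["b2"]

  def loss(params, xs, e_ref):
      # Mean squared error between network energies and reference energies
      e_pred = jax.vmap(lambda x: nn_energy(params, x))(xs)
      return jnp.mean((e_pred - e_ref) ** 2)

  # Reference data: a toy double-well potential E(x) = x^4 - x^2
  xs = jnp.linspace(-1.5, 1.5, 64).reshape(-1, 1)
  e_ref = xs[:, 0] ** 4 - xs[:, 0] ** 2

  params = init_params(jax.random.PRNGKey(0))
  lr = 0.05
  grad_loss = jax.jit(jax.grad(loss))
  for step in range(2000):
      grads = grad_loss(params, xs, e_ref)
      params = jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

  print("final RMSE:", float(jnp.sqrt(loss(params, xs, e_ref))))

For a real potential the input layer would typically be built from symmetry functions or other descriptors of the atomic environment rather than raw coordinates, and the reference energies would come from electronic-structure calculations.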

Activation functions

Training

Example

The output of a feedforward NN with a single layer of hidden neurons, each with a sigmoid activation function, followed by a linear output neuron, is given by:
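  E = b_0 + \sum_{j=1}^{N_\mathrm{h}} c_j \, \sigma\!\left( b_j + \sum_{i=1}^{N_\mathrm{in}} w_{ji} \, x_i \right),
  \qquad
  \sigma(y) = \frac{1}{1 + e^{-y}},

where x_i are the values of the input layer (the coordinates), N_in and N_h are the numbers of input and hidden neurons, w_{ji} and b_j are the weights and biases of the hidden layer, c_j are the weights of the linear output neuron, b_0 is its bias, and σ is the sigmoid activation function. (The symbols here are chosen for illustration; the article does not fix a notation.)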

Applications

Since the early work of Blank et al. [3], ANNPs have been successfully developed for water [4], Al3+ ions dissolved in water [5], aqueous NaOH solutions [6], and gold nanoparticles [7], as well as for many other systems [8][9].

References

  1. G. Cybenko "Approximation by superpositions of a sigmoidal function", Mathematics of Control, Signals and Systems 2 pp. 303-314 (1989)
  2. Kurt Hornik, Maxwell Stinchcombe, Halbert White "Multilayer feedforward networks are universal approximators", Neural Networks 2 pp. 359-366 (1989)
  3. Thomas B. Blank, Steven D. Brown, August W. Calhoun, and Douglas J. Doren "Neural network models of potential energy surfaces", Journal of Chemical Physics 103 4129 (1995)
  4. Tobias Morawietz, Andreas Singraber, Christoph Dellago, and Jörg Behler "How van der Waals interactions determine the unique properties of water", PNAS 113 pp. 8368-8373 (2016)
  5. Helmut Gassner, Michael Probst, Albert Lauenstein, and Kersti Hermansson "Representation of Intermolecular Potential Functions by Neural Networks", Journal of Physical Chemistry A 102 pp. 4596-4605 (1998)
  6. Matti Hellström and Jörg Behler "Structure of aqueous NaOH solutions: insights from neural-network-based molecular dynamics simulations", Physical Chemistry Chemical Physics 19 pp. 82-96 (2017)
  7. Siva Chiriki, Shweta Jindal, and Satya S. Bulusu "Neural network potentials for dynamics and thermodynamics of gold nanoparticles", Journal of Chemical Physics 146 084314 (2017)
  8. Sönke Lorenz, Axel Groß and Matthias Scheffler "Representing high-dimensional potential-energy surfaces for reactions at surfaces by neural networks", Chemical Physics Letters 395 pp. 210-215 (2004)
  9. Sergei Manzhos, Xiaogang Wang, Richard Dawes, and Tucker Carrington Jr. "A Nested Molecule-Independent Neural Network Approach for High-Quality Potential Fits", Journal of Physical Chemistry A 110 pp. 5295-5304 (2006)
Related reading

  Jörg Behler "Constructing high-dimensional neural network potentials: A tutorial review", International Journal of Quantum Chemistry 115 pp. 1032-1050 (2015) https://doi.org/10.1002/qua.24890
  Jörg Behler "Perspective: Machine learning potentials for atomistic simulations", Journal of Chemical Physics 145 170901 (2016) https://doi.org/10.1063/1.4966192
  Linfeng Zhang, Jiequn Han, Han Wang, Roberto Car, and Weinan E "DeePCG: Constructing coarse-grained models via deep neural networks", Journal of Chemical Physics 149 034101 (2018) https://doi.org/10.1063/1.5027645