Nat Commun. 2017 Jan 9;8:13890. doi: 10.1038/ncomms13890

Figure 1. Prediction and explanation of molecular energies with a deep tensor neural network.


(a) Molecules are encoded as input for the neural network by a vector of nuclear charges and an inter-atomic distance matrix. This description is complete and invariant to rotation and translation. (b) Illustration of the network architecture. Each atom type corresponds to a vector of coefficients c_i^(0), which is repeatedly refined by interactions v_ij. The interactions depend on the current representation c_j^(t) of atom j, as well as the distance D_ij to that atom. After T iterations, an energy contribution E_i is predicted from the final coefficient vector c_i^(T). The molecular energy E is the sum over these atomic contributions. (c) Mean absolute errors of predictions for the GDB-9 dataset of 133,885 molecules as a function of the number of atoms. The employed neural network uses two interaction passes (T=2) and 50,000 reference calculations during training. The inset shows the error of an equivalent network trained on 5,000 GDB-9 molecules with 20 or more atoms, as small molecules with 15 or fewer atoms are added to the training set. (d) Extract from the calculated (black) and predicted (orange) molecular dynamics trajectory of toluene. The curve on the right shows the agreement of the predicted and calculated energy distributions. (e) Energy contribution E_probe (or local chemical potential, see text) of a hydrogen test charge on an isosurface for various molecules from the GDB-9 dataset, for a DTNN model with T=2.
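
The architecture summarized in panel (b) can be illustrated with a short sketch. The Python snippet below is a minimal, self-contained illustration of the interaction-pass idea described in the caption, not the published DTNN implementation: the vector dimensions, the Gaussian distance expansion, the exact gating form of the interaction v_ij, and all parameter names (embed, Wc, Wd, Wout1, Wout2) are assumptions chosen for readability, and the weights are random rather than trained.

```python
import numpy as np

# Hypothetical sizes; the published DTNN uses its own settings.
C_DIM = 30      # length of each atomic coefficient vector c_i
G_DIM = 20      # number of Gaussians used to expand distances
T_PASSES = 2    # number of interaction passes, as in panel (c)

rng = np.random.default_rng(0)

def gaussian_expand(d, centers, gamma=10.0):
    """Expand a scalar distance into a vector of Gaussian basis values."""
    return np.exp(-gamma * (d - centers) ** 2)

def dtnn_energy(charges, dist, params):
    """Sketch of a DTNN-style forward pass.

    charges: (N,) nuclear charges Z_i
    dist:    (N, N) inter-atomic distance matrix D_ij
    params:  dict of weight arrays (randomly initialised here)
    """
    n = len(charges)
    centers = np.linspace(0.0, 10.0, G_DIM)

    # Initial coefficient vectors c_i^(0): one embedding per atom type.
    c = params["embed"][charges]                      # (N, C_DIM)

    for _ in range(T_PASSES):
        v = np.zeros_like(c)
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                # Interaction v_ij depends on c_j and the expanded distance D_ij.
                d_feat = gaussian_expand(dist[i, j], centers)
                v[i] += np.tanh((params["Wc"] @ c[j]) * (params["Wd"] @ d_feat))
        c = c + v                                     # refine: c_i^(t+1) = c_i^(t) + sum_j v_ij

    # Per-atom energy contributions E_i, summed to the molecular energy E.
    e_i = np.tanh(c @ params["Wout1"]) @ params["Wout2"]
    return e_i.sum(), e_i

# Random parameters just to make the sketch runnable.
params = {
    "embed": rng.normal(size=(100, C_DIM)) * 0.1,     # rows indexed by nuclear charge
    "Wc": rng.normal(size=(C_DIM, C_DIM)) * 0.1,
    "Wd": rng.normal(size=(C_DIM, G_DIM)) * 0.1,
    "Wout1": rng.normal(size=(C_DIM, C_DIM)) * 0.1,
    "Wout2": rng.normal(size=(C_DIM,)) * 0.1,
}

# Example: a water-like geometry (charges 8, 1, 1) and a symmetric distance matrix in Å.
Z = np.array([8, 1, 1])
D = np.array([[0.00, 0.96, 0.96],
              [0.96, 0.00, 1.52],
              [0.96, 1.52, 0.00]])
E, E_atoms = dtnn_energy(Z, D, params)
print(E, E_atoms)
```

The essential structure mirrors the caption: per-atom coefficient vectors are initialised from the nuclear charge, refined T times by pairwise interactions that combine a neighbour's current representation with its expanded distance, and finally mapped to atomic energy contributions whose sum gives the molecular energy E.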