Chemical Science, 2019 Sep 9; 10(41): 9424–9432. doi: 10.1039/c9sc02696g

Electron density learning of non-covalent systems

Alberto Fabrizio,1,2 Andrea Grisafi,3,2 Benjamin Meyer,1,2 Michele Ceriotti,3,2 Clemence Corminboeuf1,2
PMCID: PMC6991182  PMID: 32055318

Abstract

Chemists continuously harvest the power of non-covalent interactions to control phenomena in both the micro- and macroscopic worlds. From the quantum chemical perspective, the strategies essentially rely upon an in-depth understanding of the physical origin of these interactions, the quantification of their magnitude and their visualization in real-space. The total electron density ρ(r) represents the simplest yet most comprehensive piece of information available for fully characterizing bonding patterns and non-covalent interactions. The charge density of a molecule can be computed by solving the Schrödinger equation, but this approach becomes rapidly demanding if the electron density has to be evaluated for thousands of different molecules or very large chemical systems, such as peptides and proteins. Here we present a transferable and scalable machine-learning model capable of predicting the total electron density directly from the atomic coordinates. The regression model is used to access qualitative and quantitative insights beyond the underlying ρ(r) in a diverse ensemble of sidechain–sidechain dimers extracted from the BioFragment database (BFDb). The transferability of the model to more complex chemical systems is demonstrated by predicting and analyzing the electron density of a collection of 8 polypeptides.


Graphical abstract: machine-learning model of the electron densities for analyzing non-covalent interaction patterns in peptides.

1. Introduction

Non-covalent interactions (NCIs) govern a multitude of chemical phenomena and are key components for constructing molecular architectures.1 Their importance fostered an intense research effort to accurately quantify their magnitude and develop an intuitive characterization of their physical nature using quantum chemistry.2–6 Among the different approaches to characterize non-covalent interactions, one of the simplest and most generally applicable takes as a starting point the electron density ρ(r) that encodes, in principle, all the information needed to fully characterize a chemical system.7 Despite the fact that the universal functional relationship between total energy and ρ(r) remains unknown, existing approximations within the framework of Kohn–Sham DFT (KS-DFT)8 do permit access to all molecular properties within a reasonable degree of accuracy.9–11

Properties that can be derived exactly from the electron density distribution include molecular and atomic electrostatic moments (e.g., charges, dipole, quadrupoles), electrostatic potentials and electrostatic interaction energies. Knowledge of these quantities is fundamental in diverse chemical applications, including the computation of the IR intensities,12 the identification of binding sites in host–guest compounds,13–15 and the exact treatment of electrostatics within molecular simulations.16 Moreover, analyzing the deformation of ρ(r) in the presence of an external field provides access to another set of fundamental properties, namely molecular static (hyper)polarizabilities and, thus, to the computation of Raman spectra17 and non-linear optical properties.18–21

The natural representation of the electron density in real space makes it especially suitable for accessing spatial information about structural and electronic molecular properties, including X-ray structure refinement22–27 and representations using scalar fields.6 Routinely used examples include the quantum theory of atoms in molecules (QTAIM),28,29 the density overlap region indicator (DORI),30 and the non-covalent interaction (NCI) index.31,32

ρ(r) is generally obtained by solving the electronic structure problem through ab initio computations. The main advantage of this approach is that it returns the variationally optimized electron density for a given Hamiltonian. Yet, ab initio computations can become increasingly burdensome if ρ(r) has to be evaluated for thousands of different molecules or very large chemical systems, such as peptides and proteins. These large-scale problems are typically tackled with more scalable approaches: either linear-scaling techniques such as Mezey's molecular electron density LEGO assembler (MEDLA)33,34 and the adjustable density matrix assembler (ADMA),35–37 or approaches based on localized molecular orbitals, such as ELMO.38–41 Another methodology belonging to this second category involves the use of experimental techniques, such as X-ray diffraction, to probe the electron density and subsequently reconstruct ρ(r) through multipolar models42–44 and pseudo-atomic libraries, such as ELMAM,45–48 ELMAM2,49,50 UBDB,51,52 Invarioms53 and SBFA.54 While successful, these two methodologies have intrinsic limits: the first is unable to capture the deformations of the charge density due to intermolecular interactions unless a suitable fragment is generated ad hoc, while the second relies on experimental data and is difficult to extend to thousands of different chemical systems at once. Recently, the development of several machine-learning models targeting the electron density has effectively established a third promising methodology, with the potential to overcome the limitations of the more traditional approaches.

The first machine-learning model of ρ(r) was developed on the basis of the Hohenberg–Kohn mapping between the nuclear potential and the electron density.55,56 Although successful, the choice of the nuclear potential as a representation of the different molecular conformations and the expansion of the electron density in an orthogonal plane-wave basis effectively constrained this landmark model to relatively small and rigid molecules with limited transferability to larger systems. Recently, we proposed an atom-centered, symmetry-adapted Gaussian process regression57 (SA-GPR) framework explicitly targeting the learning of the electron density.58 Using an optimized non-orthogonal basis set, pseudo-valence electron densities could be predicted in a linear-scaling and transferable manner, meaning that the model is able to tackle much larger chemical systems than those used to train the regression model. A third approach, which can also achieve transferability between different systems, uses a direct grid-based representation of the atomic environment to learn and predict the electron density at each point of the molecular space.59–61 Representing the density field on a large set of grid points rather than on a basis set effectively avoids the introduction of a basis set error, but also dramatically increases the computational effort.

One should also consider that machine learning, being a data-driven approach, requires high-quality, diverse reference data. Fortunately, several specialized benchmark databases that target NCIs have appeared over the past decade. From the original S22 (ref. 62) to NCIE53,63 S66,64 NBC10/NBC10ext,65–67 and S12L,68,69 the evolution of these datasets has, generally, followed a prescription of increasing the number of entries, principally by including subtler interactions and/or larger systems. In this respect, the databases of Friesner,70 Head-Gordon,71 Shaw,72 and the recent BFDb of Sherrill,73 constitute a special category because of their exceptional size (reaching thousands of entries) which are now sufficiently large to be compatible with machine-learning applications. Beyond their conceptual differences, each of these benchmark sets aims at improving the capability of electronic structure methods to describe the energetic aspects of non-covalent interactions.

In this work, we introduce a dramatic improvement of our previous density-learning approach by making the regression machinery of ρ(r) compatible with density-fitting auxiliary basis sets. These specialized basis sets are routinely used in quantum chemistry to approximate two-center one-electron densities. Here, the auxiliary basis sets are used directly to represent the electron densities that enter our machine-learning model, with the additional advantage of avoiding arbitrary basis set optimization procedures on the machine-learning side. This enhanced framework leverages the transferability of our symmetry-adapted regression method and is capable of learning the all-electron density across a vast spectrum of 2291 chemically diverse dimers formed by sidechain–sidechain interactions extracted from the BioFragment Database (BFDb).73 The performance of the method is demonstrated through the reproduction of ρ(r) between and within each monomer forming the dimers. The accuracy of the predicted densities is assessed by computing density-based scalar fields and electrostatic potentials, while the errors made with respect to the reference densities are computed by direct integration on three-dimensional grids. As a major breakthrough, the model is used to predict the charge density of a set of 8 polypeptides (∼100 atoms) at DFT accuracy in a few minutes.

2. Methods

Gaussian process regression (GPR) can be extended to encode all the fundamental symmetries of the O(3) group, effectively allowing machine-learning of all the molecular properties that transform as spherical tensors under rotation and inversion operations.57,74 In the specific case of the electron density, the scheme relies upon the decomposition of the field into additive, atom-centered contributions and the subsequent prediction of the corresponding expansion coefficients.58 In SA-GPR, each molecule is represented as a collection of atom-centered environments, whose relationships and similarities are measured by symmetry adapted kernels. An in-depth discussion about how a symmetry adapted regression model of the electron density can be constructed is reported in the ESI.
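As a schematic illustration of the regression step, the sketch below shows a scalar (λ = 0) subset-of-regressors fit of expansion coefficients against M reference environments. The kernel, features, and data are toy stand-ins (not the actual SOAP-based symmetry-adapted kernels), and the tensorial λ > 0 channels of the full SA-GPR scheme are omitted for brevity.

```python
import numpy as np

def fit_coefficients(K_MM, K_NM, c_train, reg=1e-8):
    """Sparse-GPR (subset-of-regressors) fit of density coefficients.

    K_MM    : kernel between the M reference environments
    K_NM    : kernel between the N training environments and the M references
    c_train : target expansion coefficients for the N training environments
    Solves the normal equations (K_MN K_NM + reg * K_MM) w = K_MN c.
    """
    A = K_NM.T @ K_NM + reg * K_MM
    b = K_NM.T @ c_train
    return np.linalg.solve(A, b)

def predict_coefficients(K_TM, weights):
    """Predict coefficients for test environments from the reference weights."""
    return K_TM @ weights

# --- toy usage with random 'environment features' (purely illustrative) ---
rng = np.random.default_rng(0)
X_train = rng.normal(size=(50, 5))
X_ref = rng.normal(size=(10, 5))
X_test = rng.normal(size=(5, 5))
kernel = lambda A, B: np.exp(-0.5 * np.linalg.norm(A[:, None] - B[None], axis=-1) ** 2)
c_train = rng.normal(size=50)

w = fit_coefficients(kernel(X_ref, X_ref), kernel(X_train, X_ref), c_train)
c_pred = predict_coefficients(kernel(X_test, X_ref), w)
```

The reference-environment subset plays the same dimensionality-reduction role as the M environments discussed in the Results section.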

The decomposition of the electron density in continuous atom-centered basis functions is the cornerstone of the scalability and transferability of our SA-GPR model. Besides being generally desirable, these properties are actually crucial to accurately describe the chemical diversity present in the BioFragment database within a reasonable computational cost. On the other hand, the projection of the density field onto a basis set leads to an additional error on top of that which can be ascribed to machine learning. In practice, all the efforts placed into achieving a negligible machine-learning error are futile if the overall accuracy of the model is dictated by a large basis set decomposition error.

Standard quantum chemical basis sets are generally optimized to closely reproduce the behavior of atomic orbitals75 and result in unacceptable errors if used to decompose the electron density (Fig. 1). In contrast, specialized basis sets used in the density fitting approximation (also known as the resolution-of-the-identity (RI) approximation)76–82 are specifically optimized to represent a linear expansion of one-electron charge densities obtained from the product of atomic orbitals. Using the RI-auxiliary basis sets {ϕRIk}, the total electron density field can be expressed as:

$$\rho(\mathbf{r}) \;=\; \sum_{ab} D_{ab}\,\phi_a(\mathbf{r})\,\phi_b(\mathbf{r}) \;\approx\; \sum_k c_k\,\phi_k^{\mathrm{RI}}(\mathbf{r}), \qquad c_k = \sum_{ab} D_{ab}\, d_{ab}^{k} \tag{1}$$

where D_ab is the one-electron reduced density matrix and d^k_ab are the RI-expansion coefficients. Given a molecular geometry, the value of each basis function can be readily computed at any point in space, leaving the c_k expansion coefficients as the only ingredient needed by the machine-learning model to fully determine ρ(r) (more details in the ESI).
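To illustrate how predicted c_k coefficients determine ρ(r) in practice, a minimal evaluation sketch is given below. Purely for illustration it assumes s-type Gaussian auxiliary functions, one per atom, whereas the actual cc-pVQZ-RI set carries many functions per atom with angular momenta up to l = 5.

```python
import numpy as np

def density_on_grid(coords, coeffs, alphas, grid):
    """Evaluate rho(r) = sum_k c_k * phi_k(r) on a set of grid points.

    coords : (n_atoms, 3) atomic positions
    coeffs : (n_atoms,) expansion coefficients c_k (one s-function per atom here)
    alphas : (n_atoms,) Gaussian exponents
    grid   : (n_points, 3) evaluation points
    Each 'auxiliary function' is an L2-normalized s-type Gaussian; this is an
    illustrative stand-in for a real RI basis.
    """
    rho = np.zeros(len(grid))
    for R, c, a in zip(coords, coeffs, alphas):
        norm = (2.0 * a / np.pi) ** 0.75          # s-Gaussian normalization
        r2 = np.sum((grid - R) ** 2, axis=1)
        rho += c * norm * np.exp(-a * r2)
    return rho
```

Because the basis is atom-centered, the cost of evaluating ρ(r) scales linearly with system size, which underpins the transferability argument made below.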

Fig. 1. (left) Decomposition error of the electron density of a single water molecule: evolution of the absolute percentage error depending on the choice of decomposition basis set. (right) Comparison of the density error made with the standard and the RI-auxiliary cc-pVQZ basis set (cyan and orange isosurfaces refer to an error of ±0.005 bohr−3). Reference density: PBE/cc-pVQZ.


As shown in Fig. 1, using the RI-auxiliary basis sets improves the overall accuracy by nearly two orders of magnitude with respect to the corresponding standard basis set. The addition of diffuse functions marginally improves the performance of the decomposition, but leads to instabilities of the overlap matrix (high condition number) and dramatically increases the number of basis functions per atom.

In practice, Weigend's cc-pVQZ/JKFIT81 basis set (henceforth: cc-pVQZ-RI) offers the best trade-off between accuracy and computational demand and therefore represents the best choice for the density decomposition.
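The decomposition step itself amounts to solving a small linear system. A hedged sketch is shown below, assuming the auxiliary-function metric S and the projections ⟨ϕ_k|ρ⟩ have already been obtained from a quantum chemistry code; the conditioning guard mirrors the diffuse-function instability noted above.

```python
import numpy as np

def fit_density(S, w, cond_max=1e10):
    """Solve the density-fitting normal equations S c = w for the coefficients.

    S : metric matrix between auxiliary functions (overlap or Coulomb)
    w : projections <phi_k | rho> of the target density onto the auxiliary set
    Raises if the metric is ill-conditioned, as happens when very diffuse
    functions are added to the auxiliary basis.
    """
    cond = np.linalg.cond(S)
    if cond > cond_max:
        raise ValueError(f"ill-conditioned metric (cond = {cond:.2e})")
    return np.linalg.solve(S, w)

# toy 2-function example: fitting an exactly representable density
S = np.array([[1.0, 0.5], [0.5, 1.0]])
c_true = np.array([2.0, -1.0])
c = fit_density(S, S @ c_true)   # recovers c_true
```

In production codes the Coulomb metric is usually preferred for RI-JK, but the linear-algebra structure is the same.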

2.1. Computational details

The dataset of molecular dimers has been selected from the side-chain–side-chain interaction (SSI) subset of the BioFragment database (BFDb).73 The original set consists of 3380 dimers formed by amino-acid side-chain fragments taken from 47 different protein structures. Dimers with more than 25 atoms, as well as those containing sulfur atoms, were not considered: the total number of sulfur-containing structures is too small for the machine-learning model to accurately capture the rich chemistry of sulfur, while the inclusion of the larger systems does not substantially increase the chemical diversity of the dataset. The final dataset contains a total of 2291 dimers.
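The selection criteria above can be sketched as a simple filter. The dimer container format (a dict with an "elements" list of symbols) is hypothetical; only the criteria themselves come from the text.

```python
def select_dimers(dimers, max_atoms=25, excluded_elements=frozenset({"S"})):
    """Filter SSI dimers following the criteria described in the text.

    Drops dimers with more than `max_atoms` atoms and dimers containing
    any excluded element (here sulfur).
    """
    kept = []
    for d in dimers:
        if len(d["elements"]) > max_atoms:
            continue
        if excluded_elements & set(d["elements"]):
            continue
        kept.append(d)
    return kept

# toy usage: one sulfur-containing dimer, one oversized dimer, one valid dimer
dimers = [
    {"elements": ["C", "H", "H", "H", "S", "H"]},
    {"elements": ["C"] * 30},
    {"elements": ["O", "H", "H"]},
]
kept = select_dimers(dimers)
```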

As shown in Fig. 2, the complete set of 2291 dimers spans a large variety of dominant interaction types, ranging from purely dispersion dominated complexes (in blue) to mixed-influence (green and yellow) to hydrogen-bonded and charged systems (red). We retain the same classification criteria as in the original database to attribute the nature of the dominant interaction.

Fig. 2. Ternary diagram representation of the attractive components of the dimer interaction energies for the 2291 systems considered in this work. The values of the SAPT analysis are taken from ref. 73.


For each dimer, the reference full-electron density has been computed at the ωB97X-D/cc-pVQZ level using the resolution of identity approximation for the Coulomb and exchange potential (RI-JK). This implies that RI-auxiliary functions up to l = 5 are included for carbon, nitrogen and oxygen atoms while auxiliary functions up to l = 4 are used for hydrogen atoms.

3. Results and discussion

The training set for the density-learning model was chosen by randomly picking 2000 dimers out of a total of 2291 possibilities. The remaining 291 were used to test the accuracy of the predictions. Given the tremendous number of possible atomic environments (∼40 000) associated with such a chemically diverse database, a subset of M reference environments was selected to reduce the dimensionality of the regression problem (see ESI). To assess the consequences of this dimensionality reduction, the learning exercise was performed on three different sizes M = {100, 500, 1000} for the reference atomic environments. Fig. 3 summarizes the performance of the machine learning algorithm, expressed in terms of the mean absolute difference between the predicted and the reference densities (QM). Here, only the machine-learning error is shown as the reference densities derive from the RI-expansion of the computed ab initio densities. Since the test set contains molecules of different sizes, the contribution of each dimer has been weighted considering the ratio between its number of electrons and the total number of electrons in the test set.

$$\varepsilon_\rho(\%) \;=\; 100 \sum_{i=1}^{291} \frac{N_e^i}{N_e}\; \frac{\int \left|\rho_i^{\mathrm{QM}}(\mathbf{r}) - \rho_i^{\mathrm{ML}}(\mathbf{r})\right| \mathrm{d}\mathbf{r}}{\int \rho_i^{\mathrm{QM}}(\mathbf{r})\, \mathrm{d}\mathbf{r}} \tag{2}$$

where the sum runs over the 291 dimers of the test set, N_e is the total number of electrons in the test set, N_e^i is the number of electrons in dimer i, and ρ_i^QM(r) and ρ_i^ML(r) are, respectively, the ab initio and the predicted density amplitudes at point r. Both integrals of eqn (2) are evaluated in real space over a cubic grid with a step size of 0.1 bohr in all directions and at least 6 Å between any atom and the cube border.
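The error measure of eqn (2) can be sketched as follows, assuming the per-dimer reference and predicted densities are already available as arrays on a common cubic grid with point volume dV:

```python
import numpy as np

def weighted_density_error(rho_qm_list, rho_ml_list, dV):
    """Electron-weighted mean absolute percentage density error, eqn (2).

    rho_qm_list / rho_ml_list : per-dimer density amplitudes on a cubic grid
    dV : volume per grid point (step^3; 0.1^3 bohr^3 in the text)
    Each dimer's relative error is weighted by N_e^i / N_e, where N_e^i is
    obtained by integrating rho_i^QM on the grid.
    """
    n_e = [np.sum(r) * dV for r in rho_qm_list]        # electrons per dimer
    n_tot = sum(n_e)
    err = 0.0
    for rq, rm, ne in zip(rho_qm_list, rho_ml_list, n_e):
        abs_int = np.sum(np.abs(rq - rm)) * dV          # integral |rho_QM - rho_ML|
        err += (ne / n_tot) * abs_int / (np.sum(rq) * dV)
    return 100.0 * err
```

For a uniform 1% overestimation of a single density, the metric returns exactly 1.0%, which makes it easy to sanity-check an implementation.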

Fig. 3. Learning curves with respect to RI-expanded densities (ML error). (left) Weighted mean absolute percentage error (ερ (%)) of the predicted SA-GPR densities as a function of the number of training dimers. The weights correspond to the number of electrons in each dimer and the normalization is defined by the total number of electrons. The color code reflects the number of reference environments. (right) ερ (%) of the predicted SA-GPR densities (M = 1000) divided per dominant contribution to the interaction energy according to ref. 73.


As shown in the first panel of Fig. 3, 100 training dimers are sufficient to reach saturation of the density error at around 0.5% for M = 100. This result already surpasses the accuracy reached in our previous work, which is remarkable given the large chemical diversity of the dataset and the use of all-electron densities. Learning curves obtained with M = 500 and M = 1000 show steeper slopes, approaching saturation at about 2000 training dimers with errors reduced to ∼0.2–0.3%. The predicted all-electron densities are five times more accurate than the previous predictions of valence-only densities (approximately 1%).58 A more detailed analysis of the M = 1000 learning curve reveals a strong dependence on the nature of the dominant interaction (Fig. 3). Specifically, a stronger non-local character of the interaction yields a larger error. This is especially pronounced for dimers dominated by electrostatic interactions (i.e., hydrogen bonds, charged systems), which show errors twice as large as those found in the other regimes.

The origin of this slow convergence is twofold. First, only about 20% of the dimers are dominantly bound by electrostatics;73 the priority of the regression model is thus to minimize the error on the other classes. Second, there is a fundamental dichotomy between the local nature of our symmetry-adapted learning scheme and the long-range nature of the interactions. The electron density encodes information about the whole chemical system at once, while the machine-learning model represents molecules as a collection of 4 Å wide atom-centered environments. This mismatch in the spatial reach of the information encoded in the target and in the representation is a limitation. In this respect, a global molecular representation, which includes the whole chemical system, would be more suitable, but it would imply renouncing the scalability and transferability of the model. Given a large enough training set, however, our SA-GPR model is able to capture the density deformations due to the field generated by the neighboring molecule. The reason is rooted in the intrinsic locality of density deformations and in the concept of “nearsightedness”83,84 of all local electronic properties, which constitutes a theoretical justification for a local decomposition of such quantities.

The fundamental advantage of setting the electron density as the machine-learning target is the broad spectrum of chemical properties that are directly derivable from ρ(r). For instance, the predicted charge densities are the key ingredient in density-dependent scalar fields aimed at visualizing and characterizing interactions between atoms and molecules in real space. Examples of the density overlap region indicator (DORI)30 are given in Fig. 4 for representative dimers. Compared to the rather featureless ρ(r), DORI reveals fine details of the electronic structure, which constitute a more sensitive probe for the quality of the machine-learning predictions. In particular, it reveals density overlaps (or clashes) associated with bonding and non-covalent regions on equal footing through the behavior of the local wave-vector (∇ρ(r)/ρ(r)).85–87
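As a rough illustration of how such a scalar field can be evaluated from a density on a grid, below is a finite-difference sketch of DORI, taking the indicator as θ/(1 + θ) with θ = |∇(k²)|²/(k²)³ and k = ∇ρ/ρ. A production implementation would use analytic derivatives of the basis-set expansion rather than numerical gradients, which are used here only for compactness.

```python
import numpy as np

def dori(rho, spacing):
    """Finite-difference DORI on a cubic grid.

    rho     : 3D array of density amplitudes (assumed strictly positive)
    spacing : grid step in each direction
    Computes k^2 = |grad rho / rho|^2, then theta = |grad(k^2)|^2 / (k^2)^3,
    and returns DORI = theta / (1 + theta), bounded in [0, 1).
    """
    eps = 1e-30                                  # guards against division by zero
    g = np.gradient(rho, spacing)                # components of grad rho
    k2 = sum(gi ** 2 for gi in g) / (rho ** 2 + eps)
    gk2 = np.gradient(k2, spacing)
    theta = sum(gi ** 2 for gi in gk2) / (k2 ** 3 + eps)
    return theta / (1.0 + theta)

# toy usage: DORI of a single Gaussian 'density'
xs = np.linspace(-2.0, 2.0, 21)
X, Y, Z = np.meshgrid(xs, xs, xs, indexing="ij")
field = dori(np.exp(-(X**2 + Y**2 + Z**2)), xs[1] - xs[0])
```

The bounded [0, 1) range is what makes DORI convenient to render as the fixed-isovalue surfaces (0.9) used in Fig. 4.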

Fig. 4. DORI maps of representative dimers for each type of dominant interaction (DORI isovalue: 0.9). Isosurfaces are color-coded31 with sgn(λ2)ρ(r) in the range from attractive −0.02 a.u. (red) to repulsive 0.02 a.u. (blue). In particular, sgn(λ2)ρ(r) < 0 characterizes covalent bonds or strongly attractive NCIs (e.g. H-bonds); sgn(λ2)ρ(r) ∼ 0 indicates weak attractive interactions (van der Waals); sgn(λ2)ρ(r) > 0 repulsive NCIs (e.g. steric clashes).


As shown in Fig. 4, the intra- and intermolecular DORI domains obtained with the SA-GPR densities are indistinguishable from those in the ab initio maps. This performance is especially impressive for the density clashes associated with low-density values, as is typical for the non-covalent domains. All the features are well captured by the predicted densities ranging from large and delocalized basins typical of the van der Waals complexes (in green) to the compact and directional domains typical of electrostatic interactions to intramolecular steric clashes (e.g. phenol, mixed regime). A quantitative measure of the DORI accuracy for the most characteristic basin of each type of interaction is reported in the ESI. Overall, these results illustrate that the residual 0.2% mean absolute percentage error does not significantly affect the density amplitude in the valence and intermolecular regions that are accurately described by the SA-GPR model. The highest amplitude errors are concentrated near the nuclei in the region dominated by the core-density fluctuations.

The versatility of the machine-learning prediction is further illustrated by using the predicted densities to compute the molecular electrostatic potential (ESP) for the same representative dimers (Fig. 5). ESP maps based on predicted densities agree quantitatively with the ab initio reference and correctly attribute the sign and magnitude of the electrostatic potential in all regions of space. Importantly, the accuracy of the ESP magnitude remains largely independent of the dominant interaction type. This is especially relevant for charged dimers (electrostatics) as it demonstrates that despite slower convergence of the learning curve for this category, the achieved accuracy of the model is sufficient to describe the key features of the electrostatic potential.

Fig. 5. Electrostatic potential (ESP) maps of representative dimers for each type of dominant interaction (density isovalue: 0.05 e bohr−3). The ESP is given in Hartree atomic units (a.u.).


The most widespread applications of ESP maps exploit qualitative information (e.g., identification of the molecular regions most prone to electrophilic/nucleophilic attack), but electrostatic potentials can also be related to quantitative properties such as the degree of acidity of hydrogen bonds and the magnitude of binding energies.88–92 As a concrete example related to structure-based drug design, we used a recent model that estimates the strength of the stacking interactions between heterocycles and aromatic amino acid side-chains directly from the ESP maps.88,91,93 This model derives the stacking energies of drug-like heterocycles from the maximum and mean value of their ESP within a surface delimited by the molecular van der Waals volume (at 3.25 Å above the molecular plane).88 Following this procedure, we used the ESP derived from the ML-predicted densities to compute the binding energies between a representative heterocycle included in our dataset, the tryptophan side-chain, and the three aromatic amino acid side-chains (Fig. 6).
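The descriptor-extraction step can be sketched as follows. As a stand-in for the ML-predicted density, the ESP here is generated from atomic point charges, and the grid spacing, padding, and units are arbitrary choices of this illustration; only the plane height and the van der Waals footprint restriction follow the model of ref. 88.

```python
import numpy as np

def esp_plane_descriptors(coords, charges, vdw_radii, z_plane=3.25, spacing=0.25):
    """Max and mean ESP on a plane above a planar molecule.

    coords    : (n_atoms, 3) positions (molecule assumed roughly in z = 0 plane)
    charges   : (n_atoms,) point charges standing in for the full density
    vdw_radii : (n_atoms,) van der Waals radii delimiting the sampled footprint
    Keeps only plane points whose xy-position falls inside the vertical
    projection of some atom's van der Waals sphere, then returns
    (max ESP, mean ESP) over those points.
    """
    x = np.arange(coords[:, 0].min() - 3, coords[:, 0].max() + 3, spacing)
    y = np.arange(coords[:, 1].min() - 3, coords[:, 1].max() + 3, spacing)
    X, Y = np.meshgrid(x, y)
    pts = np.stack([X.ravel(), Y.ravel(), np.full(X.size, z_plane)], axis=1)
    dxy = np.linalg.norm(pts[:, None, :2] - coords[None, :, :2], axis=2)
    mask = np.any(dxy <= vdw_radii[None, :], axis=1)
    d = np.linalg.norm(pts[mask][:, None, :] - coords[None, :, :], axis=2)
    esp = (charges[None, :] / d).sum(axis=1)
    return esp.max(), esp.mean()
```

With a density-derived ESP, the same footprint-and-plane bookkeeping applies; only the per-point potential evaluation changes.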

Fig. 6. (left) Electrostatic potential maps 3.25 Å above the plane of the tryptophan (TRP) side-chain. The van der Waals volume of TRP is represented in transparency. The color code represents the electrostatic potential in kcal mol−1 according to the scale chosen in ref. 88. (right) Stacking interaction energies of TRP with the phenylalanine (PHE), tyrosine (TYR) and tryptophan (TRP) side-chains computed as detailed in ref. 88 on the basis of the ab initio (top) and ML-predicted (bottom) ESP.


Comparison between ab initio and ML-predicted stacking interaction energies shows that the deviations in the ESP maps lead to minor errors, on the order of 0.05 kcal mol−1. The largest deviations in the ESP appear further away from the molecule, beyond the region exploited for the computation of the energy descriptors (i.e., the sum of the atomic van der Waals radii). The predicted ESP shows larger relative deviations far from the nuclei owing to the propagation of the error in the density ρ(r) to the electrostatic potential ϕ(r). This is best understood in reciprocal space, where the deviation of the potential at a given wave-vector k is related to the density error by δϕ(k) = 4πδρ(k)/k2. Because of the k−2 scaling, the error on ϕ(k) increases as k → 0, implying that larger relative errors of the electrostatic potential are expected in regions of space where ϕ(r) is slowly varying (i.e., determined by the long-wavelength components).
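The k−2 amplification can be demonstrated numerically. The sketch below applies δϕ(k) = 4πδρ(k)/k2 to a density error on a periodic cubic grid (the k = 0 component is removed); the non-periodic molecular case differs in boundary treatment, but the scaling argument is the same.

```python
import numpy as np

def potential_error_spectrum(delta_rho, box):
    """Propagate a density error to the potential in reciprocal space.

    delta_rho : 3D array of density-error amplitudes on an n^3 periodic grid
    box       : edge length of the cubic cell
    Implements delta_phi(k) = 4*pi*delta_rho(k) / k^2, with the k = 0
    component set to zero.
    """
    n = delta_rho.shape[0]
    k1d = 2.0 * np.pi * np.fft.fftfreq(n, d=box / n)   # angular wave-vectors
    kx, ky, kz = np.meshgrid(k1d, k1d, k1d, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    drho_k = np.fft.fftn(delta_rho)
    dphi_k = np.zeros_like(drho_k)
    nz = k2 > 0
    dphi_k[nz] = 4.0 * np.pi * drho_k[nz] / k2[nz]
    return dphi_k

# usage: equal-amplitude density errors at low and high wave-vector
n, box = 16, 1.0
x = np.arange(n) * box / n
X = np.meshgrid(x, x, x, indexing="ij")[0]
p_low = np.max(np.abs(potential_error_spectrum(np.cos(2 * np.pi * X / box), box)))
p_high = np.max(np.abs(potential_error_spectrum(np.cos(8 * np.pi * X / box), box)))
```

A density-error mode at one quarter of the wave-vector produces a potential error sixteen times larger, as expected from the k−2 factor.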

3.1. Prediction on polypeptides

The tremendous advantage of the atom-centered density decomposition is to deliver a machine-learning model that depends only on the different atomic environments and not on the identity of the molecules included in the training set. Thanks to its transferability, the model provides access to density information of large macromolecules, at the sole price of including enough chemical diversity in the training set to capture the complexity of the larger system. The predictive power of this extrapolation procedure is demonstrated by using the machine-learning model exclusively trained on the 2291 BFDb dimers to predict the electron density of 8 polypeptides taken from the Protein DataBank (PDB).94 The performance of the ML model for each macromolecule, labeled by its PDB ID, is reported in Fig. 8.

Fig. 8. Weighted mean absolute percentage error (ερ (%)) with respect to ωB97X-D/cc-pVQZ densities of the predicted densities extrapolated for 8 biologically relevant peptides (protein databank ID).


Overall, the predictions lead to a low average error of only 1.5% for the 8 polypeptides, which is in line with the highest density errors obtained on the BFDb test set. Notably, the largest discrepancies are obtained for 3WNE, which is the only cyclopeptide of the set. The origin of these differences can be understood by performing a more detailed analysis of a representative polypeptide, leu-enkephalin (4OLR). Errors in this percentage range do not affect the density-based properties, such as the spatial analysis of the non-covalent interactions with scalar fields (Fig. 7, top right panel). Yet, the density differences indicate that the highest absolute errors occur along the amino acid backbone (Fig. 7, lower panels). In addition, the analysis of the relative error with the Walker–Mezey L(a,a′) index34 shows the highest similarity at the core (99.3%), slowly decreasing while approaching the non-covalent domain (96.3%) (Fig. 7, top left panel). The L(a,a′) index complements the density-difference information by showing that the actual density amplitudes and the prediction error do not decrease at the same rate. Nevertheless, the loss of relative accuracy remains modest and the quality of the density is mainly governed by the predictions along the peptide backbone, which are especially sensitive for the more strained 3WNE cyclopeptide. Although similar chemical environments were included in the training set, the error is mainly determined by the absence of an explicit peptide bond motif and of cyclopeptides in the training data. While this limitation could be addressed by ad hoc modification of the training set, the overall performance of the machine-learning model is rather exceptional: it delivers electron densities of DFT quality for large and complex molecular systems in only a few minutes instead of almost a day (about 500 times faster for, e.g., enkephalin with the same functional and basis set).
For comparison, the superposition of atomic densities (i.e., the promolecular approach), which has been used to qualitatively analyze non-covalent interactions in peptides and proteins (e.g., ref. 32), leads to much larger mean absolute percentage errors (17 times higher, see Fig. S1 in the ESI).

Fig. 7. (top left) Predicted electron density of enkephalin (PDB ID: 4OLR) at three isovalues: 0.5, 0.1, and 0.001 e bohr−3. For each isosurface, the L(a,a′) similarity index with respect to the ab initio density is reported. (top right) DORI map of enkephalin (DORI isovalue: 0.9) colored by sgn(λ2)ρ(r) in the range from −0.02 a.u. (red) to 0.02 a.u. (blue). (lower left) Density difference between the predicted and ab initio electron density (isovalues ±0.01 e bohr−3). (lower right) Density difference between the predicted and ab initio electron density of 3WNE (isovalues ±0.01 e bohr−3).


4. Conclusion

Given its central role in electronic structure methods, the total electron density is a very promising target for machine learning, since accurate predictions of ρ(r) give access to all the information needed to characterize a chemical system. Among the many possible properties that can be computed from the electron density, the patterns arising from non-covalent interactions constitute a particular challenge for machine learning models owing to their long-range nature and subtle physical origin. An effective ML model should be transferable across different systems, efficient in learning from relatively small training sets, and accurate in predicting ρ(r) both in the quickly varying region around the atomic nuclei and in the density tails, and – crucially for the study of non-covalent interactions – in those regions that are characterized by low densities and low density gradients. In this work, we have presented a model that fulfills all of these requirements, based on an atom-centered decomposition of the density with a quadruple-zeta resolution-of-identity basis set, a symmetry-adapted Gaussian process regression ML scheme, and training on a diverse database of 2000 sidechain–sidechain dimers extracted from the BioFragment database.

The model reaches a 0.3% accuracy on a validation set, which is sufficient to investigate density-based fingerprints of NCIs and to evaluate the electrostatic potential accurately enough to quantitatively estimate residue–residue interactions. The transferability of the model is demonstrated by predicting, at a cost that is orders of magnitude smaller than that of explicit electronic structure calculations, the electron density for a demonstrative set of oligopeptides, with an accuracy sufficient to reliably visualize bonding patterns and non-covalent domains using the DORI scalar field. Even though the model reaches an impressive accuracy (0.5% mean absolute percentage error) for dimers that are predominantly bound by electrostatic interactions, the comparatively larger error suggests that future work should focus on resolving the dichotomy between the local machine-learning framework and the long-range nature of the intermolecular interactions.

Conflicts of interest

There are no conflicts to declare.

Supplementary Material

SC-010-C9SC02696G-s001

Acknowledgments

The National Centre of Competence in Research (NCCR) “Materials' Revolution: Computational Design and Discovery of Novel Materials (MARVEL)” of the Swiss National Science Foundation (SNSF) and the EPFL are acknowledged for financial support.

Electronic supplementary information (ESI) available. See DOI: 10.1039/c9sc02696g

Notes and references

  1. A. Stone, The Theory of Intermolecular Forces, Oxford University Press, 2013.
  2. A. D. Buckingham, P. W. Fowler and J. M. Hutson, Chem. Rev., 1988, 88, 963–988, doi: 10.1021/cr00088a008.
  3. A. Castleman Jr and P. Hobza, Chem. Rev., 1994, 94, 1721–1722, doi: 10.1021/cr00031a600.
  4. B. Brutschy and P. Hobza, Chem. Rev., 2000, 100, 3861–3862, doi: 10.1021/cr990074x.
  5. P. Hobza and J. Řezáč, Chem. Rev., 2016, 116, 4911–4912, doi: 10.1021/acs.chemrev.6b00247.
  6. E. Pastorczak and C. Corminboeuf, J. Chem. Phys., 2017, 146, 120901, doi: 10.1063/1.4978951.
  7. R. Parr and Y. Weitao, Density-Functional Theory of Atoms and Molecules, Oxford University Press, 1994.
  8. W. Kohn and L. J. Sham, Phys. Rev., 1965, 140, A1133–A1138, doi: 10.1103/PhysRev.140.A1133.
  9. A. J. Cohen, P. Mori-Sánchez and W. Yang, Chem. Rev., 2012, 112, 289–320, doi: 10.1021/cr200107z.
  10. A. D. Becke, J. Chem. Phys., 2014, 140, 18A301, doi: 10.1063/1.4869598.
  11. N. Mardirossian and M. Head-Gordon, Mol. Phys., 2017, 115, 2315–2372, doi: 10.1080/00268976.2017.1333644.
  12. D. Porezag and M. R. Pederson, Phys. Rev. B: Condens. Matter Mater. Phys., 1996, 54, 7830–7836, doi: 10.1103/PhysRevB.54.7830.
  13. M. K. Gilson and B. H. Honig, Nature, 1987, 330, 84–86, doi: 10.1038/330084a0.
  14. S. Mecozzi, A. P. West and D. A. Dougherty, Proc. Natl. Acad. Sci. U. S. A., 1996, 93, 10566–10571, doi: 10.1073/pnas.93.20.10566.
  15. T. Sagara, J. Klassen and E. Ganz, J. Chem. Phys., 2004, 121, 12543, doi: 10.1063/1.1809608.
  16. S. Cardamone, T. J. Hughes and P. L. A. Popelier, Phys. Chem. Chem. Phys., 2014, 16, 10367, doi: 10.1039/C3CP54829E.
  17. P. L. Polavarapu, J. Phys. Chem., 1990, 94, 8106–8112, doi: 10.1021/j100384a024.
  18. J. L. P. Hughes and J. E. Sipe, Phys. Rev. B: Condens. Matter Mater. Phys., 1996, 53, 10751–10763, doi: 10.1103/PhysRevB.53.10751.
  19. J. E. Sipe and A. I. Shkrebtii, Phys. Rev. B: Condens. Matter Mater. Phys., 2000, 61, 5337–5352, doi: 10.1103/PhysRevB.61.5337.
  20. S. Sharma and C. Ambrosch-Draxl, Phys. Scr., T, 2004, 109, 128, doi: 10.1238/Physica.Topical.109a00128.
  21. Masunov A. E. Tannu A. Dyakov A. A. Matveeva A. D. Freidzon A. Y. Odinokov A. V. Bagaturyants A. A. J. Chem. Phys. 2017;146:244104. doi: 10.1063/1.4986793. [DOI] [PubMed] [Google Scholar]
  22. Koritsanszky T. S. Coppens P. Chem. Rev. 2001;101:1583–1628. doi: 10.1021/cr990112c. [DOI] [PubMed] [Google Scholar]
  23. Lecomte C. Guillot B. Muzet N. Pichon-Pesme V. Jelsch C. Cell. Mol. Life Sci. 2004;61:774–782. doi: 10.1007/s00018-003-3405-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
  24. Jayatilaka D. Dittrich B. Acta Crystallogr., Sect. A: Found. Crystallogr. 2008;64:383–393. doi: 10.1107/S0108767308005709. [DOI] [PubMed] [Google Scholar]
  25. Schnieders M. J. Fenn T. D. Pande V. S. Brunger A. T. Acta Crystallogr., Sect. D: Biol. Crystallogr. 2009;65:952–965. doi: 10.1107/S0907444909022707. [DOI] [PMC free article] [PubMed] [Google Scholar]
  26. Brunger A. and Adams P., Comprehensive Biophysics, Elsevier, 2012, pp. 105–115 [Google Scholar]
  27. Gatti C. and Macchi P., Modern Charge-Density Analysis, Springer Netherlands, Dordrecht, 2012 [Google Scholar]
  28. Bader R. F. W. Chem. Rev. 1991;91:893–928. doi: 10.1021/cr00005a013. [DOI] [Google Scholar]
  29. Bader R., The Quantum Theory of Atoms in Molecules, Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim, Germany, 2007 [Google Scholar]
  30. de Silva P. Corminboeuf C. J. Chem. Theory Comput. 2014;10:3745–3756. doi: 10.1021/ct500490b. [DOI] [PMC free article] [PubMed] [Google Scholar]
  31. Johnson E. R. Keinan S. Mori-Sánchez P. Contreras-García J. Cohen A. J. Yang W. J. Am. Chem. Soc. 2010;132:6498–6506. doi: 10.1021/ja100936w. [DOI] [PMC free article] [PubMed] [Google Scholar]
  32. Contreras-García J. Johnson E. R. Keinan S. Chaudret R. Piquemal J.-P. Beratan D. N. Yang W. J. Chem. Theory Comput. 2011;7:625–632. doi: 10.1021/ct100641a. [DOI] [PMC free article] [PubMed] [Google Scholar]
  33. Walker P. D. Mezey P. G. J. Am. Chem. Soc. 1993;115:12423–12430. [Google Scholar]
  34. Walker P. D. Mezey P. G. J. Am. Chem. Soc. 1994;116:12022–12032. doi: 10.1021/ja00105a050. [DOI] [Google Scholar]
  35. Exner T. E. Mezey P. G. J. Phys. Chem. A. 2002;106:11791–11800. doi: 10.1021/jp0263166. [DOI] [Google Scholar]
  36. Exner T. E. Mezey P. G. J. Comput. Chem. 2003;24:1980–1986. doi: 10.1002/jcc.10340. [DOI] [PubMed] [Google Scholar]
  37. Szekeres Z. Exner T. Mezey P. G. Int. J. Quantum Chem. 2005;104:847–860. doi: 10.1002/qua.20616. [DOI] [Google Scholar]
  38. Stoll H. Wagenblast G. Preuβ H. Theor. Chim. Acta. 1980;57:169–178. doi: 10.1007/BF00574903. [DOI] [Google Scholar]
  39. Meyer B. Guillot B. Ruiz-Lopez M. F. Genoni A. J. Chem. Theory Comput. 2016;12:1052–1067. doi: 10.1021/acs.jctc.5b01007. [DOI] [PubMed] [Google Scholar]
  40. Meyer B. Guillot B. Ruiz-Lopez M. F. Jelsch C. Genoni A. J. Chem. Theory Comput. 2016;12:1068–1081. doi: 10.1021/acs.jctc.5b01008. [DOI] [PubMed] [Google Scholar]
  41. Meyer B. Genoni A. J. Phys. Chem. A. 2018;122:8965–8981. doi: 10.1021/acs.jpca.8b09056. [DOI] [PubMed] [Google Scholar]
  42. Hirshfeld F. L. Acta Crystallogr., Sect. B: Struct. Crystallogr. Cryst. Chem. 1971;27:769–781. doi: 10.1107/S0567740871002905. [DOI] [Google Scholar]
  43. Stewart R. F. Acta Crystallogr., Sect. A: Found. Crystallogr. 1976;32:565–574. doi: 10.1107/S056773947600123X. [DOI] [Google Scholar]
  44. Hansen N. K. Coppens P. Acta Crystallogr., Sect. A: Found. Crystallogr. 1978;34:909–921. doi: 10.1107/S0567739478001886. [DOI] [Google Scholar]
  45. Pichon-Pesme V. Lecomte C. Lachekar H. J. Phys. Chem. 1995;99:6242–6250. doi: 10.1021/j100016a071. [DOI] [Google Scholar]
  46. Jelsch C. Pichon-Pesme V. Lecomte C. Aubry A. Acta Crystallogr., Sect. D: Biol. Crystallogr. 1998;54:1306–1318. doi: 10.1107/S0907444998004466. [DOI] [PubMed] [Google Scholar]
  47. Zarychta B. Pichon-Pesme V. Guillot B. Lecomte C. Jelsch C. Acta Crystallogr., Sect. A: Found. Crystallogr. 2007;63:108–125. doi: 10.1107/S0108767306053748. [DOI] [PubMed] [Google Scholar]
  48. Lecomte C. Jelsch C. Guillot B. Fournier B. Lagoutte A. J. Synchrotron Radiat. 2008;15:202–203. doi: 10.1107/S0909049508000447. [DOI] [PMC free article] [PubMed] [Google Scholar]
  49. Domagala S. Munshi P. Ahmed M. Guillot B. Jelsch C. Acta Crystallogr., Sect. B: Struct. Crystallogr. Cryst. Chem. 2011;67:63–78. doi: 10.1107/S0108768110041996. [DOI] [PubMed] [Google Scholar]
  50. Domagala S. Fournier B. Liebschner D. Guillot B. Jelsch C. Acta Crystallogr., Sect. A: Found. Crystallogr. 2012;68:337–351. doi: 10.1107/S0108767312008197. [DOI] [PubMed] [Google Scholar]
  51. Koritsanszky T. Volkov A. Coppens P. Acta Crystallogr., Sect. A: Found. Crystallogr. 2002;58:464–472. doi: 10.1107/S0108767302010991. [DOI] [PubMed] [Google Scholar]
  52. Dominiak P. M. Volkov A. Li X. Messerschmidt M. Coppens P. J. Chem. Theory Comput. 2007;3:232–247. doi: 10.1021/ct6001994. [DOI] [PubMed] [Google Scholar]
  53. Dittrich B. Koritsánszky T. Luger P. Angew. Chem., Int. Ed. 2004;43:2718–2721. doi: 10.1002/anie.200353596. [DOI] [PubMed] [Google Scholar]
  54. Hathwar V. R. Thakur T. S. Row T. N. G. Desiraju G. R. Cryst. Growth Des. 2011;11:616–623. doi: 10.1021/cg101540y. [DOI] [Google Scholar]
  55. Brockherde F. Vogt L. Li L. Tuckerman M. E. Burke K. Müller K.-R. Nat. Commun. 2017;8:872. doi: 10.1038/s41467-017-00839-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
  56. Bogojeski M., Brockherde F., Vogt-Maranto L., Li L., Tuckerman M. E., Burke K. and Müller K.-R., arXiv:1811.06255, 2018
  57. Grisafi A. Wilkins D. M. Csányi G. Ceriotti M. Phys. Rev. Lett. 2018;120:036002. doi: 10.1103/PhysRevLett.120.036002. [DOI] [PubMed] [Google Scholar]
  58. Grisafi A. Fabrizio A. Meyer B. Wilkins D. M. Corminboeuf C. Ceriotti M. ACS Cent. Sci. 2019;5:57–64. doi: 10.1021/acscentsci.8b00551. [DOI] [PMC free article] [PubMed] [Google Scholar]
  59. Alred J. M. Bets K. V. Xie Y. Yakobson B. I. Compos. Sci. Technol. 2018;166:3–9. doi: 10.1016/j.compscitech.2018.03.035. [DOI] [Google Scholar]
  60. Chandrasekaran A. Kamal D. Batra R. Kim C. Chen L. Ramprasad R. npj Comput. Mater. 2019;5:22. doi: 10.1038/s41524-019-0162-7. [DOI] [Google Scholar]
  61. Fowler A. T. Pickard C. J. Elliott J. A. Journal of Physics: Materials. 2019;2:034001. [Google Scholar]
  62. Jurečka P. Šponer J. Černý J. Hobza P. Phys. Chem. Chem. Phys. 2006;8:1985–1993. doi: 10.1039/B600027D. [DOI] [PubMed] [Google Scholar]
  63. Zhao Y. Truhlar D. G. Acc. Chem. Res. 2008;41:157–167. doi: 10.1021/ar700111a. [DOI] [PubMed] [Google Scholar]
  64. Řezáč J. Riley K. E. Hobza P. J. Chem. Theory Comput. 2011;7:2427–2438. doi: 10.1021/ct2002946. [DOI] [PMC free article] [PubMed] [Google Scholar]
  65. Burns L. A. Vázquez-Mayagoitia Á. Sumpter B. G. Sherrill C. D. J. Chem. Phys. 2011;134:084107. doi: 10.1063/1.3545971. [DOI] [PubMed] [Google Scholar]
  66. Marshall M. S. Burns L. A. Sherrill C. D. J. Chem. Phys. 2011;135:194102. doi: 10.1063/1.3659142. [DOI] [PubMed] [Google Scholar]
  67. Smith D. G. A. Burns L. A. Patkowski K. Sherrill C. D. J. Phys. Chem. Lett. 2016;7:2197–2203. doi: 10.1021/acs.jpclett.6b00780. [DOI] [PubMed] [Google Scholar]
  68. Grimme S. Chem.–Eur. J. 2012;18:9955–9964. doi: 10.1002/chem.201200497. [DOI] [PubMed] [Google Scholar]
  69. Risthaus T. Grimme S. J. Chem. Theory Comput. 2013;9:1580–1591. doi: 10.1021/ct301081n. [DOI] [PubMed] [Google Scholar]
  70. Schneebeli S. T. Bochevarov A. D. Friesner R. A. J. Chem. Theory Comput. 2011;7:658–668. doi: 10.1021/ct100651f. [DOI] [PMC free article] [PubMed] [Google Scholar]
  71. Mardirossian N. Head-Gordon M. J. Chem. Phys. 2016;144:214110. doi: 10.1063/1.4952647. [DOI] [PubMed] [Google Scholar]
  72. McGibbon R. T. Taube A. G. Donchev A. G. Siva K. Hernández F. Hargus C. Law K.-H. Klepeis J. L. Shaw D. E. J. Chem. Phys. 2017;147:161725. doi: 10.1063/1.4986081. [DOI] [PubMed] [Google Scholar]
  73. Burns L. A. Faver J. C. Zheng Z. Marshall M. S. Smith D. G. Vanommeslaeghe K. MacKerell A. D. Merz K. M. Sherrill C. D. J. Chem. Phys. 2017;147:161727. doi: 10.1063/1.5001028. [DOI] [PMC free article] [PubMed] [Google Scholar]
  74. Grisafi A., Wilkins D. M., Willatt M. J. and Ceriotti M., arXiv:1904.01623, 2019
  75. Helgaker T., Jørgensen P. and Olsen J., Molecular Electronic-Structure Theory, John Wiley & Sons, Ltd, Chichester, UK, 2000 [Google Scholar]
  76. Whitten J. L. J. Chem. Phys. 1973;58:4496–4501. doi: 10.1063/1.1679012. [DOI] [Google Scholar]
  77. Dunlap B. I. Connolly J. W. D. Sabin J. R. Int. J. Quantum Chem., Symp. 1977;11:81–87. doi: 10.1002/qua.560110108. [DOI] [Google Scholar]
  78. Feyereisen M. Fitzgerald G. Komornicki A. Chem. Phys. Lett. 1993;208:359–363. doi: 10.1016/0009-2614(93)87156-W. [DOI] [Google Scholar]
  79. Rendell A. P. Lee T. J. J. Chem. Phys. 1994;101:400–408. doi: 10.1063/1.468148. [DOI] [Google Scholar]
  80. Eichkorn K. Treutler O. Öhm H. Häser M. Ahlrichs R. Chem. Phys. Lett. 1995;240:283–290. doi: 10.1016/0009-2614(95)00621-A. [DOI] [Google Scholar]
  81. Weigend F. Phys. Chem. Chem. Phys. 2002;4:4285–4291. doi: 10.1039/B204199P. [DOI] [Google Scholar]
  82. Werner H.-J. Manby F. R. Knowles P. J. J. Chem. Phys. 2003;118:8149–8160. doi: 10.1063/1.1564816. [DOI] [Google Scholar]
  83. Kohn W. Phys. Rev. Lett. 1996;76:3168–3171. doi: 10.1103/PhysRevLett.76.3168. [DOI] [PubMed] [Google Scholar]
  84. Prodan E. Kohn W. Proc. Natl. Acad. Sci. U. S. A. 2005;102:11635–11638. doi: 10.1073/pnas.0505436102. [DOI] [PMC free article] [PubMed] [Google Scholar]
  85. Nagy A. March N. H. Mol. Phys. 1997;90:271–276. doi: 10.1080/002689797172750. [DOI] [Google Scholar]
  86. Bohórquez H. J. Boyd R. J. J. Chem. Phys. 2008;129:024110. doi: 10.1063/1.2953698. [DOI] [PubMed] [Google Scholar]
  87. Nagy Á. Liu S. Phys. Lett. A. 2008;372:1654–1656. doi: 10.1016/j.physleta.2007.10.055. [DOI] [Google Scholar]
  88. Bootsma A. N., Doney A. C. and Wheeler S., chemrxiv.7628939.v4, 2019
  89. Murray J. S. Brinck T. Lane P. Paulsen K. Politzer P. J. Mol. Struct. 1994;307:55–64. doi: 10.1016/0166-1280(94)80117-7. [DOI] [Google Scholar]
  90. Murray J. S. Politzer P. J. Mol. Struct. 1998;425:107–114. doi: 10.1016/S0166-1280(97)00162-0. [DOI] [Google Scholar]
  91. Bootsma A. N. and Wheeler S., chemrxiv.8079890.v1, 2019
  92. Volkov A. Koritsanszky T. Coppens P. Chem. Phys. Lett. 2004;391:170–175. doi: 10.1016/j.cplett.2004.04.097. [DOI] [Google Scholar]
  93. Bootsma A. N. Wheeler S. E. J. Chem. Inf. Model. 2019;59:149–158. doi: 10.1021/acs.jcim.8b00563. [DOI] [PubMed] [Google Scholar]
  94. Berman H. M. Nucleic Acids Res. 2000;28:235–242. doi: 10.1093/nar/28.1.235. [DOI] [PMC free article] [PubMed] [Google Scholar]
