Abstract

It is tenable to argue that nobody can predict the future with certainty, yet one can learn from the past and make informed projections for the years ahead. In this Perspective, we review how theory and computation are currently exploited to obtain chemical understanding from wave function theory and density functional theory, and then offer an outlook on the likely impact of machine learning (ML) and quantum computers (QC) on how traditional chemical concepts will be appreciated in decades to come. It is maintained that the development and maturation of ML and QC methods in theoretical and computational chemistry represent two paradigm shifts in how the Schrödinger equation can be solved. New chemical understanding can be harnessed in these two new paradigms by making use of ML features and QC qubits, respectively. Before that happens, however, we still have hurdles to face and obstacles to overcome in both the ML and QC arenas. Possible pathways to tackle these challenges are proposed. We anticipate that hierarchical modeling, in contrast to multiscale modeling, will emerge and thrive, becoming the workhorse of in silico simulations in the next few decades.
Keywords: chemical concept, machine learning, quantum computer, wave function theory, density functional theory, multiscale modeling, hierarchical modeling
I. Introduction
Theoretical and computational chemistry employs physics methodologies to simulate the properties of chemical systems. It began with the application of quantum mechanics in the early 20th century to appreciate the behavior of atoms and molecules. The introduction of digital computers in the late 1950s revolutionized the numerical solution of the Schrödinger equation, making it possible to apply wave function theory (WFT)1,2 to polyatomic molecules. In the late 1980s, density functional theory (DFT)3,4 emerged as a rigorous yet efficient tool that bypasses solving the Schrödinger equation directly. Later, combining classical mechanics with quantum mechanics enabled multiscale modeling,5,6 which has become state-of-the-art, allowing us to simulate complex systems such as enzymes and macromolecular processes. Meanwhile, applying WFT and DFT to achieve a better understanding of traditional chemical concepts has been continuously pursued and fruitfully accomplished in terms of, e.g., FMO (frontier molecular orbital) theory7,8 and CDFT (conceptual DFT).3,9−12 It is generally accepted that theoretical and computational chemistry has now become a mature chemical discipline that enjoys widespread applications across the pharmaceutical, materials, and biological sciences. Nevertheless, to tackle the pressing challenges facing humankind in the coming decades in health, energy, the environment, etc., which often involve complex systems with multiple components working together, we still have a long way to go.
In the recent theoretical and computational chemistry literature,13−19 we have witnessed enormous growth in applications of artificial intelligence, machine learning (ML), and deep learning (hereafter, we do not distinguish these terminologies from one another and refer to them collectively as ML). We have also begun to notice a boom in theoretical and computational chemistry publications using quantum computers (QC).20−26 These newly developed methodologies are fascinating, and their impacts could be far-reaching. However, views in the theoretical and computational chemistry community about the impact of ML and QC are diverse and sometimes controversial. While optimists are constantly looking for more applications to fundamentally overhaul the field, pessimists hold negative views about their impact, if any, on theoretical and computational chemistry. The key difference is whether ML and QC merely offer better tools to enhance efficiency and accuracy in computation, or whether they provide ground-breaking opportunities to revolutionize the territory of theoretical chemistry. In this contribution, we argue that the future might prove the latter to be the case. Our basic premise is that there could be multiple approaches to numerically solving the Schrödinger equation, and that WFT, DFT, ML, and QC are four alternative yet viable examples of such approaches. Also, based on our past experience in exploiting chemical understanding with theory and computation, we discuss the possible impact on how chemical understanding can be harvested with these new tools in decades to come. Before getting started, we should make the following two points clear. First, exhaustive citation is never possible, so we apologize if we have missed any relevant publications. Second, we are aware that our vision is limited by our experience, so our projection may be overreaching and could later be proven inappropriate or completely incorrect. Nevertheless, if anything we present below provides our readers with insight from a different perspective, that precisely meets the intention of this work. The cautiously optimistic, albeit heuristic, views presented here represent our long-held belief that anything is possible in the future and that what we do today might determine what we end up with tomorrow.
In what follows, we will first present the challenge of simulations in the era of multiscale modeling. To set the stage for the ensuing discussion, we will add two axes, one for computation and the other for understanding. This ultimate challenge of in silico simulations is the foundation and starting point of the present discussion. We will then summarize the status of how the matter is tackled within the WFT and DFT frameworks. Next, brief introductions to ML and QC will follow, with emphasis on how theoretical and computational chemistry may benefit from them as alternative approaches to solving the Schrödinger equation, what their limitations are, and how they can be improved. After that, a general scheme for how chemical understanding can be harnessed from the different frameworks will be presented in an orthogonal manner. Finally, we will conclude by envisaging that hierarchical modeling, a top-down simulation approach traversing multiple scales, will emerge and thrive, becoming more attractive than, yet complementary to, multiscale modeling.
II. The Ultimate Challenge of In Silico Simulations
Chemical science today faces multiple challenges, ranging from designing advanced materials with novel functions and combating human health problems to converting solar energy and addressing sustainable growth. From the theoretical and computational chemistry viewpoint, these issues can often, if not always, be boiled down to in silico simulations and dealt with by multiscale modeling,5,6 which spans scales along both space and time axes, as shown in Figure 1a. Depending on the space-time domain, we have microscopic, mesoscopic, and macroscopic scales. Different computational methodologies are required to simulate these scales, with the microscale employing quantum mechanics, the macroscale utilizing classical mechanics, and the mesoscale using hybrid approaches. These different methods form the computation axis in Figure 1b, whose outcomes are structural, thermodynamic, electronic, spectroscopic, and other properties with varying accuracy. These properties form the property space of a given system.
Figure 1.
(a) Multiscale modeling of chemical systems and (b) the ultimate challenge of simulations.
On the other hand, there is a separate understanding axis in Figure 1b. One might wonder why we need an understanding axis perpendicular to the computation axis. This is because computation is based on the physical laws of quantum or classical mechanics, whose results are total values of physical observables such as energy, force, density, etc. In chemistry and biology, however, we are interested in molecular behavior on the potential energy surface and free energy landscape in response to changes in the number of electrons or in nuclear conformation, so changes in energy, force, density, etc., rather than their total values, are more relevant. These changes are often very small compared to the total values of the physical quantities, yet they make huge differences in understanding chemical transformations and biological processes. The patterns, effects, rules, principles, and laws governing the change of these quantities are expressed as chemical concepts, which form the foundation of conventional understanding and chemical wisdom. In most cases, these differences in physical quantities are not directly extractable from computational results because they often involve multiple systems or processes (e.g., barrier heights between reactants and products), but they may be obtained by making use of the basic variables of the different theoretical frameworks. That is why we include the additional understanding axis in Figure 1b.
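As a generic, order-of-magnitude illustration (the numbers below are typical ranges, not results for any specific system), a reaction barrier is a tiny difference between two large total electronic energies:

\Delta E^{\ddagger} = E_{\mathrm{TS}} - E_{\mathrm{reactants}}, \qquad |E_{\mathrm{TS}}|,\ |E_{\mathrm{reactants}}| \sim 10^{2}\text{--}10^{3}\ \mathrm{hartree}, \qquad \Delta E^{\ddagger} \sim 10\text{--}10^{2}\ \mathrm{kcal\ mol^{-1}} \approx 0.02\text{--}0.2\ \mathrm{hartree},

so the chemically decisive quantities live several orders of magnitude below the total values delivered along the computation axis.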
Historically, chemical science has been an experimental discipline. Chemical understanding was obtained from experimentation and expressed through chemical concepts, such as bonding, acidity, aromaticity, steric effect, electrophilicity, regioselectivity, etc. These concepts were coined by experimental chemists through abstraction and generalization to group together objects, phenomena, and processes that share common characteristics. They form the foundation of the traditional wisdom of chemical understanding and thus are the core of chemical science. These concepts cannot be directly evaluated along the computation axis of Figure 1b, yet computational results can be employed to deepen our understanding of them. For example, there is no concept of bonding in quantum mechanics, but with the orbitals introduced by WFT, covalent bonding can be appreciated through orbital overlap. In DFT, however, there is no concept of orbitals, so, as our recent studies have shown, density-based descriptors can be employed to identify different kinds of covalent bonds and various categories of noncovalent interactions.27−29 Other examples are aromaticity, steric effect, electrophilicity, regioselectivity, etc. They originated from experimental studies, but different theories provide different insights for understanding them. How to crank out numbers along the computation axis in Figure 1b is paramount, but how to turn those numbers into understanding along the understanding axis is equally important. These conjoint efforts are never easy, but they are not impossible. In our view, Figure 1b summarizes the ultimate challenge of in silico simulations in theoretical and computational chemistry.
III. WFT and DFT as Two Paradigms
The size of chemical space is enormous, and so is that of property space. The mapping between chemical space and property space can be one-to-many and many-to-one. For example, one molecule can have multiple properties (such as acidity, aromaticity, nucleophilicity, and so on), whereas several molecules might possess the same property or functionality (e.g., hydrophobicity, binding affinity, receptor inhibition, etc.). The many-to-one mapping is often called inverse molecular design. In quantum chemistry, the one-to-many mapping is dictated by the Schrödinger equation, whose solution must be approximate except for a few special cases. Two categories of approximations are available in the literature.1−4 The first is WFT (Figure 2a), including valence bond theory (VBT) and molecular orbital theory (MOT), and the other is DFT (Figure 2b). In Figure 2a, the Hamiltonian operator, Ĥ, represents a molecular species in chemical space, and its electronic energy E and total wave function Ψ can be numerically determined by employing orbitals {φj}, either molecular orbitals in MOT or bond orbitals in VBT, with which all properties, Pi, associated with the species, {Pi[φj]}, can be obtained. These orbitals also yield insightful chemical understanding of traditional chemical concepts such as bonding and reactivity. Well-known examples that improve reactivity understanding are Fukui's frontier molecular orbital (FMO) theory7,8 and the Woodward–Hoffmann rules.30−32 Using them, the chemical reactivity of numerous reactions can be qualitatively predicted.
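In LaTeX notation, the scheme in Figure 2a corresponds to the standard textbook relations (generic forms, not tied to any particular implementation):

\hat{H}\,\Psi = E\,\Psi, \qquad \Psi \approx \big|\phi_{1}\phi_{2}\cdots\phi_{N}\big|, \qquad \phi_{j} = \sum_{\mu} c_{\mu j}\,\chi_{\mu}, \qquad P_{i} = P_{i}\big[\{\phi_{j}\}\big],

where |φ1φ2⋯φN| denotes a Slater determinant (or a combination of determinants) built from the orbitals {φj} and {χμ} is an atomic basis set in which the orbitals are expanded.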
Figure 2.

One-to-many mapping from chemical space to property space with (a) wave function theory and (b) density functional theory.
DFT provides another pathway to accomplish the one-to-many mapping, as shown in Figure 2b, by avoiding solving directly for the total wave function Ψ. Instead, DFT makes use of the ground-state electron density, ρ, as the basic variable. According to the basic Hohenberg–Kohn theorems of DFT,3,33 there is a one-to-one correspondence between ρ and the external potential, υext, ρ ⇔ υext, implying that all properties associated with the system, including the total energy E, are functionals of ρ. DFT has been the most successful and widely applied approach in theoretical and computational chemistry over the last few decades for simulating the electronic structure of molecules and solids alike.3,4 Even though the Kohn–Sham scheme34 of DFT employs Kohn–Sham orbitals to overcome the difficulty of approximating the kinetic energy density functional, it is not necessary to do so in principle. The DFT method without orbitals is called orbital-free DFT (OF-DFT), which has enjoyed considerable research attention in the recent literature.35−37
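In the same spirit, the Hohenberg–Kohn statements underlying Figure 2b can be written in their standard textbook forms:

\rho(\mathbf{r}) \;\Leftrightarrow\; v_{\mathrm{ext}}(\mathbf{r}), \qquad E[\rho] = F_{\mathrm{HK}}[\rho] + \int \rho(\mathbf{r})\, v_{\mathrm{ext}}(\mathbf{r})\, \mathrm{d}\mathbf{r}, \qquad E_{0} = \min_{\rho \to N} E[\rho],

where F_HK[ρ] is the universal functional collecting the kinetic and electron–electron interaction energies and the minimization runs over all N-electron densities.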
Insightful understanding of traditional chemical concepts can also be obtained in DFT without resorting to orbitals. Conceptual DFT (CDFT)3,9−12 was the first DFT framework developed to appreciate reactivity-related matters, in which electronegativity, hardness, the Fukui function,38,39 electrophilicity,40 the dual descriptor,41 etc. were formulated. CDFT has also been applied to evaluate molecular acidity42 and metal-binding specificity43 and to predict proton-coupled electron transfer (PCET) mechanisms.44,45 In addition, using density-associated quantities such as the density gradient and Laplacian, we recently proposed several density-based descriptors to identify covalent bonds and noncovalent interactions,27−29 quantify the steric effect,46 electrophilicity, and nucleophilicity,47 and determine regioselectivity and stereoselectivity.48,49 Recent mini-reviews of these studies are available.50−52 A book highlighting the recent progress on these topics in DFT, as well as in VBT and MOT, is being published.53
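For reference, the working definitions of several of these CDFT descriptors, as commonly given in the CDFT literature,3,9−12 are

\mu = \left(\frac{\partial E}{\partial N}\right)_{v}, \quad \chi = -\mu, \quad \eta = \left(\frac{\partial^{2} E}{\partial N^{2}}\right)_{v}, \quad \omega = \frac{\mu^{2}}{2\eta}, \quad f(\mathbf{r}) = \left(\frac{\partial \rho(\mathbf{r})}{\partial N}\right)_{v}, \quad \Delta f(\mathbf{r}) = f^{+}(\mathbf{r}) - f^{-}(\mathbf{r}),

where μ is the electronic chemical potential, χ the electronegativity, η the hardness, ω the electrophilicity index, f(r) the Fukui function, Δf(r) the dual descriptor, and the subscript v indicates differentiation at fixed external potential.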
IV. Machine Learning as a New Paradigm
ML develops algorithms and statistical models that empower computers to perform simulations without being explicitly programmed. It does so through supervised, unsupervised, or reinforcement learning algorithms that learn from the features of training data sets. To build ML models, three components are mandatory: data sets, features, and algorithms. ML features54 refer to the attributes of data sets that can be employed to train ML algorithms. ML algorithms learn patterns and establish relationships between the features and target variables to make predictions for new data sets. Even though ML does not require the physics to be explicitly programmed in the way WFT and DFT methods do, ML algorithms must still be programmed, and the training data that ML models are trained on have to come from somewhere, usually the solutions produced by programmed WFT and DFT software.
We have observed a skyrocketing increase in ML applications in theoretical and computational chemistry in the past decade,13−19 involving all of the space-time domains in Figure 1a. To most people, applying ML to theoretical and computational chemistry is merely taking advantage of a new tool to expedite simulations and improve their accuracy. This is certainly true. However, to us, it means more than just that. In our opinion, in quantum chemistry, ML represents a paradigm shift away from WFT (Figure 2a) and DFT (Figure 2b). It provides a completely new way to solve the Schrödinger equation: to solve the equation without solving it! Our argument is based on the following two observations. First, ample evidence from the recent literature indicates that ML can accurately reproduce, and even predict, the total energy E,55,56 total wave function Ψ,57 and all kinds of properties {Pi} of molecular systems,58−60 suggesting that the solutions of their Schrödinger equations can be accurately obtained and thus that the equation can be implicitly solved by ML. Second, it is well recognized that the hardware development of digital computers has tremendously boosted the implementation of the two schemes in Figure 2 to perform the one-to-many mapping for molecules and condensed matter. With the collective development of both hardware and software in recent decades, computer hardware has become fast enough, and computer software smart enough, that it is now feasible for computers to solve the Schrödinger equation without us explicitly programming it.
Figure 3 shows the mapping from chemical space to property space using ML. The key for this mapping to take place and work well is the choice of the feature set, {ajk}, which is learned from the training set and applied to make predictions for the test set. This feature set should (i) be size-extensive, (ii) be self-adaptive, (iii) be physically explainable, and (iv) be able to reproduce the electron density. Size-extensiveness enables the trained models to be generalizable to larger systems, and self-adaptiveness accounts for changes in the local environment of atoms in molecules or solids. The feature set will also be employed for the purpose of improving chemical understanding, so it must be physically explainable. The last requirement is the criterion based on DFT: if the ground-state electron density is known, then, according to the basic theorems of DFT,3,33 everything else about the system can also be rigorously determined. Examples of descriptors satisfying this last criterion include atom-condensed shape functions,61 moments,62 and information entropy.63 This last requirement guarantees that the mapping in Figure 3 is well established, and the feature set thereby also plays the role of quality control. Failure to meet all of these requirements together will impede the transferability, universality, and interpretability of ML models. Even though there are many kinds of widely used feature sets in the present literature,54 none has yet been found to satisfy all four requirements.
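As a minimal sketch of the feature-to-property workflow behind Figure 3 (not the author's method; the features and target below are random placeholders, and kernel ridge regression is just one of many possible algorithms), the mapping can be illustrated in a few lines of Python:

# Minimal sketch: learn a map from a hypothetical feature set {a_jk} to a property P_i.
# Features and targets are synthetic placeholders for illustration only.
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))                              # 200 "molecules", 16 features a_jk each
y = X @ rng.normal(size=16) + 0.1 * rng.normal(size=200)    # synthetic target property P_i

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=0.1)    # Gaussian-kernel regressor
model.fit(X_train, y_train)                                 # train: features -> property
mae = np.mean(np.abs(model.predict(X_test) - y_test))       # predict on unseen data
print(f"test MAE: {mae:.3f}")

Whether such a model transfers to larger systems and remains interpretable depends precisely on the four feature criteria listed above.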
Figure 3.

Many-to-many mapping from chemical space to property space through the feature set {ajk} and deep neural network in machine learning.
The reason why the ML-based mapping in Figure 3 is many-to-many is that one starts with a training set of many inputs and ends up with many predictions as outputs. This many-to-many mapping not only provides a new pathway to accomplish the one-to-many mapping shown in Figure 2 for WFT and DFT, but also offers desirable opportunities to exploit the many-to-one mapping required by inverse molecular design, which finds crucial applications in drug discovery and catalyst design.64,65
V. The Coming Era of Quantum Computers
A quantum computer is a computing device that operates on the principles of quantum mechanics. Its origin can be attributed to Feynman,66 Manin,67 and Benioff,68 who independently proposed the idea of using quantum mechanics to perform computations. Unlike classical computers, which store information in bits whose value is either 0 or 1, the basic information unit of QC, the quantum bit or qubit, can simultaneously exist in a superposition of both 0 and 1. In addition, QC makes use of the quantum mechanical properties of coherence and entanglement among multiple qubits, allowing it to explore qubit space simultaneously and thus achieve exponential speedups for a variety of computations. Even though quantum supremacy has already been demonstrated in the literature69 and we are certain that QC has the potential to revolutionize many fields, including theoretical and computational chemistry, QC is still at a very early stage of development, the so-called noisy intermediate-scale quantum (NISQ) era.70 New quantum algorithms and applications in quantum simulation are to be unveiled as QC devices with larger qubit numbers and longer coherence times are developed.
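In standard notation, a single qubit and an n-qubit register are described by the textbook forms

|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \quad |\alpha|^{2} + |\beta|^{2} = 1, \qquad |\Psi\rangle = \sum_{x \in \{0,1\}^{n}} c_{x}\,|x\rangle,

so that n qubits span a 2^n-dimensional Hilbert space, which is the origin of the potential exponential advantage mentioned above.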
Applications of QC to solve the Schrödinger equation for molecules have employed the variational quantum eigensolver (VQE) algorithm.71 It variationally minimizes the expectation value of the molecular Hamiltonian over an ansatz (trial wave function). It does so in a hybrid manner: VQE couples a classical optimization loop with a subroutine that computes the expectation value on a QC apparatus. As of now, VQE has been successfully implemented for several small molecules such as H2, LiH, H12, etc.72,73
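To make the hybrid quantum-classical structure of VQE concrete, the following is a deliberately minimal toy sketch in Python: the "quantum" expectation value is simulated classically with NumPy, the one-parameter ansatz and the two-term Hamiltonian are illustrative assumptions (not any real molecular Hamiltonian), and SciPy supplies the classical optimization loop.

# Toy VQE sketch (classical simulation only; illustrative Hamiltonian and ansatz).
import numpy as np
from scipy.optimize import minimize

X = np.array([[0.0, 1.0], [1.0, 0.0]])     # Pauli X
Z = np.array([[1.0, 0.0], [0.0, -1.0]])    # Pauli Z
H = 0.5 * Z + 0.3 * X                      # assumed two-term, one-qubit Hamiltonian

def ansatz(theta):
    # Trial state |psi(theta)> = Ry(theta)|0> = (cos(theta/2), sin(theta/2))
    return np.array([np.cos(theta / 2.0), np.sin(theta / 2.0)])

def energy(params):
    # Subroutine that a real VQE would evaluate on quantum hardware:
    # the expectation value <psi(theta)|H|psi(theta)>.
    psi = ansatz(params[0])
    return float(psi @ H @ psi)

result = minimize(energy, x0=[0.1], method="COBYLA")   # classical optimization loop
print("VQE-style minimum :", result.fun)
print("exact ground state:", np.linalg.eigvalsh(H)[0])

For this toy Hamiltonian the single-parameter ansatz can reach the exact ground state; for real molecules the ansatz, the qubit encoding of the Hamiltonian, and the measurement of the expectation value are, of course, far more involved.71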
Even though routine quantum simulations on QC are probably still decades away, this new technology presents a potential paradigm shift that will fundamentally change how the Schrödinger equation is solved. QC devices beyond the NISQ era will have millions of qubits, much longer coherence times, and much better error correction, gate fidelity, and fault tolerance. Also, as QC hardware advances, new and powerful quantum algorithms will emerge to take full advantage of the unique properties of QC devices. A significantly improved VQE algorithm is expected; even a complete replacement of this algorithm is not impossible.
Besides VQE, one area of QC development in the next few decades should be closely watched: quantum machine learning (QML).74,75 QML harnesses the unique capabilities of QC to enhance the performance and capabilities of ML algorithms. One plus one is surely greater than two. QML holds immense potential for quantum simulations in drug discovery, catalyst design, materials science and engineering, and many other areas.
VI. How to Harvest Chemical Understanding
We need to compute, for sure, but we should also understand. That was the point we made with Figure 1b as the ultimate challenge of in silico simulations. Significant progress has been accomplished in the past decades along the computation axis using multiscale modeling techniques. Nevertheless, how to harvest chemical understanding from computation has never been adequately addressed or appropriately emphasized. In Figure 4, we present a systematic scheme describing how chemical understanding can be harnessed from computations in different frameworks. Each square in the figure represents a projection of the entire chemical space onto a particular framework characterized by the basic variable of that theory. For example, in WFT, as shown in Figure 2a, we employ its basic variable, the molecular or bond orbitals {φi}, to appreciate chemical understanding, so the corresponding square in Figure 4 is labeled by the orbitals. Using the orbitals, we can obtain a better understanding of covalent bonding and chemical reactivity in terms of, for instance, FMO theory and the Woodward–Hoffmann rules. In DFT, the basic variable is the electron density ρ, so its square in Figure 4 is symbolized by the density ρ. As shown in Figure 2b, we can employ density-related quantities to identify, determine, and even quantify bonding, stability, reactivity, and other chemical concepts. These include strong covalent bonds, weak interactions, acidity, aromaticity, steric effect, electrophilicity, nucleophilicity, regioselectivity, stereoselectivity, etc. The two frameworks of chemical understanding shown as the squares with green and purple sides in Figure 4 result from projecting chemical space onto the WFT and DFT frameworks. These two squares represent two different ways of understanding chemical concepts from conventional wisdom. These understandings are not mutually exclusive; rather, they are orthogonal and complementary to each other, representing different views of the same species in chemical space.
Figure 4.

Schematic representation of how chemical understanding can be harnessed from wave function theory, density functional theory, machine learning, and quantum computer using orbitals {φi}, electron density ρ, features {ajk}, and qubits {qi}, respectively.
For ML, the basic variable is the feature set, {ajk}. This set of features comprises the quantities from which future chemical understanding will be exploited. For example, following FMO theory in WFT, we may look for the single feature, or the few features, that play the most important role. Alternatively, following DFT, we may borrow Shannon entropy, Fisher information, or other information-theoretic quantities50−52 for this purpose. Since the feature sets currently available in the literature do not meet all four criteria specified above, we do not yet know exactly what novel chemical understanding can be obtained from ML. However, we do know what to expect from ML, and how, once all the feature criteria are met and the paradigm shift is accomplished.
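For instance, the Shannon entropy and Fisher information of the electron density, as used in the information-theoretic approach cited above,50−52 take the standard forms

S_{S} = -\int \rho(\mathbf{r}) \ln \rho(\mathbf{r})\, \mathrm{d}\mathbf{r}, \qquad I_{F} = \int \frac{|\nabla \rho(\mathbf{r})|^{2}}{\rho(\mathbf{r})}\, \mathrm{d}\mathbf{r},

and, speculatively, analogous quantities might be constructed from the distributions of ML features once a feature set meeting the four criteria above is in hand.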
The situation is the same for QC, whose basic modeling variable is the set of qubits, {qi}. Qubits will be the quantities exploited to obtain new chemical understanding from QC. We may employ the same strategy as in ML to search for new understanding. Figure 4 also shows the two additional squares with red and blue sides, representing the complementary and orthogonal roles of the feature sets {ajk} in ML and the qubits {qi} in QC in harvesting chemical understanding. Again, these understanding planes are not mutually exclusive. They provide new insights not accessible from the WFT and DFT frameworks.
VII. Outlook: Hierarchical Modeling
Looking ahead, we envision that ML and QC will make it possible to perform hierarchical modeling across multiple scales in theoretical and computational chemistry, as shown in Figure 5. Although not new in other disciplines such as computer science and statistics, hierarchical modeling has yet to be formally introduced and thoroughly explored in theoretical and computational chemistry, and it stands in stark contrast to multiscale modeling. Multiscale modeling is a bottom-up approach that starts with fine-grained models at the lower scales and then gradually aggregates them into coarse-grained models at the upper scales. In contrast, hierarchical modeling is a top-down approach whose components across different hierarchical levels are associated with one another in a nested or disjoint manner. In hierarchical modeling, more attention is paid to the relationships among components at a given hierarchical level or across different hierarchical levels.
Figure 5.

Impact of machine learning and quantum computer on hierarchical modeling.
Historically, hierarchy has been associated with reductionism. However, hierarchical modeling can be a combination of reductionism and holism. It offers a flexible framework for representing complex systems and allows for both decomposition and integration at different hierarchical levels. Hierarchical modeling is particularly suited to capturing and studying emergent properties at higher hierarchical levels that arise from interactions among components at lower levels. These emergent properties are not directly predictable from the properties of the individual components at lower hierarchical levels, so they go beyond reductionism and align with holism.
The reason why hierarchical modeling will emerge and thrive in the ML and QC era is that ML and QC can accomplish the many-to-many mapping (Figure 3) for a given hierarchical level. With this done, more attention can be shifted to, and then focused on, the relationships among different components or levels of hierarchical structures. There are many kinds of hierarchy in nature, such as structural hierarchy, data hierarchy, chirality hierarchy,76,77 taxonomic hierarchy, organizational hierarchy, etc. The new modeling approach is aimed at dealing with hierarchical structures, which are prevalent in nature, from atoms to molecules to cells to tissues to organs to humans to societies to ecosystems to the solar system to the Milky Way. Hierarchical modeling captures how one hierarchical level is influenced by others, so this approach is particularly insightful and productive when dealing with hierarchical structures that exhibit patterns and principles across multiple hierarchical levels. These hierarchical structures are often bound together through weak interactions, where the effects of cooperation and frustration are ubiquitous,78−80 and examination and understanding of the concepts of synergy, cybernetics, self-organization, emergence, complexity, and evolution from both reductionistic and holistic perspectives will become inevitable.81−84
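As a purely conceptual sketch (every name, level, and aggregation rule below is an illustrative assumption, not an established workflow), a nested hierarchy in which each level owns a surrogate mapping, so that attention shifts from within-level computation to the relations between levels, might be organized as follows:

# Conceptual sketch only: nested hierarchical levels, each with a placeholder
# aggregation rule standing in for an ML/QC surrogate trained at that level.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Level:
    name: str
    components: List["Level"] = field(default_factory=list)
    aggregate: Callable[[List[float]], float] = sum   # placeholder surrogate rule

    def evaluate(self) -> float:
        if not self.components:                        # leaf level (e.g., atoms)
            return 1.0                                 # placeholder leaf property
        return self.aggregate([c.evaluate() for c in self.components])

# Illustrative hierarchy: atoms -> molecule -> assembly (names are hypothetical).
atoms = [Level(f"atom_{i}") for i in range(3)]
molecule = Level("molecule", components=atoms)
assembly = Level("assembly", components=[molecule, Level("solvent_shell")])
print(assembly.evaluate())   # a top-down query traverses the nested levels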
Moreover, with the general scheme in Figure 4 for how chemical understanding can be harnessed from different frameworks, novel insights into chemical and biological processes can be harvested through fundamental descriptors that span the different hierarchical levels of complicated phenomena. These phenomena could include, but are not limited to, macromolecular self-assembly, asymmetric synthesis, enzymatic catalysis, and many more. This is done through the feature sets in ML, QC, or QML at different levels of hierarchical structures. If the same feature set can be applied to describe different levels of a hierarchical structure, the structure exhibits, in a holographic manner, the key characteristics of a scale-free network,85 which has profound implications in nature, as exemplified by protein–protein interaction networks, gene regulatory networks, and the World Wide Web.
To wrap up, we recall that, in 1929, the late U.K. theoretical physicist and Nobel Laureate Paul A. M. Dirac claimed that "the underlying physical laws necessary for the mathematical theory of...the whole of chemistry are thus completely known, and the difficulty is only that the exact application of these laws leads to equations much too complicated to be soluble."86 Based on what we have presented in the previous sections, after about one century, we finally foresee plausible pathways to tackle this problem. ML and QC will assist us in overcoming Dirac's pessimistic view and provide viable options for making those "much too complicated" equations soluble. We will not solve them analytically, though; we will make use of artificial intelligence for this purpose. Moreover, this may not happen in the next few years because there are still obstacles to conquer, but we are cautiously optimistic that it will become possible in the next few decades.
Acknowledgments
The author is grateful to the Editor-in-Chief of this Journal for the kind invitation. Helpful discussion with Paul W. Ayers of McMaster University, Canada, Wenjian Liu of Shandong University, China, and Thijs Stuyver of PSL University in Paris, France, is acknowledged.
The author declares no competing financial interest.
Special Issue
Published as part of the ACS Physical Chemistry Au virtual special issue "Visions for the Future of Physical Chemistry in 2050".
References
- Szabo A.; Ostlund N. S.. Modern Quantum Chemistry: Introduction to Advanced Electronic Structure Theory; Dover Books on Chemistry, 9780486691862; Dover Publications, 1996. [Google Scholar]
- Helgaker T.; Coriani S.; Jørgensen P.; Kristensen K.; Olsen J.; Ruud K. Recent Advances in Wave Function-Based Methods of Molecular-Property Calculations. Chem. Rev. 2012, 112, 543–631. 10.1021/cr2002239. [DOI] [PubMed] [Google Scholar]
- Parr R.; Yang W.. Density Functional Theory of Atoms and Molecules; Oxford University: New York, 1989. [Google Scholar]
- Teale A. M.; Helgaker T.; Savin A.; Adamo C.; Aradi B.; Arbuznikov A. V.; Ayers P. W.; Baerends E. J.; Barone V.; Calaminici P.; et al. DFT Exchange: Sharing Perspectives on the Workhorse of Quantum Chemistry and Materials Science. Phys. Chem. Chem. Phys. 2022, 24, 28700–28781. 10.1039/D2CP02827A. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Horstemeyer M. F.Multiscale Modeling: A Review. In Practical Aspects of Computational Chemistry; Springer: Berlin, 2009; pp 87–135. [Google Scholar]
- Bulo R. E.; Michel C.; Fleurat-Lessard P.; Sautet P. Multiscale Modeling of Chemistry in Water: Are We There Yet?. J. Chem. Theory Comput. 2013, 9, 5567–5577. 10.1021/ct4005596. [DOI] [PubMed] [Google Scholar]
- Fukui K.; Yonezawa T.; Shingu H. A Molecular Orbital Theory of Reactivity in Aromatic Hydrocarbons. J. Chem. Phys. 1952, 20, 722–725. 10.1063/1.1700523. [DOI] [Google Scholar]
- Fukui K. Role of Frontier Orbitals in Chemical Reactions. Science 1982, 218, 747–754. 10.1126/science.218.4574.747. [DOI] [PubMed] [Google Scholar]
- Geerlings P.; De Proft F.; Langenaeker W. Conceptual Density Functional Theory. Chem. Rev. 2003, 103, 1793–1873. 10.1021/cr990029p. [DOI] [PubMed] [Google Scholar]
- Liu S. B. Conceptual Density Functional Theory and Some Recent Developments. Acta Phys.-Chim. Sin. 2009, 25, 590–600. 10.3866/PKU.WHXB20090332. [DOI] [Google Scholar]
- Geerlings P.; Chamorro E.; Chattaraj P. K.; De Proft F.; Gázquez J. L.; Liu S.; Morell C.; Toro-Labbé A.; Vela A.; Ayers P. Conceptual Density Functional Theory: Status, Prospects, Issues. Theor. Chem. Acc. 2020, 139, 36. 10.1007/s00214-020-2546-7. [DOI] [Google Scholar]
- Liu S. B., Ed. Conceptual Density Functional Theory: Towards a New Chemical Reactivity Theory; Wiley-VCH GmbH: Germany, Apr 2022. [Google Scholar]
- Goh G. B.; Hodas N. O.; Vishnu A. Deep Learning for Computational Chemistry. J. Comput. Chem. 2017, 38, 1291–1307. 10.1002/jcc.24764. [DOI] [PubMed] [Google Scholar]
- Butler K. T.; Davies D. W.; Cartwright H.; Isayev O.; Walsh A. Machine Learning for Molecular and Materials Science. Nature 2018, 559, 547–555. 10.1038/s41586-018-0337-2. [DOI] [PubMed] [Google Scholar]
- Mater A. C.; Coote M. L. Deep Learning in Chemistry. J. Chem. Inf. Model. 2019, 59, 2545–2559. 10.1021/acs.jcim.9b00266. [DOI] [PubMed] [Google Scholar]
- Meuwly M. Machine Learning for Chemical Reactions. Chem. Rev. 2021, 121, 10218–10239. 10.1021/acs.chemrev.1c00033. [DOI] [PubMed] [Google Scholar]
- Baum Z. J.; Yu X.; Ayala P. Y.; Zhao Y.; Watkins S. P.; Zhou Q. Artificial Intelligence in Chemistry: Current Trends and Future Directions. J. Chem. Inf. Model. 2021, 61, 3197–3212. 10.1021/acs.jcim.1c00619. [DOI] [PubMed] [Google Scholar]
- Keith J. A.; Vassilev-Galindo V.; Cheng B.; Chmiela S.; Gastegger M.; Müller K.-R.; Tkatchenko A. Combining Machine Learning and Computational Chemistry for Predictive Insights into Chemical Systems. Chem. Rev. 2021, 121, 9816–9872. 10.1021/acs.chemrev.1c00107. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Xia S.; Chen E.; Zhang Y. Integrated Molecular Modeling and Machine Learning for Drug Design. J. Chem. Theory Comput. 2023, 19, 7478–7495. 10.1021/acs.jctc.3c00814. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Aspuru-Guzik A.; Dutoi A. D.; Love P. J.; Head-Gordon M. Simulated Quantum Computation of Molecular Energies. Science 2005, 309, 1704–1707. 10.1126/science.1113479. [DOI] [PubMed] [Google Scholar]
- Kassal I.; Whitfield J. D.; Perdomo-Ortiz A.; Yung M.-H.; Aspuru-Guzik A. Simulating Chemistry Using Quantum Computers. Annu. Rev. Phys. Chem. 2011, 62, 185. 10.1146/annurev-physchem-032210-103512. [DOI] [PubMed] [Google Scholar]
- Cao Y.; Romero J.; Olson J. P.; Degroote M.; Johnson P. D.; Kieferová M.; Kivlichan I. D.; Menke T.; Peropadre B.; Sawaya N. P. D.; Sim S.; Veis L.; Aspuru-Guzik A. Quantum Chemistry in the Age of Quantum Computing. Chem. Rev. 2019, 119, 10856–10915. 10.1021/acs.chemrev.8b00803. [DOI] [PubMed] [Google Scholar]
- McArdle S.; Endo S.; Aspuru-Guzik A.; Benjamin S. C.; Yuan X. Quantum Computational Chemistry. Rev. Mod. Phys. 2020, 92, 015003. 10.1103/RevModPhys.92.015003. [DOI] [Google Scholar]
- Bauer B.; Bravyi S.; Motta M.; Chan G. K.-L. Quantum Algorithms for Quantum Chemistry and Quantum Materials Science. Chem. Rev. 2020, 120, 12685–12717. 10.1021/acs.chemrev.9b00829. [DOI] [PubMed] [Google Scholar]
- Ollitrault P. J.; Miessen A.; Tavernelli I. Molecular Quantum Dynamics: A Quantum Computing Perspective. Acc. Chem. Res. 2021, 54, 4229–4238. 10.1021/acs.accounts.1c00514. [DOI] [PubMed] [Google Scholar]
- Motta M.; Rice J. E. Emerging Quantum Computing Algorithms for Quantum Chemistry. Wiley Interdiscip. Rev.: Comput. Mol. Sci. 2022, 12, e1580 10.1002/wcms.1580. [DOI] [Google Scholar]
- Liu S. B.; Rong C.; Lu T.; Hu H. Identifying Strong Covalent Interactions with Pauli Energy. J. Phys. Chem. A 2018, 122, 3087–3095. 10.1021/acs.jpca.8b00521. [DOI] [PubMed] [Google Scholar]
- Zhong S.; He X.; Liu S.; Wang B.; Lu T.; Rong C.; Liu S. B. Toward Density-Based and Simultaneous Description of Chemical Bonding and Noncovalent Interactions with Pauli Energy. J. Phys. Chem. A 2022, 126, 2437–2444. 10.1021/acs.jpca.2c00224. [DOI] [PubMed] [Google Scholar]
- Zhang W.; He X.; Li M.; Zhang J.; Zhao D.; Liu S. B.; Rong C. Simultaneous Identification of Strong and Weak Interactions with Pauli Energy, Pauli Potential, Pauli Force, and Pauli Charge. J. Chem. Phys. 2023, 159, 184104. 10.1063/5.0173666. [DOI] [PubMed] [Google Scholar]
- Woodward R. B.; Hoffmann R. Stereochemistry of Electrocyclic Reactions. J. Am. Chem. Soc. 1965, 87, 395–397. 10.1021/ja01080a054. [DOI] [Google Scholar]
- Hoffmann R.; Woodward R. B. Selection Rules for Concerted Cycloaddition Reactions. J. Am. Chem. Soc. 1965, 87, 2046–2048. 10.1021/ja01087a034. [DOI] [Google Scholar]
- Geerlings P.; Ayers P. W.; Toro-Labbé A.; Chattaraj P. K.; De Proft F. The Woodward-Hoffmann Rules Reinterpreted by Conceptual Density Functional Theory. Acc. Chem. Res. 2012, 45, 683–695. 10.1021/ar200192t. [DOI] [PubMed] [Google Scholar]
- Hohenberg P.; Kohn W. Inhomogeneous Electron Gas. Phys. Rev. 1964, 136, B864–B871. 10.1103/PhysRev.136.B864. [DOI] [Google Scholar]
- Kohn W.; Sham L. J. Self-Consistent Equations Including Exchange and Correlation Effects. Phys. Rev. 1965, 140, A1133–A1138. 10.1103/PhysRev.140.A1133. [DOI] [Google Scholar]
- Mi W.; Luo K.; Trickey S. B.; Pavanello M. Orbital-Free Density Functional Theory: An Attractive Electronic Structure Method for Large-Scale First-Principle Simulations. Chem. Rev. 2023, 123, 12039–12104. 10.1021/acs.chemrev.2c00758. [DOI] [PubMed] [Google Scholar]
- Horowitz C. M.; Proetto C. R.; Pitarke J. M. Orbital-Free Density Functional Theory for Metal Slabs. J. Chem. Phys. 2023, 159, 164112. 10.1063/5.0169977. [DOI] [PubMed] [Google Scholar]
- Gangwar A.; Bulusu S. S.; Banerjee A. Neural Network Learned Pauli Potential for the Advancement of Orbital-Free Density Functional Theory. J. Chem. Phys. 2023, 159, 124114. 10.1063/5.0165524. [DOI] [PubMed] [Google Scholar]
- Yang W.; Parr R. G. Hardness, Softness, and the Fukui Function in the Electronic Theory of Metals and Catalysis. Proc. Natl. Acad. Sci. U.S.A. 1985, 82, 6723–6726. 10.1073/pnas.82.20.6723. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Parr R. G.; Yang W. Density Functional Approach to the Frontier-Electron Theory of Chemical Reactivity. J. Am. Chem. Soc. 1984, 106, 4049–4050. 10.1021/ja00326a036. [DOI] [Google Scholar]
- Parr R. G.; Szentpály L. v.; Liu S. Electrophilicity Index. J. Am. Chem. Soc. 1999, 121, 1922–1924. 10.1021/ja983494x. [DOI] [Google Scholar]
- Morell C.; Grand A.; Toro-Labbé A. New Dual Descriptor for Chemical Reactivity. J. Phys. Chem. A 2005, 109, 205–212. 10.1021/jp046577a. [DOI] [PubMed] [Google Scholar]
- Liu S. B.; Schauer C. K.; Pedersen L. G. Molecular Acidity: A Quantitative Conceptual Density Functional Theory Description. J. Chem. Phys. 2009, 131, 164107. 10.1063/1.3251124. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Feng X. T.; Yu J. G.; Lei M.; Fang W. H.; Liu S. B. Toward Understanding Metal-Binding Specificity of Porphyrin: A Conceptual Density Functional Theory Study. J. Phys. Chem. B 2009, 113, 13381–13389. 10.1021/jp905885y. [DOI] [PubMed] [Google Scholar]
- Liu S. B.; Ess D. H.; Schauer C. K. Density Functional Reactivity Theory Characterizes Charge Separation Propensity in Proton-Coupled Electron Transfer Reactions. J. Phys. Chem. A 2011, 115, 4738–4742. 10.1021/jp112319d. [DOI] [PubMed] [Google Scholar]
- Kumar N.; Liu S. B.; Kozlowski P. M. Charge Separation Propensity of the Coenzyme B12-Tyrosine Complex in Adenosylcobalamin-Dependent Ethylmalonyl-CoA Mutase Enzyme. J. Phys. Chem. Lett. 2012, 3, 1035–1038. 10.1021/jz300102s. [DOI] [PubMed] [Google Scholar]
- Liu S. Steric effect: A quantitative description from density functional theory. J. Chem. Phys. 2007, 126, 244103. 10.1063/1.2747247. [DOI] [PubMed] [Google Scholar]
- Liu S.; Rong C.; Lu T. Information Conservation Principle Determines Electrophilicity, Nucleophilicity, and Regioselectivity. J. Phys. Chem. A 2014, 118, 3698–3704. 10.1021/jp5032702. [DOI] [PubMed] [Google Scholar]
- Liu S. B.; Rong C. Y.; Lu T. Electronic forces as descriptors of nucleophilic and electrophilic regioselectivity and stereoselectivity. Phys. Chem. Chem. Phys. 2017, 19, 1496–1503. 10.1039/C6CP06376D. [DOI] [PubMed] [Google Scholar]
- Liu S. B.; Liu L.; Yu D. H.; Rong C. Y.; Lu T. Steric charge. Phys. Chem. Chem. Phys. 2018, 20, 1408–1420. 10.1039/C7CP07678A. [DOI] [PubMed] [Google Scholar]
- Liu S. B. Information-theoretic approach in density functional reactivity theory. Acta Phys.-Chim. Sin. 2016, 32, 98–118. 10.3866/PKU.WHXB201510302. [DOI] [Google Scholar]
- Rong C. Y.; Wang B.; Zhao D. B.; Liu S. B. Information-theoretic approach in density functional theory and its recent applications to chemical problems. WIREs Comp. Mol. Sci. 2020, 10, e1461 10.1002/wcms.1461. [DOI] [Google Scholar]
- Rong C.; Zhao D.; He X.; Liu S. B. Development and Applications of the Density-Based Theory of Chemical Reactivity. J. Phys. Chem. Lett. 2022, 13, 11191–11200. 10.1021/acs.jpclett.2c03165. [DOI] [PubMed] [Google Scholar]
- Liu S. B., Ed. Exploiting Chemical Concepts through Theory and Computation; Wiley-VCH GmbH: Germany, Feb 2024. [Google Scholar]
- Musil F.; Grisafi A.; Bartok A. P.; Ortner C.; Csanyi G.; Ceriotti M. Physics-Inspired Structural Representations for Molecules and Materials. Chem. Rev. 2021, 121, 9759–9815. 10.1021/acs.chemrev.1c00021. [DOI] [PubMed] [Google Scholar]
- Smith J. S.; Isayev O.; Roitberg A. E. ANI-1: An Extensible Neural Network Potential with DFT Accuracy at Force Field Computational Cost. Chem. Sci. 2017, 8, 3192. 10.1039/C6SC05720A. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Behler J. Four Generations of High-Dimensional Neural Network Potentials. Chem. Rev. 2021, 121, 10037–10072. 10.1021/acs.chemrev.0c00868. [DOI] [PubMed] [Google Scholar]
- Hermann J.; Schätzle Z.; Noé F. Deep-Neural-Network Solution of the Electronic Schrödinger Equation. Nat. Chem. 2020, 12, 891–897. 10.1038/s41557-020-0544-y. [DOI] [PubMed] [Google Scholar]
- Chen X.; Li P.; Hruska E.; Liu F. Δ-Machine Learning for Quantum Chemistry Prediction of Solution-Phase Molecular Properties at the Ground and Excited States. Phys. Chem. Chem. Phys. 2023, 25, 13417–13428. 10.1039/D3CP00506B. [DOI] [PubMed] [Google Scholar]
- Shilpa S.; Kashyap G.; Sunoj R. B. Recent Applications of Machine Learning in Molecular Property and Chemical Reaction Outcome Predictions. J. Phys. Chem. A 2023, 127, 8253–8271. 10.1021/acs.jpca.3c04779. [DOI] [PubMed] [Google Scholar]
- Ye S.; Zhong K.; Zhang J.; Hu W.; Hirst J. D.; Zhang G.; Mukamel S.; Jiang J. A Machine Learning Protocol for Predicting Protein Infrared Spectra. J. Am. Chem. Soc. 2020, 142, 19071–19077. 10.1021/jacs.0c06530. [DOI] [PubMed] [Google Scholar]
- Ayers P. W. Density per Particle as a Descriptor of Coulombic Systems. Proc. Natl. Acad. Sci. U.S.A. 2000, 97, 1959–1964. 10.1073/pnas.040539297. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Liu S. B.; Nagy A.; Parr R. G. Expansion of the Density-Functional Energy Components Ec and Tc in terms of Moments of the Electron Density. Phys. Rev. A 1999, 59, 1131. 10.1103/PhysRevA.59.1131. [DOI] [Google Scholar]
- Zhou X.-Y.; Rong C.; Lu T.; Zhou P.; Liu S. Information Functional Theory: Electronic Properties as Functionals of Information for Atoms and Molecules. J. Phys. Chem. A 2016, 120, 3634–3642. 10.1021/acs.jpca.6b01197. [DOI] [PubMed] [Google Scholar]
- Sanchez-Lengeling B.; Aspuru-Guzik A. Inverse Molecular Design Using Machine Learning: Generative Models for Matter Engineering. Science 2018, 361, 360–365. 10.1126/science.aat2663. [DOI] [PubMed] [Google Scholar]
- Anstine D. M.; Isayev O. Generative Models as an Emerging Paradigm in the Chemical Sciences. J. Am. Chem. Soc. 2023, 145, 8736–8750. 10.1021/jacs.2c13467. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Feynman R. Simulating Physics with Computers. Int. J. Theor. Phys. 1982, 21, 467–488. 10.1007/BF02650179. [DOI] [Google Scholar]
- Manin Y. I.Vychislimoe i nevychislimoe [Computable and Noncomputable] (in Russian). Soviet Radio. pp. 13–15. 1980. [Google Scholar]
- Benioff P. The computer as a physical system: A microscopic quantum mechanical Hamiltonian model of computers as represented by Turing machines. J. Stat. Phys. 1980, 22, 563–591. 10.1007/BF01011339. [DOI] [Google Scholar]
- Arute F.; Arya K.; Babbush R.; Bacon D.; Bardin J. C.; Barends R.; Biswas R.; Boixo S.; Brandao F. G. S. L.; Buell D. A.; Burkett B.; Chen Y.; Chen Z.; Chiaro B.; Collins R.; Courtney W.; Dunsworth A.; Farhi E.; Foxen B.; Fowler A.; Gidney C.; Giustina M.; Graff R.; Guerin K.; Habegger S.; Harrigan M. P.; Hartmann M. J.; Ho A.; Hoffmann M.; Huang T.; Humble T. S.; Isakov S. V.; Jeffrey E.; Jiang Z.; Kafri D.; Kechedzhi K.; Kelly J.; Klimov P. V.; Knysh S.; Korotkov A.; Kostritsa F.; Landhuis D.; Lindmark M.; Lucero E.; Lyakh D.; Mandra S.; McClean J. R.; McEwen M.; Megrant A.; Mi X.; Michielsen K.; Mohseni M.; Mutus J.; Naaman O.; Neeley M.; Neill C.; Niu M. Y.; Ostby E.; Petukhov A.; Platt J. C.; Quintana C.; Rieffel E. G.; Roushan P.; Rubin N. C.; Sank D.; Satzinger K. J.; Smelyanskiy V.; Sung K. J.; Trevithick M. D.; Vainsencher A.; Villalonga B.; White T.; Yao Z. J.; Yeh P.; Zalcman A.; Neven H.; Martinis J. M. Quantum Supremacy Using a Programmable Superconducting Processor. Nature 2019, 574, 505–510. 10.1038/s41586-019-1666-5. [DOI] [PubMed] [Google Scholar]
- Preskill J. Quantum Computing in the NISQ Era and Beyond. Quantum 2018, 2, 79. 10.22331/q-2018-08-06-79. [DOI] [Google Scholar]
- Tilly J.; Chen H.; Cao S.; Picozzi D.; Setia K.; Li Y.; Grant E.; Wossnig L.; Rungger I.; Booth G. H.; Tennyson J. The Variational Quantum Eigensolver: A Review of Methods and Best Practices. Phys. Rep. 2022, 986, 1–128. 10.1016/j.physrep.2022.08.003. [DOI] [Google Scholar]
- Kandala A.; Mezzacapo A.; Temme K.; Takita M.; Brink M.; Chow J. M.; Gambetta J. M. Hardware-Efficient Variational Quantum Eigensolver for Small Molecules and Quantum Magnets. Nature 2017, 549, 242–246. 10.1038/nature23879. [DOI] [PubMed] [Google Scholar]
- Arute F.; Arya K.; Babbush R.; Bacon D.; Bardin J. C.; Barends R.; Boixo S.; Broughton M.; Buckley B. B.; Buell D. A.; Burkett B.; Bushnell N.; Chen Y.; Chen Z.; Chiaro B.; Collins R.; Courtney W.; Demura S.; Dunsworth A.; Farhi E.; Fowler A.; Foxen B.; Gidney C.; Giustina M.; Graff R.; Habegger S.; Harrigan M. P.; Ho A.; Hong S.; Huang T.; Huggins W. J.; Ioffe L.; Isakov S. V.; Jeffrey E.; Jiang Z.; Jones C.; Kafri D.; Kechedzhi K.; Kelly J.; Kim S.; Klimov P. V.; Korotkov A.; Kostritsa F.; Landhuis D.; Laptev P.; Lindmark M.; Lucero E.; Martin O.; Martinis J. M.; McClean J. R.; McEwen M.; Megrant A.; Mi X.; Mohseni M.; Mruczkiewicz W.; Mutus J.; Naaman O.; Neeley M.; Neill C.; Neven H.; Niu M. Y.; O’Brien T. E.; Ostby E.; Petukhov A.; Putterman H.; Quintana C.; Roushan P.; Rubin N. C.; Sank D.; Satzinger K. J.; Smelyanskiy V.; Strain D.; Sung K. J.; Szalay M.; Takeshita T. Y.; Vainsencher A.; White T.; Wiebe N.; Yao Z. J.; Yeh P.; Zalcman A. Hartree-Fock on a Superconducting Qubit Quantum Computer. Science 2020, 369, 1084–1089. 10.1126/science.abb9811. [DOI] [PubMed] [Google Scholar]
- Biamonte J.; Wittek P.; Pancotti N.; Rebentrost P.; Wiebe N.; Lloyd S. Quantum Machine Learning. Nature 2017, 549, 195–202. 10.1038/nature23474. [DOI] [PubMed] [Google Scholar]
- Sajjan M.; Li J.; Selvarajan R.; Sureshbabu S. H.; Kale S. S.; Gupta R.; Singh V.; Kais S. Quantum Machine Learning for Chemistry and Physics. Chem. Soc. Rev. 2022, 51, 6475–6573. 10.1039/D2CS00203E. [DOI] [PubMed] [Google Scholar]
- Liu S. B. Homochirality originates from the handedness of helices. J. Phys. Chem. Lett. 2020, 11, 8690–8696. 10.1021/acs.jpclett.0c02144. [DOI] [PubMed] [Google Scholar]
- Liu S. B. Principle of chirality hierarchy in three-blade propeller systems. J. Phys. Chem. Lett. 2021, 12, 8720–8725. 10.1021/acs.jpclett.1c02433. [DOI] [PubMed] [Google Scholar]
- Rong C. Y.; Zhao D. B.; Yu D. H.; Liu S. B. Quantification and origin of cooperativity: Insights from density functional reactivity theory. Phys. Chem. Chem. Phys. 2018, 20, 17990–17998. 10.1039/C8CP03092H. [DOI] [PubMed] [Google Scholar]
- Rong C. Y.; Zhao D. B.; Zhou T. J.; Liu S. Y.; Yu D. H.; Liu S. B. Homogeneous molecular systems are positively cooperative but charged molecular systems are negatively cooperative. J. Phys. Chem. Lett. 2019, 10, 1716–1721. 10.1021/acs.jpclett.9b00639. [DOI] [PubMed] [Google Scholar]
- Liu S. B.; Rong C. Y. Quantifying frustrations for molecular complexes with noncovalent interactions. J. Phys. Chem. A 2021, 125, 4910–1947. 10.1021/acs.jpca.1c02690. [DOI] [PubMed] [Google Scholar]
- Ashby W. R.An Introduction to Cybernetics; Chapman & Hall: London, 1957. [Google Scholar]
- Fuchs C.Self-organization and Knowledge Management; Springer: New York, 2005. [Google Scholar]
- Heylighen F.Complexity and Evolution: Fundamental Concepts of a New Scientific Worldview; Vrije Universiteit Brussel: Brussels, 2018. [Google Scholar]
- Tabilo Alvarez J.; Ramirez-Correa P. A Brief Review of Systems, Cybernetics, and Complexity. Complexity 2023, 2023, 8205320. 10.1155/2023/8205320. [DOI] [Google Scholar]
- Barabási A.-L.; Albert R. Emergence of Scaling in Random Networks. Science 1999, 286, 509–512. 10.1126/science.286.5439.509. [DOI] [PubMed] [Google Scholar]
- Simões A. Dirac’s Claim and the Chemists. Phys. in Perspective 2002, 4, 253–266. 10.1007/s00016-002-8369-1. [DOI] [Google Scholar]

