2018 Jul 19;24(8):212. doi: 10.1007/s00894-018-3699-3

Information equilibria, subsystem entanglement, and dynamics of the overall entropic descriptors of molecular electronic structure

Roman F Nalewajski 1,
PMCID: PMC6061096  PMID: 30027486

Abstract

Overall descriptors of the information (determinicity) and entropy (uncertainty) content of complex molecular states are reexamined. These resultant concepts combine the classical (probability) contributions of Fisher and Shannon, and the relevant nonclassical supplements due to the state phase/current. The information-theoretic principles determining equilibria in molecules and their fragments are explored and the nonadditive part of the global entropy is advocated as a descriptor of the classical index of the quantum entanglement of molecular subsystems. Affinities associated with the probability and phase fluxes are identified and the criterion of vanishing overall information-source is shown to identify the system stationary electronic states. The production of resultant density of the gradient-information is expressed in terms of the conjugate affinities (forces, perturbations) and fluxes (currents, responses). The Schrödinger dynamics of probability and phase components of molecular electronic states is used to determine the temporal evolution of the overall gradient information and complex entropy. The global sources of the resultant information/entropy descriptors are shown to be of purely nonclassical origin, thus identically vanishing in real electronic states, e.g., the nondegenerate ground state of a molecule.

Keywords: Entropic principles/equilibria, Information theory, Probability/phase dynamics, Quantum entropy, Resultant entropy/information, Subsystem entanglement

Introduction

The electronic structure of molecules is embodied in their quantum states generating both the system particle density and current distributions. The continuity relation for the state probability density, which relates these two structural aspects of molecular wavefunctions, implies that the density dynamics is determined by the current’s divergence. To paraphrase Prigogine [1], while the electron density determines a static facet of “being”, the probability current reflects the state dynamic aspect of “becoming”. A general electronic wavefunction is a complex entity characterized by both its modulus and phase components. The square of the former determines the particle probability distribution marking the structure of “being”, while the gradient of the latter generates the state current density reflecting the structure of “becoming”. These two structural manifestations give rise to the associated classical and nonclassical contributions in the resultant measures of the information/entropy content in complex electronic states [2]. Both the probability and phase/current distributions carry partial information contributions to the resultant entropic content of the underlying quantum state of a molecule.

In entropic theories of molecular electronic structure, e.g., [2–5], one thus requires an appropriate quantum generalization [2] of the familiar classical descriptors of information theory (IT) [6–13] of the information content in the probability distribution. The quantum extensions [2] of the Fisher (gradient) [6, 7] and Shannon (global) [8, 9] measures, appropriate for complex amplitudes (wavefunctions) of molecular quantum mechanics (QM), combine the partial contributions due to probability (wavefunction modulus) and current (wavefunction phase) degrees-of-freedom. In the position representation the electron probability distribution p(r) alone generates the state classical amount of information, i.e., the information received from outcomes of incoherent (phase-unrelated) local events, outcomes of measurements of the particle position r. Their nonclassical complements in the resultant entropy/information measures, describing the coherent (phase-related) local events, generate the corresponding coherence entropy/information supplements [2, 14–20] due to the state phase ϕ(r) or the current density j(r) ∝ ∇ϕ(r). Similar generalized descriptors of both the overall information content and entropy-deficiency (information-distance) [10, 11] can be introduced in the momentum space [2, 21].

The classical IT [6–13], an important branch of applied probability theory, has already provided new insights into molecular electronic structure and generated useful descriptors of atoms in molecules, reactivity preferences, and patterns of chemical bonds, e.g., [2–5]. The classical information terms are conceptually related to modern density functional theory (DFT) [22–24]. They probe the entropic content of incoherent localization events, the outcomes of experiments measuring the particle position, while their nonclassical companions provide the information supplement due to the phase-coherence between such local events, which is inherent in general wavefunctions of QM. The familiar average information/entropy measures of Fisher and Shannon reflect only the information/entropy content in the system wavefunction due to the probability distribution, thus failing to distinguish states exhibiting the same electron density but different current compositions.

Therefore, in the quantum IT (QIT) description of the phase equilibria in molecular systems and their constituent fragments [2, 16–21, 25–29] one has to unite both the probability and phase/current aspects of the system quantum states, in order to fully characterize the overall information content in molecular wavefunctions, the equilibrium states of both the system as a whole and its constituent parts, a degree of the quantum entanglement (mutual bonding status) of subsystems, or the electron diffusion processes [28]. The recently introduced resultant IT descriptors combine the classical probability contributions with their respective nonclassical supplements due to the state phase/current. The densities of nonclassical information/entropy terms exhibit the same mutual relations as their classical analogs and they introduce the nonvanishing source terms into their respective continuity relations [2, 26]. They have been successfully used to establish the phase equilibria in molecules, and to distinguish the mutually bonded (phase-related, “entangled”) status of molecular fragments and reactants from its nonbonded (phase-unrelated, “disentangled”) analog [2, 27–31].

The complex global entropy [2, 25], the expectation value of a non-Hermitian operator, generates the probability and phase contributions in the resultant measure as its real and imaginary parts. This two-component (“vector”) extension satisfies the requirement that the classical dependence between the densities-per-electron of the ordinary Shannon and Fisher entropy/information measures also covers the interrelation between their nonclassical supplements. The phase-dependent concept of complex entropy will be related to the Shannon entropy of information theory and von Neumann’s entropy in density-matrix theory. The gradient entropy (indeterminicity-information) analog of the resultant Fisher (determinicity-information) descriptor has also been conjectured [2]. It combines the classical Fisher information with the negative nonclassical phase/current supplement. Indeed, the presence of a finite current introduces an additional structure, an “order” element, thus increasing the state information (determinicity) content and decreasing its entropy (uncertainty) property of electronic “disorder”.

Generalized information principles, formulated in terms of resultant IT descriptors, identify extrema of the global and gradient entropies, which mark the phase-equilibria in molecules [14–20]. Such states exhibit a “thermodynamic” phase-shift related to the logarithm of probability density. The phase transformation defining the system equilibrium wavefunction affects both the local probability source and the net entropy production in the associated continuity equations [2, 18, 26]. Similar equilibria can be determined in the “entangled” (bonded) and “disentangled” (nonbonded) states of molecular fragments [2, 27–31].

In this work relations between densities of the complex entropy and resultant information measures will be reexamined, and the phase and information equilibria, marking extrema of the resultant entropy and information measures, respectively, will be explored. The nonadditive part of the global entropy in the subsystem resolution will be advocated as a classical information descriptor of the quantum entanglement of the partition molecular fragments. The “vector” character of complex entropy density raises the natural question of what “scalar” function of its real (probability) and imaginary (phase) parts determines the information principle for establishing molecular phase-equilibria. This information quantity will be identified as the resultant gradient entropy.

The source term in the continuity relation for the overall gradient information will be examined and the underlying state affinities (“forces”) and fluxes (“responses”) will be identified; the equilibrium criterion of the vanishing information production will be shown to determine the stationary states of molecular QM. The time evolution of the QIT entropy/information measures will be explored using the dynamical equations for the probability and phase components of electronic wavefunctions implied by the molecular Schrödinger equation (SE). The time dependence of the resultant information/entropy will be expressed in terms of the state probability and phase degrees-of-freedom, and the nonclassical origins of these derivatives will be revealed.

Probability and phase components of electronic states

Let us consider a single electron (N = 1) at time t0 = 0 in state |ψ(t0)〉 ≡ |ψ(0)〉 ≡ |ψ〉 described by the complex wave function in position-representation,

ψ(r, 0) = ⟨r|ψ(0)⟩ = R(r) exp[iϕ(r)] ≡ ψ(r),   (1)

where R(r) and ϕ(r) stand for its modulus and phase parts. It determines the probability distribution,

p(r, 0) = ⟨ψ(0)|r⟩⟨r|ψ(0)⟩ ≡ ⟨ψ|ρ̂(r)|ψ⟩ = ψ*(r) ψ(r) = R(r)² ≡ p(r),   (2)

and its current

j(r, 0) = (2m)⁻¹ ⟨ψ(0)|[ρ̂(r) p̂ + p̂ ρ̂(r)]|ψ(0)⟩ ≡ ⟨ψ|ĵ(r)|ψ⟩ ≡ j(r) = [ħ/(2mi)] [ψ*(r)∇ψ(r) − ψ(r)∇ψ*(r)] = (ħ/m) Im[ψ*(r)∇ψ(r)] = (ħ/m) p(r)∇ϕ(r) ≡ p(r) V(r);   (3)

here the momentum operator p̂ is defined by its action on the wavefunction, ⟨r|p̂ψ⟩ = −iħ∇ψ(r), and the average velocity V(r) of the probability fluid, measuring the current-per-particle, reflects the state phase-gradient:

V(r) = j(r)/p(r) = (ħ/m) ∇ϕ(r).   (4)

The wavefunction modulus, the classical amplitude of the particle probability density, and the state phase or its gradient, determining the effective velocity and probability flux, thus constitute two fundamental degrees-of-freedom in the full quantum IT treatment of electronic states: ψ ⇔ (R, ϕ) ⇔ (p, j).
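The mapping ψ ⇔ (R, ϕ) ⇔ (p, j) can be illustrated with a minimal numerical sketch (all model values assumed, not taken from the paper): for a 1D Gaussian modulus with a linear phase, the wavefunction route to the current, Eq. (3), and the phase-gradient route, Eq. (4), must coincide.

```python
import numpy as np

# Sketch: psi(x) = R(x) exp[i phi(x)] on a grid, in assumed units hbar = m = 1.
hbar = m = 1.0
x = np.linspace(-6.0, 6.0, 4001)
dx = x[1] - x[0]
R = np.exp(-x**2 / 2.0) / np.pi**0.25        # Gaussian modulus (model choice)
phi = 0.3 * x                                # linear phase -> uniform velocity
psi = R * np.exp(1j * phi)

p = np.abs(psi)**2                           # probability density, Eq. (2)
j_wave = (hbar / m) * np.imag(np.conj(psi) * np.gradient(psi, x))   # Eq. (3)
j_phase = (hbar / m) * p * np.gradient(phi, x)                      # p V, Eq. (4)

assert np.allclose(j_wave, j_phase, atol=1e-4)
assert abs(np.sum(p) * dx - 1.0) < 1e-4      # unit normalization of p
```

The two discretized currents agree to grid accuracy, confirming that the modulus and phase exhaust the state degrees-of-freedom.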

One envisages the electron moving in the external potential v(r) due to the “frozen” nuclei of the molecule, described by the electronic Hamiltonian

Ĥ(r) = −(ħ²/2m)∇² + v(r) ≡ T̂(r) + v(r),   (5)

where T^r denotes its kinetic part. The quantum dynamics of a general electronic state |ψ(t)〉, giving rise to the associated wavefunction

ψ(r, t) = ⟨r|ψ(t)⟩ = R(r, t) exp[iϕ(r, t)] ≡ R(t) exp[iϕ(t)] ≡ ψ(t),   (6)

is generated by SE,

∂ψ(t)/∂t = (iħ)⁻¹ Ĥ ψ(t),   (7)

which also determines the temporal evolution of the state’s two physical components: the instantaneous probability density p(r, t) = |ψ(r, t)|² = R(r, t)² ≡ p(t), and the state phase ϕ(r, t) ≡ ϕ(t). The total time derivative of the former expresses the sourceless continuity relation for the probability distribution,

σp(r, t) ≡ dp(r, t)/dt = ∂p(r, t)/∂t + ∇·j(r, t) = ∂p(r, t)/∂t + V(r, t)·∇p(r, t) + p(r, t) ∇·V(r, t) = 0,  or   (8a)
∂p(r, t)/∂t = −∇·j(r, t) = −[ħ/(2mi)] [ψ*(r, t)∇²ψ(r, t) − ψ(r, t)∇²ψ*(r, t)] = −(ħ/m) [∇ϕ(r, t)·∇p(r, t) + p(r, t)∇²ϕ(r, t)].   (8b)

The total derivative,

dp(r, t)/dt = ∂p(r, t)/∂t + (dr/dt)·(∂p(r, t)/∂r) = ∂p(r, t)/∂t + V(r, t)·∇p(r, t),   (9)

determining the local probability “source” σp(r, t), has been interpreted above as the time rate of change in a moving infinitesimal volume element of the probability fluid, while the partial derivative ∂p(r, t)/∂t represents the corresponding rate at the fixed point in space. The probability continuity thus implies:

dp(r, t)/dt + p(r, t) ∇·V(r, t) = 0.   (10)

Thus, the vanishing probability source of Eq. (8a) also implies the vanishing divergence of the velocity field: ∇⋅V(r, t) = 0. This probability continuity equation also determines the dynamics of the state modulus component:

∂R(r, t)/∂t = −(ħ/m) [∇ϕ(r, t)·∇R(r, t) + (R(r, t)/2) ∇²ϕ(r, t)].   (11)

The particle effective velocity also determines the current concept associated with the state phase: J(r, t) = ϕ(r, t) V(r, t). The scalar field ϕ(r, t) and its conjugate current density J(r, t) determine a nonvanishing phase source [2] in the associated continuity equation:

σϕ(r, t) ≡ dϕ(r, t)/dt = ∂ϕ(r, t)/∂t + ∇·J(r, t) = ∂ϕ(r, t)/∂t + V(r, t)·∇ϕ(r, t) ≠ 0,  or
∂ϕ(r, t)/∂t − σϕ(r, t) = −∇·J(r, t) = −(ħ/m) {[∇ϕ(r, t)]² + ϕ(r, t)∇²ϕ(r, t)}.   (12)

The phase dynamics from SE,

∂ϕ/∂t = (ħ/2m) [R⁻¹∇²R − (∇ϕ)²] − v/ħ,   (13)

finally identifies the phase source:

σϕ = (ħ/2m) [R⁻¹∇²R + (∇ϕ)²] − v/ħ.   (14)

As an illustration consider the stationary wavefunction corresponding to energy Es,

ψs(r, t) = Rs(r) exp[iϕs(t)],  ϕs(t) = −(Es/ħ) t = −ωs t;  ps(r, t) = Rs(r)² = ps(r),  js(r, t) = Vs(r, t) = 0,   (15)

representing an eigenstate of the Hamiltonian:

Ĥ(r) Rs(r) = −(ħ²/2m)∇²Rs(r) + v(r) Rs(r) = Es Rs(r).   (16)

The phase dynamics of Eq. (13) then recovers the stationary SE and identifies a constant phase source:

∂ϕs/∂t = σϕ = (ħ/2m) Rs⁻¹∇²Rs − v/ħ = −ωs = const.   (17)
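The constancy of the stationary phase source in Eq. (17) can be checked numerically. As a sketch (the model and units are assumptions, not from the paper), take a 1D harmonic oscillator with ħ = m = ω = 1, whose ground-state modulus is Gaussian and whose energy is Es = ħω/2, so σϕ should equal −ωs = −1/2 at every point:

```python
import numpy as np

# Assumed 1D harmonic-oscillator ground state, units hbar = m = omega = 1.
hbar = m = omega = 1.0
x = np.linspace(-4.0, 4.0, 4001)
R = np.exp(-0.5 * m * omega * x**2 / hbar)   # unnormalized modulus suffices
v = 0.5 * m * omega**2 * x**2                # external potential

Rpp = np.gradient(np.gradient(R, x), x)      # numerical d^2R/dx^2
sigma_phi = (hbar / (2 * m)) * Rpp / R - v / hbar   # Eq. (17)

interior = sigma_phi[200:-200]               # avoid one-sided edge stencils
assert np.allclose(interior, -0.5 * omega, atol=1e-3)
```

The phase source is indeed spatially constant and equal to −Es/ħ, as the stationary SE requires.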

Resultant information/entropy concepts and uncertainty principle

At a given instant t = t0 the average Fisher [6] measure of the classical gradient information for locality events, contained in the molecular probability density p(r) = R(r)2, is reminiscent of von Weizsäcker’s [32] inhomogeneity correction to the kinetic energy functional in Thomas-Fermi theory,

I[p] = ∫ p(r) [∇ln p(r)]² dr = ∫ [∇p(r)]²/p(r) dr ≡ ∫ p(r) Ip(r) dr = 4 ∫ [∇R(r)]² dr ≡ I[R];   (18)

here I(r) ≡ p(r) Ip(r) denotes the functional density and Ip(r) stands for the associated density-per-electron. The amplitude form I[R] reveals that this classical descriptor measures the average length of the modulus gradient ∇R. This classical, probability descriptor characterizes an effective “narrowness” of p(r), i.e., a degree of determinicity of the particle position.

The classical Shannon (S) [8] descriptor of the global entropy in p(r),

S[p] = −∫ p(r) ln p(r) dr ≡ ∫ p(r) Sp(r) dr,   (19)

similarly reflects the distribution “spread” (uncertainty), i.e., a degree of the position indeterminacy. It also provides the amount of information received, when this uncertainty is removed by an appropriate particle-localization experiment: IS[p] ≡ S[p]. The densities-per-electron of these complementary information and entropy functionals are seen to satisfy the classical relation:

Ip(r) = [∇Sp(r)]².   (20)
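The equivalence of the three classical forms implied by Eqs. (18)–(20) can be sketched numerically. Assuming a 1D Gaussian p(x) of variance σ² (a model choice, not from the paper), all three discretized expressions should reproduce the known Fisher information 1/σ²:

```python
import numpy as np

sigma = 1.3
x = np.linspace(-10 * sigma, 10 * sigma, 8001)
dx = x[1] - x[0]
p = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
R = np.sqrt(p)            # classical amplitude
S_p = -np.log(p)          # Shannon entropy density-per-electron

I_grad = np.sum(np.gradient(p, x)**2 / p) * dx      # Eq. (18), gradient form
I_ampl = 4 * np.sum(np.gradient(R, x)**2) * dx      # Eq. (18), amplitude form I[R]
I_entr = np.sum(p * np.gradient(S_p, x)**2) * dx    # via the relation of Eq. (20)

for I in (I_grad, I_ampl, I_entr):
    assert abs(I - 1.0 / sigma**2) < 1e-3
```

The agreement illustrates that Eq. (20) links the Shannon entropy density to the Fisher information density already at the classical level.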

These probability functionals of the classical information/entropy content generalize naturally into the corresponding resultant quantum descriptors combining the probability and phase/current contributions to the overall entropy/information descriptors of the electronic state |ψ⟩ [2, 14–20]. Such generalized concepts are applicable to complex wavefunctions of molecular QM. They are defined as average values of the associated operators: the Hermitian operator of the gradient information [33] related to the kinetic energy operator T̂(r),

Î(r) = −4∇² = (2i∇)² = (8m/ħ²) T̂(r),   (21)

and the non-Hermitian (multiplicative) operator of the state complex entropy [25],

Ŝψ(r, t) = −2 ln ψ(r, t) ≡ Sψ(r, t) = −ln p(r, t) − 2iϕ(r, t).   (22)

Therefore, their quantum expectation values in state |ψ〉 give rise to real and complex average IT descriptors, respectively.

The overall gradient information combines the classical (probability) and nonclassical (phase/current) contributions:

I[ψ] = ⟨ψ|Î|ψ⟩ = 4 ∫ |∇ψ(r)|² dr = I[p] + 4 ∫ p(r) [∇ϕ(r)]² dr ≡ I[p] + I[ϕ].   (23)

This resultant (real) gradient information is proportional to the state average kinetic energy:

T[ψ] = ⟨ψ|T̂|ψ⟩ = (ħ²/8m) I[ψ].

The resultant complex (“vector”) measure of the state global entropy is similarly determined by its real (classical) and imaginary (nonclassical) contributions:

H[ψ] = ⟨ψ|Ŝψ|ψ⟩ = −∫ p(r) [ln p(r) + 2iϕ(r)] dr = S[p] + iS[ϕ],  S[ϕ] = −2 ∫ p(r) ϕ(r) dr.   (24)

The densities-per-electron of these functionals,

Iψ(r) = Ip(r) + [2∇ϕ(r)]² ≡ Ip(r) + Iϕ(r) = Ip(r) + [(2m/ħ) j(r)/p(r)]² = Ip(r) + [(2m/ħ) V(r)]² ≡ Ip(r) + Ij(r)   (25)

and

Hψ(r) = Sp(r) − 2iϕ(r) ≡ Sp(r) + iSϕ(r),  Sϕ(r) = −2ϕ(r),   (26)

now satisfy the complex generalized relation

Iψ(r) = |∇Hψ(r)|² = [∇Sp(r)]² + [∇Sϕ(r)]².   (27)

One also introduces the resultant gradient entropy [2], the state local uncertainty descriptor (indeterminicity-information),

M[ψ] = I[p] − I[ϕ] = ∫ p(r) [Ip(r) − Iϕ(r)] dr ≡ ∫ p(r) Mψ(r) dr,   (28)

exhibiting a nonpositive phase supplement Mϕ(r) = −Iϕ(r) in its overall density-per-electron: Mψ(r) ≡ Mp(r) + Mϕ(r) = Ip(r) − Iϕ(r). Indeed, the presence of a finite probability current j(r) ≠ 0, generated by the local phase ϕ(r) > 0, implies an additional “structure” element of the current distribution in the electronic state, thus increasing its resultant information (“order”) density, Iϕ(r) > 0, and lowering the associated entropy (“disorder”) contribution: Mϕ(r) < 0.

The global entropy of probability distribution has also been generalized in the resultant “scalar” measure of the uncertainty content in the specified quantum state ψ [2]:

S[ψ] = ⟨ψ|Ŝ|ψ⟩ = S[p] + S[ϕ] = S[p] − 2⟨ϕ⟩ψ.   (29)

It represents the expectation value of the Hermitian operator of the scalar measure of resultant global entropy,

Ŝ(r) = −[ln p(r) + 2ϕ(r)] = −2 [ln R(r) + ϕ(r)],   (30)

and combines the classical contribution S[p] ≥ 0 of Shannon with its nonclassical supplement S[ϕ] = −2⟨ϕ⟩ψ ≤ 0 reflecting the average phase in state |ψ⟩: ⟨ϕ⟩ψ = ∫ p(r) ϕ(r) dr ≥ 0.

To summarize, the modulus (probability) and phase (current) components of electronic states are both accounted for in the resultant measures of the gradient or global descriptors of the information/entropy content in generally complex wavefunctions of molecular QM. These overall descriptors combine the familiar classical functionals of the system probability density and their nonclassical supplements due to the state phase or its current density. Their densities-per-electron satisfy classical relations linking the gradient and global descriptors, appropriately generalized to cover the complex character of electronic states. The Hermitian operator Î(r) gives rise to the real expectation value of the state content of resultant determinicity information I[ψ], related to the average kinetic energy T[ψ], while the non-Hermitian entropy operator Ŝψ(r) generates the complex average measure H[ψ] of the global uncertainty in ψ. The classical and nonclassical densities-per-electron of the resultant gradient information and the overall global entropy then separately obey the classical relations:

Ip(r) = [∇Hp(r)]²  and  Iϕ(r) = |∇Hϕ(r)|² = |i∇Sϕ(r)|² = [∇Sϕ(r)]².   (31)

The squared gradient of the classical and nonclassical components in the Shannon-type entropy density is thus seen to determine densities of the associated contributions to the resultant Fisher-type information:

Î(r) = ∇Ŝψ(r)*·∇Ŝψ(r) = |∇Ŝψ(r)|² = [∇ln p(r)]² + [2∇ϕ(r)]² = [∇p(r)/p(r)]² + 4[∇ϕ(r)]² ≥ 0.   (32)

Therefore, the gradient of complex entropy can be regarded as the quantum amplitude of the resultant information content. In other words, Ŝψ(r) appears as the “square root” of Î(r). This development is thus in the spirit of the quadratic approach of Prigogine [1].

The (Hermitian) operator of the gradient entropy is seen to involve the sum of the ordinary squares of gradients of the operator components:

M̂ψ(r) = [∇Ŝp(r)]² + [∇Ŝϕ(r)]² = [∇ln p(r)]² − 4[∇ϕ(r)]².   (33)

This relation establishes the scalar information principle for determining the phase-equilibria [2, 14–20],

ψeq.(r) = R(r) exp{i[ϕ(r) + ϕeq.(r)]} ≡ R(r) exp[iΦeq.(r)],   (34)

corresponding to the “thermodynamic” phase shift ϕeq.(r) ≥ 0. More specifically, the extremum of M[ψ] with respect to ψ*, ⟨δψ|M̂ψ|ψ⟩ = 0, gives the Euler equation

[∇ln p(r)]² = 4[∇ϕeq.(r)]²,   (35)

which identifies the equilibrium phase

ϕeq.(r) = −(1/2) ln p(r) ≥ 0.   (36)

The same optimum solution follows from the extremum rule for S[ψ], ⟨δψ|Ŝ|ψ⟩ = 0, or

ln p(r) + 2ϕeq.(r) = 0.   (37)

It should be observed, however, that the associated extremum principle for the resultant gradient information I[ψ], 〈δψ|I^|ψ〉 = 0, predicts a pure-imaginary optimum phase ϕopt.(r) = iϕeq.(r).
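The equilibrium-phase solution of Eqs. (35)–(36) is easy to verify numerically. As a sketch (the 1D Gaussian probability is an assumed model), the “thermodynamic” phase ϕeq. = −(1/2) ln p should satisfy the Euler equation identically at every grid point:

```python
import numpy as np

# Assumed model distribution: normalized 1D Gaussian.
x = np.linspace(-5.0, 5.0, 2001)
p = np.exp(-x**2) / np.sqrt(np.pi)
phi_eq = -0.5 * np.log(p)                    # equilibrium phase, Eq. (36)

lhs = np.gradient(np.log(p), x)**2           # [d(ln p)/dx]^2, Eq. (35) l.h.s.
rhs = 4.0 * np.gradient(phi_eq, x)**2        # 4 [d(phi_eq)/dx]^2, r.h.s.
assert np.allclose(lhs, rhs, atol=1e-9)
```

Since ϕeq. is exactly proportional to ln p, the two sides of Eq. (35) coincide to rounding error for any smooth p(r).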

It is also of interest to examine how the familiar Heisenberg (“indeterminicity”) inequality for the product of squared dispersions in the particle position r and momentum p,

[⟨δr⟩ψ]² [⟨δp⟩ψ]² ≥ ħ²/4,  [⟨δx⟩ψ]² = ⟨ψ|x̂²|ψ⟩ − ⟨ψ|x̂|ψ⟩²;  x = r, p,

translates into the probability (modulus) and current (phase) components of a general wavefunction of Eq. (1). The explicit form of the momentum operator in position representation, p̂(r) = −iħ∇, gives the following expression for the momentum factor,

[⟨δp⟩ψ]² = ħ² {[⟨δ∇ϕ⟩ψ]² + (1/4) I[p]},

involving the classical Fisher information I[p]. Here, [⟨δ∇ϕ⟩ψ]² = ⟨(∇ϕ)²⟩ψ − [⟨∇ϕ⟩ψ]² denotes the squared dispersion in the phase gradient,

[⟨δ∇ϕ⟩ψ]² = ∫ p (∇ϕ)² dr − [∫ p ∇ϕ dr]² = (1/4) I[ϕ] − [(m/ħ) ∫ j dr]² = (m/ħ)² {⟨V²⟩ψ − [⟨V⟩ψ]²} = (m/ħ)² [⟨δV⟩ψ]²,

related to the average velocity, ⟨V⟩ψ = ∫ p V dr = ∫ j dr, and the state nonclassical gradient information I[ϕ]. The squared dispersion of the particle momentum can thus be expressed in terms of the state resultant information content I[ψ],

[⟨δp⟩ψ]² = ħ² {(1/4) I[ψ] − [(m/ħ) ⟨V⟩ψ]²}.

The Heisenberg uncertainty relation then reads:

[⟨δr⟩ψ]² {I[ψ] − [(2m/ħ) ⟨V⟩ψ]²} ≥ 1.
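The closing inequality can be sketched numerically. Assuming a 1D Gaussian packet with a linear phase (constant velocity) in units ħ = m = 1 (all model choices, not from the paper), the position variance is σ², I[p] = 1/σ², and the nonclassical term I[ϕ] = 4k² is exactly cancelled by the average-velocity correction, so the product should saturate the bound:

```python
import numpy as np

hbar = m = 1.0
sigma, k = 0.8, 0.4                           # assumed width and phase slope
x = np.linspace(-10.0, 10.0, 8001)
dx = x[1] - x[0]
p = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
phi = k * x

var_r = np.sum(p * x**2) * dx - (np.sum(p * x) * dx)**2     # [<dr>]^2
I_p = np.sum(np.gradient(p, x)**2 / p) * dx                 # classical I[p]
I_phi = 4 * np.sum(p * np.gradient(phi, x)**2) * dx         # nonclassical I[phi]
V_avg = (hbar / m) * np.sum(p * np.gradient(phi, x)) * dx   # <V>

product = var_r * (I_p + I_phi - (2 * m / hbar * V_avg)**2)
assert abs(product - 1.0) < 1e-3              # Gaussian saturates the bound
```

A non-Gaussian modulus or a nonlinear phase would give a product strictly above 1, in accordance with the resultant-information form of the uncertainty relation.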

Entanglement entropy of molecular fragments

Consider a division of the electron density ρ(r) in a molecular system M = A—B containing N = ∫ρ(r)dr electrons in the fragment distributions ρ(r) = {ρX(r)} corresponding to the complementary subsystems A and B:

ρ = ρA + ρB,  ∫ ρX dr = NX,  NA + NB = N.   (38)

For example, these fragments can represent atoms-in-molecules (AIM) or their collections, functional groups, reactants, etc. This partition also applies to the associated division of the probability (shape factor) distribution p(r) = ρ(r)/N, ∫p(r)dr = 1,

p = ρA/N + ρB/N ≡ πA + πB = (NA/N)(ρA/NA) + (NB/N)(ρB/NB) ≡ PA pA + PB pB.   (39)

Here the vectors π(r) = {πX(r)} and p(r) = {pX(r)} combine the fragment probability densities, unity normalized within the whole molecule and in individual subsystems, respectively,

∑X ∫ πX(r) dr = ∑X PX = 1,  ∫ pX(r) dr = 1,  X = A, B,   (40)

while P = {PX = ∫πX(r)dr = NX/N} contains the condensed probabilities of these constituent parts of M: PA + PB = 1.

These overall and subsystem probabilities generate the classical Shannon entropies reflecting the corresponding classical uncertainty descriptors. The global entropy of the molecule as a whole also defines the total entropy in division p,

S[p] = −∫ p(r) ln p(r) dr ≡ Stotal[p] ≥ 0,   (41)

while the component probabilities define the indeterminicity measures of the probability density in individual fragments:

S[pX] = −∫ pX(r) ln pX(r) dr ≡ SX,  X = A, B.   (42)

Together they determine the additive entropy for this partitioning,

Sadd.[p] = PA SA + PB SB ≥ 0,   (43)

and hence the associated nonadditive part of Stotal[p] = S[p]:

Snadd.[p] = Stotal[p] − Sadd.[p].   (44)

The latter can also be expressed as the weighted average of entropy deficiencies [10,11] in the fragment probability densities relative to the molecular distribution,

ΔS[pX|p] = ∫ pX(r) ln[pX(r)/p(r)] dr ≥ 0,   (45)

measuring the corresponding information distances:

Snadd.[p] = PA ΔS[pA|p] + PB ΔS[pB|p] ≥ 0.   (46)
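The identity of Eq. (46) follows directly from Eqs. (41)–(45) and can be sketched numerically. Assuming a 1D two-Gaussian model of the fragment probabilities (all parameters are illustrative, not from the paper), the nonadditive entropy of the division must equal the probability-weighted sum of the fragment entropy deficiencies:

```python
import numpy as np

x = np.linspace(-12.0, 12.0, 12001)
dx = x[1] - x[0]
g = lambda mu, s: np.exp(-(x - mu)**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))

P_A, P_B = 0.6, 0.4                      # condensed fragment probabilities
p_A, p_B = g(-2.0, 1.0), g(2.0, 1.5)     # unity-normalized fragment densities
p = P_A * p_A + P_B * p_B                # molecular probability, Eq. (39)

S = lambda q: -np.sum(q * np.log(q)) * dx            # Shannon entropy
S_nadd = S(p) - (P_A * S(p_A) + P_B * S(p_B))        # Eqs. (43)-(44)
dS = lambda q: np.sum(q * np.log(q / p)) * dx        # entropy deficiency, Eq. (45)

assert abs(S_nadd - (P_A * dS(p_A) + P_B * dS(p_B))) < 1e-8   # Eq. (46)
assert S_nadd >= 0.0
```

The nonnegativity reflects the concavity of the Shannon entropy: the entropy of a mixture never falls below the weighted average of the component entropies.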

The overall entropy in molecular electron density,

S[ρ] = −∫ ρ(r) ln ρ(r) dr = −∑X ∫ ρX(r) ln ρ(r) dr ≡ Stotal[ρ],   (47)

and its additive contribution,

Sadd.[ρ] = −∑X ∫ ρX(r) ln ρX(r) dr,   (48)

give the associated nonadditive component:

Snadd.[ρ] = Stotal[ρ] − Sadd.[ρ] = ∑X ΔS[ρX|ρ] ≡ ΔSadd.[ρ|ρ].   (49)

This nonadditive entropy thus measures the additive (molecularly-referenced) entropy deficiency in electron densities. It reflects the average information distance between the fragment and molecular densities. It can be used to describe the information similarity between constituent parts and the whole system: the smaller this missing information, the more the two fragments resemble the molecule [3, 3436].

The IT descriptors of Eqs. (46) and (49) can be regarded as measures of the “binding” entropy in the mutually-open (entangled) subsystems for the specified molecular state Ψ yielding ρ, denoted as Ψ→ρ, in the bonded (molecular) composite system

Mopen(Ψ) = [A(Ψ) ¦ B(Ψ)] ≡ [A ¦ B](Ψ) = M,   (50)

since the additive entropies of Eqs. (43) and (48) characterize the mutually-closed (disentangled) subsystems in the nonbonded (“promolecular”) reference [27, 29, 31]

Mclosed = [A⁺(ρA) | B⁺(ρB)] ≡ (A⁺|B⁺).   (51)

Above the mutual “bonding” and “non-bonding” character of the two fragments has been denoted by the vertical broken- and solid-lines, respectively, which separate these subsystems in the corresponding composite system.

One further observes that the molecularly-referenced additive information-distance of Eq. (49) supplemented by the local constraint of Eq. (38), of conserving the molecular electron density in the partition, gives the variational similarity criterion

δ{ΔSadd.[ρ|ρ] − ∑X ∫ λ(r) ρX(r) dr} = 0,   (52)

establishing the equal division of ρ between the two subsystems: ρX = ρ/2, X = A, B. It has been shown elsewhere [3, 3436], however, that the information variational rule in terms of the nonadditive entropy-deficiency referenced to the densities ρ0 = {ρX0} of separate subsystems,

δ{ΔSnadd.[ρ|ρ⁰] − ∑X ∫ λ⁰(r) ρX(r) dr} = 0,   (53)

generates the Hirshfeld [37] (“stockholder”) pieces of the molecular density:

ρXH = ρX⁰ (ρ/ρ⁰) ≡ ρX⁰ w ≡ ρ dX⁰,  or  dX⁰ ≡ ρX⁰/ρ⁰ = ρXH/ρ ≡ dXH.   (54)

The optimum pieces of the molecular density can thus be regarded either as the local molecular enhancement w(r) = ρ(r)/ρ0(r) of the subsystem density ρX0, or as the promolecular share dX0(r) = ρX0(r)/ρ0(r) in the molecular density ρ(r). The promolecular distribution ρ0(r) = ∑X ρX0(r) is determined by the separate-fragment densities shifted to their actual positions in the molecule [3, 37]. Here,

ΔSnadd.[ρ|ρ⁰] = ΔStotal[ρ|ρ⁰] − ΔSadd.[ρ|ρ⁰] = ∑X ∫ ρX(r) ln[dX⁰(r)/dX(r)] dr = ∑X ∫ ρX(r) ln[w(r)/wX(r)] dr,
wX(r) = ρX(r)/ρX⁰(r),  ΔStotal[ρ|ρ⁰] = ΔS[ρ|ρ⁰],  ΔSadd.[ρ|ρ⁰] = ∑X ΔS[ρX|ρX⁰].   (55)

The “stockholder” fragments thus exhibit the maximum information similarity to their isolated (promolecular) analogs, giving rise to the minimum of the relevant entropy-deficiency (missing-information) descriptors [3, 34–36]. A reference to Eq. (54) indicates that for this particular division scheme the nonadditive missing information of Eq. (55) exactly vanishes [3]:

ΔSnadd.[ρH|ρ⁰] = 0,   (56)

since wXH = w and dXH = dX⁰.

One further observes that expressing ρX(r) in ΔSnadd.[ρ|ρ0] as ρ(r) dX(r) gives

ΔSnadd.[ρ|ρ⁰] = −∫ ρ(r) {∑X dX(r) ln[dX(r)/dX⁰(r)]} dr ≡ −∫ ρ(r) ΔSadd.[d(r)|d⁰(r)] dr ≤ 0,   (57)

since both the local density ρ(r) and local additive information-distance ΔSadd.[d(r)|d0(r)] are separately nonnegative. Therefore, the Hirshfeld subsystems also result from the maximum information principle for the nonadditive entropy deficiency:

maxρ ΔSnadd.[ρ|ρ⁰] = ΔSnadd.[ρH|ρ⁰] = 0.   (58)

The stockholder pieces of the molecular density thus exhibit the maximum nonadditivity relative to the promolecular distributions in the separate subsystems.
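The stockholder construction of Eqs. (54)–(58) can be sketched numerically. Assuming a 1D promolecule built from two Gaussian free-fragment densities and a model molecular density proportional to it (all parameters illustrative, not from the paper), the Hirshfeld pieces ρXH = dX⁰ρ should make the nonadditive entropy deficiency vanish, while any other admissible division of the same ρ gives a negative value:

```python
import numpy as np

x = np.linspace(-10.0, 10.0, 8001)
dx = x[1] - x[0]
g = lambda mu, s, n: n * np.exp(-(x - mu)**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))

rho_A0, rho_B0 = g(-1.5, 1.0, 2.0), g(1.5, 1.2, 3.0)   # free-fragment densities
rho0 = rho_A0 + rho_B0                                  # promolecule
rho = 1.05 * rho0                                       # model molecular density

d_A0, d_B0 = rho_A0 / rho0, rho_B0 / rho0               # promolecular shares
rho_AH, rho_BH = d_A0 * rho, d_B0 * rho                 # stockholder pieces, Eq. (54)

def dS_nadd(rA, rB):
    # Delta S_total - Delta S_add of Eq. (55), for a division rho = rA + rB
    total = np.sum((rA + rB) * np.log((rA + rB) / rho0)) * dx
    add = np.sum(rA * np.log(rA / rho_A0) + rB * np.log(rB / rho_B0)) * dx
    return total - add

assert abs(dS_nadd(rho_AH, rho_BH)) < 1e-10             # Eq. (56)
assert dS_nadd(0.6 * rho, 0.4 * rho) < 0.0              # any other split, Eq. (58)
```

The second assertion illustrates Eq. (57): for constant shares the local information distance between d(r) and d⁰(r) is positive almost everywhere, so the nonadditive deficiency drops below its Hirshfeld maximum of zero.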

Let us now examine the entanglement entropy of molecular fragments in the phase-equilibrium state of Eq. (34), which maximizes the resultant entropy combining contributions from the modulus (probability) and phase (current) components [14–20]. Consider the single-electron (orbital) state ψ[p] = R exp(iϕ), ϕ ≥ 0, with p = |ψ|² = R² = ρ. The overall global entropy contains the negative nonclassical (phase) contribution S[ϕ] proportional to the state average phase ⟨ϕ⟩. The equilibrium, phase-transformed state in Mopen[ψp],

ψeq.[p] = exp{iϕeq.[p]} ψ[p] = R exp{iΦeq.[p]},   (59)

exhibits the resultant local phase

Φeq.[p] = ϕ[ψ[p]] + ϕeq.[p] ≡ ϕ[p] + ϕeq.[p],   (60)

identified by the optimum “thermodynamic” phase-shift of Eq. (36) related to the system probability density:

ϕeq.[p](r) = −(1/2) ln p(r) ≡ ϕM(r).   (61)

In such an entangled state of subsystems, in the bonded (molecular) reference system Meq. = (A*¦B*)eq., the resultant phase Φeq.[p] also characterizes each mutually-open (nonadditive) fragment X* related to a common molecular “ancestor” state ψ[p]:

ϕeq.[p] = ϕeq.,X[p],  X = A, B.   (62)

The phase transformation of Eq. (59) generates the extra current contribution proportional to the probability gradient:

j[ϕeq.[p]] = (ħ/m) p ∇ϕeq.[p] = −(ħ/2m) ∇p ≡ jeq.[p].   (63)

The equilibrium states of the mutually-closed (additive) subsystems {Xeq.+} in Meq.+= (Aeq.+|Beq.+) are similarly described by the respective “thermodynamic” phase-shifts marking their own (internal) equilibria:

ϕeq.[pX](r) = −(1/2) ln pX(r) ≡ ϕX⁺[pX],  X = A, B.   (64)

They generate the associated currents in molecular fragments,

j[ϕX⁺[pX]] = (ħ/m) pX ∇ϕX⁺[pX] = −(ħ/2m) ∇pX ≡ jX⁺,   (65)

and the resultant average current:

j(A⁺B⁺) = ∑X PX jX⁺ = −(ħ/2m) ∑X PX ∇pX = −(ħ/2m) ∇p ≡ j⁺.   (66)

In such a fragment resolution the overall entropy of Meq. = (A*¦B*)eq. in the phase-equilibrium state ψeq.[p] thus reads:

Stotal[ψeq.[p]] = S[Meq.*] = S[ψeq.[p]] = S[ψ[p]] + S[ϕM[p]] = (S[p] + S[ϕ[ψ[p]]]) + S[ϕM[p]] = S[ϕ[ψ[p]]],   (67)

since S[p] and

S[ϕM[p]] = −2 ∫ p(r) ϕM(r) dr = −S[p]   (68)

cancel each other. One also observes that the additive equilibrium component, describing the mutually nonbonded fragments in Meq.+, exactly vanishes:

Sadd.[ψeq.] = S[Meq.⁺] = −∑X PX ∫ pX {ln pX + 2ϕX⁺[pX]} dr = 0.   (69)

Therefore, the nonadditive part of the resultant entropy in the phase-transformed, equilibrium state ψeq.[p] is determined by the phase-entropy S[ϕ[p]], due to the phase of the original quantum state ψ[p], a common molecular “ancestor” of both the entangled subsystems in M:

Snadd.[ψeq.] = Stotal[ψeq.] − Sadd.[ψeq.] = S[ϕ].   (70)

Indeed, the phenomenon of quantum entanglement [28, 30, 31], for the given electron distribution in the molecule as a whole, has an exclusively nonclassical (phase/current) origin. This entropic descriptor of the fragment entanglement vanishes for the nondegenerate ground state ψ = ψ0, when the probability “degree-of-freedom” alone exactly identifies the molecular electronic state: ϕ = ϕ0 = 0, and hence j = j0 = 0. This result is in accordance with the basic theorems of DFT [22], which predict that all physical properties in such a “classical” (real) state are determined by the molecular electron density alone. Therefore, when the molecular current exactly vanishes, a division of the molecular density, a “static” distribution of electrons, into the equilibrium fragment pieces amounts to a classical partition into a collection of disentangled subsystems, which is devoid of any phase (coherence) content.
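The cancellations behind Eqs. (67)–(70) can be sketched numerically. Assuming a 1D two-Gaussian orbital model with an illustrative nonnegative ancestor phase (none of these choices come from the paper), the total resultant entropy of the phase-equilibrium state should reduce to S[ϕ], while the additive part over the internally-equilibrated closed fragments, Eq. (69), vanishes identically:

```python
import numpy as np

x = np.linspace(-12.0, 12.0, 12001)
dx = x[1] - x[0]
g = lambda mu, s: np.exp(-(x - mu)**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))

P_A, P_B = 0.5, 0.5
p_A, p_B = g(-2.0, 1.0), g(2.0, 1.0)          # fragment probabilities
p = P_A * p_A + P_B * p_B                     # molecular probability
phi = 0.1 * x**2                              # assumed ancestor phase, phi >= 0

S_p = -np.sum(p * np.log(p)) * dx             # classical S[p]
S_phi = -2.0 * np.sum(p * phi) * dx           # nonclassical S[phi]
phi_M = -0.5 * np.log(p)                      # equilibrium phase shift, Eq. (61)

# S[p] + S[phi] + S[phi_M] collapses to S[phi], since S[phi_M] = -S[p], Eq. (68)
S_total_eq = S_p + S_phi + (-2.0 * np.sum(p * phi_M) * dx)
assert abs(S_total_eq - S_phi) < 1e-10        # Eq. (67)

# additive part over closed fragments with phi_X+ = -(1/2) ln p_X, Eq. (69)
S_add_eq = -sum(P * np.sum(q * (np.log(q) + 2 * (-0.5 * np.log(q)))) * dx
                for P, q in ((P_A, p_A), (P_B, p_B)))
assert abs(S_add_eq) < 1e-12
```

The nonadditive remainder thus equals S[ϕ] of the common molecular “ancestor”, and it vanishes whenever ϕ = 0, i.e., for a real (current-free) state.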

Affinities, fluxes, information production, and equilibrium

We now reexamine a related problem of the source term in the associated continuity equation for the resultant gradient information, generated by the coupled probability p(r, t) = ψ(r, t) ψ*(r, t) = R(r, t)² and phase ϕ(r, t) = (2i)⁻¹ ln[ψ(r, t)/ψ*(r, t)] components of the molecular electronic state ψ(r, t). It can be approached using the standard treatment of irreversible thermodynamics [38].

Before addressing the problem of a production of the resultant information let us briefly examine the continuity of the classical gradient information I[p] of Eq. (18). Its functional derivative

Fpclass.(r) = δI[p]/δp(r) = [∇p(r)/p(r)]² − 2∇²p(r)/p(r) = −4∇²R(r)/R(r),   (71)

defines the local probability “intensity” of the classical information, which determines the functional differential dI[p] = ∫ Fpclass.(r) δp(r) dr, the Fisher information current JIclass.(r) = Fpclass.(r) j(r), and its divergence:

∇·JIclass.(r) = ∇Fpclass.(r)·j(r) + Fpclass.(r) ∇·j(r),   (72)

and derivative of the functional density:

∂I(r, t)/∂t = Fpclass.(r) ∂p(r, t)/∂t.   (73)

Taking into account the probability continuity then gives the information source derivative:

σIclass.(r) = ∂I(r, t)/∂t + ∇·JIclass.(r) = ∇Fpclass.(r)·j(r) ≡ Gpclass.(r)·j(r).   (74)

This product of the classical probability “affinity” Gpclass.(r) and “flux” j(r) is thus seen to identically vanish for an r-independent phase, i.e., the zero current, e.g., in the stationary state of Eq. (15). Turning now to the continuity problem of the resultant gradient information, one again recognizes the continuity relations of Eqs. (8) and (12) for the independent (instantaneous) probability and phase parameters of a general, complex wavefunction of Eq. (6),

σp = dp/dt = ∂p/∂t + ∇·j = 0  and  σϕ = dϕ/dt = ∂ϕ/∂t + ∇·J ≠ 0,   (75)

expressing the associated dynamical equations resulting from SE:

∂p/∂t = −(ħ/m) [∇p·∇ϕ + p∇²ϕ] = −∇·j  and  ∂ϕ/∂t = (ħ/2m) [R⁻¹∇²R − (∇ϕ)²] − v/ħ;   (76)

here j ≡ Jp = pV denotes the probability current and J ≡ Jϕ = ϕV stands for the phase-flux density, measuring the phase transported through unit area per unit time, and the phase source σϕ is defined in Eq. (14). The local resultant intensities (per unit volume) F(r) = [Fp(r), Fϕ(r)] ≡ {Fk(r)}, associated with the probability and phase components x(r) = [p(r), ϕ(r)] ≡ {xk(r)} and their currents J(r) = [Jp(r), Jϕ(r)] ≡ {Jk(r)}, are again given by the corresponding (partial) functional derivatives of I[ψ] = I[p, ϕ] with respect to these state parameters:

Fp(r) = [∂I[p, ϕ]/∂p(r)]ϕ = [∇p(r)/p(r)]² − 2∇²p(r)/p(r) + 4[∇ϕ(r)]² = 4{[∇ϕ(r)]² − ∇²R(r)/R(r)},
Fϕ(r) = [∂I[p, ϕ]/∂ϕ(r)]p = −8[∇p(r)·∇ϕ(r) + p(r)∇²ϕ(r)].   (77)

They determine the differential of this resultant gradient information,

dI[p, ϕ] = ∫ [Fp(r) δp(r) + Fϕ(r) δϕ(r)] dr,   (78)

and suggest the associated information current

JI(r) = Fp(r) Jp(r) + Fϕ(r) Jϕ(r) ≡ ∑k Fk(r) Jk(r),   (79)

measuring the regional resultant information transported through unit area per unit time.

The rate of the local production of the resultant gradient information combines the rate of accumulation of information within an infinitesimal volume and the rate of the information outflow through its surface,

σI(r) ≡ dI(r)/dt = ∂I(r)/∂t + ∇⋅JI(r), 80

where:

∂I(r)/∂t = Σk Fk(r) ∂xk(r)/∂t = Fp(r) ∂p(r)/∂t + Fϕ(r) ∂ϕ(r)/∂t, 81

Finally, using the continuity Eq. (75) identifies the source term of the resultant gradient information:

σI(r) = Σk ∇Fk(r)⋅Jk(r) + Fϕ(r)σϕ(r) ≡ G(r)⋅J(r) + Fϕ(r)σϕ(r). 82

The first term in the preceding equation is classical in character. As in irreversible thermodynamics [38], it combines products of regional affinities G(r) ≡ {Gk(r) = ∇Fk(r)}, the gradients of the local intensities F(r) ≡ {Fk(r)}, and the fluxes J(r) ≡ {Jk(r)} associated with the state parameters x(r) ≡ {xk(r)}. The former determine the information “forces” driving these conjugate flows,

Gk = ∂σI/∂Jk, 83

while the latter appear as information “responses” to these generalized perturbations:

Jk = ∂σI/∂Gk. 84

Due to the nonclassical (phase) contribution in the overall information measure, the rate of production of the resultant gradient information does not vanish for zero affinities:

σI(r)|G = 0 = Fϕ(r)σϕ(r). 85

However, the equilibrium source of the gradient information vanishes for the zero phase intensity, Fϕ(r) = 0, when ∇ϕ(r) = 0, e.g., in the stationary state of Eqs. (15) and (16), for p[ψs] = ps(r) and ϕ[ψs] = ϕs(t), when ∇ϕs(t) = 0 and Δϕs(t) = 0. In such eigenstates of the Hamiltonian operator one finds: {Jp[ψs] = 0, Jϕ[ψs] = Jϕ(t)}, {Fp[ψs] = Fpclass., Fϕ[ψs] = 0} and {Gp[ψs] = Gpclass., Gϕ[ψs] = 0}. The stationary state, corresponding to the sharply specified electronic energy of Eq. (16), thus exhibits a nonvanishing probability-intensity of Eq. (71),

Fp[ψs] = −4ΔRs/Rs = (8m/ħ²)(Es − v) or Gp[ψs] = −(8m/ħ²)∇v ≠ 0, 86

and zero values of the phase-intensity and affinity: Fϕ[ψs] = 0, Gϕ[ψs] = 0.

The vanishing production of the resultant gradient information in the stationary quantum states,

σI[ψs] = Gp[ψs]⋅Jp[ψs] + Gϕ[ψs]⋅Jϕ[ψs] + Fϕ[ψs]σϕ[ψs] = 0, 87

identifies them as the system information equilibria. Thus, the stationary wavefunctions of molecular QM represent the zero-production states of the overall gradient information. It should be observed, however, that contrary to the concept of equilibrium in irreversible thermodynamics [38], such IT equilibrium states do not correspond to vanishing affinities G = 0, since σI[ψs] vanishes due to Jp[ψs] = Gϕ[ψs] = 0 and Fϕ[ψs] = 0. A given displacement from this information equilibrium, specified by the applied “forces” G, triggers probability and phase flows, with each flux depending on all affinities and all intensities: Jk = Jk(G, F). One recalls that in ordinary thermodynamics [38] each flux depends most strongly on its own affinity and is also known to vanish when all affinities vanish, so the currents can be expanded in powers of the affinities with no constant term.
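Equation (86) can be made concrete for a specific stationary state. A minimal sketch, assuming the 1D harmonic-oscillator ground state with ħ = m = ω = 1 (so Es = 1/2 and v(x) = x²/2; an illustrative choice of system, not one treated in the paper):

```python
import numpy as np

x = np.linspace(-4.0, 4.0, 801)
p = np.exp(-x ** 2) / np.sqrt(np.pi)   # p = |psi_0|^2 for the oscillator ground state
dp = -2.0 * x * p                      # analytic grad p
d2p = (4.0 * x ** 2 - 2.0) * p         # analytic laplacian of p

# Classical probability intensity F_p = (grad p / p)^2 - 2 (lap p)/p ...
F_p = (dp / p) ** 2 - 2.0 * d2p / p
# ... should equal (8m/hbar^2)(E_s - v) of Eq. (86):
F_eq86 = 8.0 * (0.5 - x ** 2 / 2.0)
err = np.max(np.abs(F_p - F_eq86))

# The real eigenstate carries no current, so every flux term of the
# information source in Eq. (87) vanishes.
psi0 = np.sqrt(p)
j = np.imag(psi0 * np.gradient(psi0, x))   # Im(psi* grad psi) = 0 for real psi
```

Both expressions reduce to 4 − 4x² here, and the vanishing current confirms the zero-production character of the stationary state.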

The intensity and affinity concepts corresponding to the phase equilibrium of Eq. (61) are also of interest:

Fϕeq. = −8∇⋅(p∇ϕeq.) = 4Δp and Gϕeq. = ∇Fϕeq. = 4∇(Δp). 88

The former is thus related to Fick’s diffusion equation [28] [see Eq. (63)]:

(∂p/∂t)diff. = −∇⋅jeq.[p] = (ħ/2m)Δp ≡ DΔp, 89

which formally identifies the electron diffusion coefficient D = ħ/(2m) and the associated current for this migration:

jdiff. = −D∇p = jeq.[p]. 90
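The diffusion analogy of Eqs. (89)–(90) can be sketched by integrating Fick's equation with D = ħ/(2m). Assuming ħ = m = 1 (so D = 1/2) and an illustrative Gaussian initial profile (our choice), the variance should grow linearly, var(t) = var(0) + 2Dt:

```python
import numpy as np

D = 0.5                                  # D = hbar/(2m) with hbar = m = 1
x = np.linspace(-12.0, 12.0, 481)
dx = x[1] - x[0]
dt = 0.2 * dx ** 2 / D                   # well inside explicit-Euler stability
p = np.exp(-x ** 2 / 2.0) / np.sqrt(2.0 * np.pi)   # initial variance = 1

n_steps = int(round(0.5 / dt))           # integrate Fick's equation to t ~ 0.5
for _ in range(n_steps):
    lap = (np.roll(p, 1) - 2.0 * p + np.roll(p, -1)) / dx ** 2
    p += dt * D * lap                    # forward-Euler step of dp/dt = D lap p

var = np.sum(x ** 2 * p) / np.sum(p)     # variance of the spread profile
# Analytically, var(0.5) = 1 + 2*D*0.5 = 1.5 for pure diffusion.
```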

Dynamics of resultant entropy/information descriptors

Let us now reexamine the temporal evolution of the overall measures of the information and entropy content in a specified molecular quantum state |ψ(t)〉. One recalls that the average energy E[ψ(t)] of an isolated molecular system,

E[ψ(t)] = 〈E〉ψ(t) = 〈ψ(t)|Ĥ|ψ(t)〉 = T[ψ(t)] + V[ψ(t)] ≡ E(t), T[ψ(t)] = 〈ψ(t)|T̂|ψ(t)〉 ≡ 〈T〉ψ(t) ≡ T(t), V[ψ(t)] = 〈ψ(t)|v|ψ(t)〉 ≡ 〈V〉ψ(t) = ∫p(r, t)v(r)dr ≡ V(t), 91

remains conserved in time:

dE(t)/dt = (i/ħ)〈ψ(t)|[Ĥ, Ĥ]|ψ(t)〉 = 0. 92

One similarly explores time dependency of overall measures of the complementary entropy/information descriptors: the expectation values of the state complex entropy,

H[ψ(t)] = 〈H〉ψ(t) = 〈ψ(t)|Ŝ(t)|ψ(t)〉 = −2∫p(r, t) lnψ(r, t) dr = S[p(t)] + iS[ϕ(t)] ≡ H(t), 93

and its resultant gradient information:

I[ψ(t)] = 〈I〉ψ(t) = 〈ψ(t)|Î|ψ(t)〉 = I[p(t)] + I[ϕ(t)] = I[p(t)] + I[j(t)] ≡ I(t). 94

One first observes that a direct differentiation of the complex-entropy functional H(t) = ∫ψ*(r, t) Sψ(r, t)ψ(r, t) dr = ∫p(r, t) Sψ(r, t) dr gives:

σH(t) ≡ dH(t)/dt = ∫Sψ(r, t)[∂p(r, t)/∂t]dr + ∫p(r, t)[∂Sψ(r, t)/∂t]dr = 2∫lnψ(r, t) ∇⋅j(r, t) dr − 2∫ψ*(r, t)[∂ψ(r, t)/∂t]dr = ∫[lnp(r, t) + 2iϕ(r, t)] ∇⋅j(r, t) dr + (2i/ħ)∫ψ*(r, t)Ĥ(r)ψ(r, t)dr = (ħ/m)∫[lnp(r, t) + 2iϕ(r, t)][∇p(r, t)⋅∇ϕ(r, t) + p(r, t)Δϕ(r, t)]dr + 2iω(t), 95

where ω(t) = E(t)/ħ. Therefore, this complex derivative exhibits the following real and imaginary components:

Re[dH[p, ϕ]/dt] = dS[p]/dt = −∫∇(lnp)⋅j dr, Im[dH[p, ϕ]/dt] = dS[ϕ]/dt = 2[∫ϕ ∇⋅j dr + ω(t)]. 96

A reference to the real part dS[p]/dt directly shows that it has a nonclassical (current) origin, vanishing in the classical limit of ϕ = 0 or j = 0, when S[ϕ] = 0. Indeed, it is the probability current j that drives changes in the probability distribution [see Eq. (5b)]. Clearly, the imaginary part of the derivative results solely from the nonclassical phase component S[ϕ] of the complex entropy. One thus concludes that the time evolution of the entire complex entropy is of phase/current origin.
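The real part of Eq. (96) can be checked numerically. A sketch assuming ħ = m = 1 and the analytic free Gaussian packet (our illustrative solution, not an object from the paper), comparing the direct time derivative of the Shannon entropy with the flux integral −∫∇(ln p)⋅j dr:

```python
import numpy as np

x = np.linspace(-15.0, 15.0, 6001)
dx = x[1] - x[0]

def psi(t):
    # Free-particle Gaussian wavepacket, exact solution for hbar = m = 1.
    return np.pi ** (-0.25) * (1 + 1j * t) ** (-0.5) \
        * np.exp(-x ** 2 / (2.0 * (1 + 1j * t)))

def shannon(t):
    # Classical Shannon entropy S[p] = -Int p ln p dr.
    p = np.abs(psi(t)) ** 2
    return -np.sum(p * np.log(p)) * dx

t0, h = 1.0, 1e-4
dS_dt_num = (shannon(t0 + h) - shannon(t0 - h)) / (2 * h)   # direct dS[p]/dt

p = np.abs(psi(t0)) ** 2
j = np.imag(np.conj(psi(t0)) * np.gradient(psi(t0), x))     # probability current
dS_dt_flux = -np.sum(np.gradient(np.log(p), x) * j) * dx    # real part of Eq. (96)
# For this packet both derivatives equal t/(1 + t^2) = 0.5 at t = 1.
```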

This result can also be demonstrated via the differentiation of the expectation value H(t) = 〈ψ(t)|Ŝ(t)|ψ(t)〉 and by subsequently using SE and the probability continuity:

σH(t) ≡ dH(t)/dt = (i/ħ)〈ψ(t)|[Ĥ, Ŝ]|ψ(t)〉 + 〈ψ(t)|∂Ŝ/∂t|ψ(t)〉, 97

where [Ĥ, Ŝ] = [T̂, Ŝ] = (ħ²/m)[∇², lnψ] = (ħ²/m){∇⋅[∇, lnψ] + [∇, lnψ]⋅∇} and

[∇, lnψ] = ∇lnψ = ∇lnR + i∇ϕ = (1/2)∇lnp + i∇ϕ = R⁻¹∇R + i∇ϕ. 98

In the Schrödinger dynamical picture, the time change of the resultant gradient information, the operator of which does not depend on time explicitly, Î(r) = −4∇² = (8m/ħ²)T̂(r), results solely from the time dependence of the system state vector itself. Therefore, the time derivative of the average Fisher-type gradient (determinicity) information is generated by the expectation value of the commutator [Ĥ, Î] alone,

dI(t)/dt = (i/ħ)〈ψ(t)|[Ĥ, Î]|ψ(t)〉 ≡ (i/ħ)〈[Ĥ, Î]〉ψ(t), 99
[Ĥ, Î] = [v, Î] = 4[∇², v] = 4{∇⋅[∇, v] + [∇, v]⋅∇}, [∇, v] = ∇v, 100

and the integration by parts implies 〈ψ|∇χ〉 = −〈∇ψ|χ〉, i.e., ∇† = −∇. Hence, the time derivative of the overall gradient information reads:

σI(t) ≡ dI(t)/dt = (4i/ħ)[〈ψ|∇v⋅∇ψ〉 − 〈∇ψ⋅∇v|ψ〉] = −(8/ħ)Im〈ψ(t)|∇v⋅∇ψ(t)〉 = −(8/ħ)Im∫ψ*(r, t)∇v(r)⋅∇ψ(r, t)dr = −(8/ħ)∫p∇ϕ⋅∇v dr = −(8m/ħ²)∫j⋅∇v dr. 101

Again, this total time-derivative of the resultant gradient information is seen to be determined by the current content of the molecular electronic state. Therefore, it identically vanishes when the current density vanishes everywhere, i.e., for ∇ϕ(r) = 0, thus again confirming its nonclassical origin.
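Equation (101) implies that for a constant external potential, ∇v = 0, the resultant gradient information is conserved. A sketch for the free Gaussian packet (ħ = m = 1, our illustrative choice), evaluating I(t) = 4∫|∇ψ|² dr at two times:

```python
import numpy as np

x = np.linspace(-20.0, 20.0, 8001)
dx = x[1] - x[0]

def grad_info(t):
    # Resultant gradient information I = Int [(grad p)^2/p + 4 p (grad phi)^2] dr
    # = 4 Int |grad psi|^2 dr, for the free Gaussian packet (hbar = m = 1).
    psi = np.pi ** (-0.25) * (1 + 1j * t) ** (-0.5) \
        * np.exp(-x ** 2 / (2.0 * (1 + 1j * t)))
    g = np.gradient(psi, x)
    return 4.0 * np.sum(np.abs(g) ** 2) * dx

I0, I2 = grad_info(0.0), grad_info(2.0)
# Analytically I(t) = (8m/hbar^2) T = 2 at all times for this packet,
# since grad v = 0 makes the source of Eq. (101) vanish.
```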

Conclusions

The IT approach has proven its utility in a variety of molecular scenarios, e.g., [39–42]. In this analysis we examined mutual relations between densities of the classical and nonclassical components of the resultant information/entropy measures, combining the probability contributions of Fisher or Shannon and their associated phase/current supplements. For example, the complex (“vector”) entropy approach combines the classical (real) and nonclassical (imaginary) contributions due to the state probability (wavefunction modulus) and current (wavefunction phase), respectively. Such generalized entropic concepts allow one to distinguish the information content of states generating the same electron density but differing in their phase/current composition. They also allow a more precise information-theoretic description of the bonding status of molecular fragments. The IT principles using the resultant quantum descriptors of the entropy/information content in electronic states have also been used to determine the phase and information equilibria in molecules and their constituent parts [12–18]. The phase aspect of molecular states is also vital for the quantum (amplitude) communications between atoms in molecules [25, 42], which determine entropic descriptors of the chemical bond multiplicities and their covalent/ionic composition.

The need for the nonclassical (phase/current) supplements of the classical (probability) measures of the information content in molecular states has been stressed. The electron density distribution determines a static facet of the molecular structure, while the current distribution describes its dynamic aspect. Both these structural manifestations contribute to the overall information content of the generally complex electronic states of molecular systems, reflected by resultant IT concepts. The total time derivatives of such entropic descriptors of electronic states have been examined. These time dependencies have been established via the Schrödinger equation and the dynamics/continuity it implies for the classical and nonclassical degrees-of-freedom of complex wavefunctions. The nonclassical origin of the net temporal changes in the overall entropy/information quantities has been demonstrated. Thus, in real electronic states, exhibiting vanishing local phase and current, the time derivatives of the resultant gradient information and global entropy exactly vanish.

Although, for simplicity, we have assumed the one-electron case, the modulus (density) and phase (current) aspects of general electronic states can be similarly separated [2] using the Harriman-Zumbach-Maschke (HZM) construction [43, 44] of Slater determinants yielding the specified electron density. The present single-electron development can then be naturally generalized to many-electron states of atomic and molecular systems. One observes that in the HZM construction the common modulus part of N occupied (orthonormal) equidensity orbitals (EO), for the given ground-state density ρ, ψ[ρ] = {ψw[p]}, reflects the molecular probability distribution p(r) = ρ(r)/N, and so do the system optimum (orthonormal) Kohn-Sham (KS) [23] orbitals: φ[ρ] = {φt[p]}. In fact, the two sets constitute equivalent sets of spin-orbitals linked by a unitary transformation and generating identical Slater determinants, Ψ(N) ≡ det(ψ[ρ]) = det(φ[ρ]) ≡ Φ(N), and hence invariant overall entropy or information measures.

For example, the expectation value of the N-electron operator for the overall gradient information, Î(N) = Σi Î(i), {Î(i) = −4Δi}, reads:

I[Ψ(N)] = 〈Ψ(N)|Î(N)|Ψ(N)〉 = (8m/ħ²)〈Ψ(N)|T̂(N)|Ψ(N)〉 ≡ (8m/ħ²)Ts(N), 102

where Ts(N) stands for the kinetic energy of noninteracting electrons in the KS limit [23]. Thus, the amount of resultant gradient information in the occupied HZM orbitals derived from the optimum molecular distribution of electrons equals that contained in the orbitals describing the separable KS system corresponding to the same ground-state density. Notice, however, that the proportions between the classical (probability) and nonclassical (phase) information contributions vary with different partitions of the molecular density into orbital components.
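The invariance of the overall gradient information under a unitary transformation of the occupied orbitals, noted above, can be sketched numerically. The three orthonormal "orbitals" below (Gaussian-weighted Hermite polynomials) and the random unitary are our illustrative choices, not the HZM or KS sets themselves:

```python
import numpy as np

x = np.linspace(-8.0, 8.0, 1601)
dx = x[1] - x[0]
g = np.exp(-x ** 2 / 2.0)
raw = np.array([g, x * g, (2 * x ** 2 - 1) * g])   # mutually orthogonal by parity/Hermite structure
orbs = np.array([f / np.sqrt(np.sum(np.abs(f) ** 2) * dx) for f in raw])

def total_info(phi):
    # Overall gradient information: sum over orbitals of 4 Int |grad phi_i|^2 dx.
    return sum(4.0 * np.sum(np.abs(np.gradient(f, x)) ** 2) * dx for f in phi)

# Random unitary mixing (QR of a random complex matrix); the mixed set spans
# the same occupied space, i.e., the same determinant up to a phase.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
U, _ = np.linalg.qr(A)
mixed = U @ orbs

I_before, I_after = total_info(orbs), total_info(mixed)
```

Since Σi U*ij Uik = δjk, the summed gradient-information density is pointwise invariant, so `I_before` and `I_after` agree to rounding error.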

In the DFT-based theory of chemical reactivity one distinguishes between several hypothetical stages involving either the mutually bonded (entangled) or nonbonded (disentangled) states of reactants for the same electron distribution in constituent subsystems. These two categories are discerned by the phase aspect of the quantum entanglement between such molecular fragments, e.g., [2, 27]. We have identified the classical entropic descriptor of this phenomenon, the nonadditive global entropy, which has been interpreted as the partition additive entropy-deficiency measuring the average information-distance between the fragment and molecular densities. The equilibrium phases and currents of reactants can be related to the relevant electron densities using the entropic principles of the quantum IT. This generalized approach deepens our understanding of the molecular/promolecular promotions of the constituent molecular fragments and provides a more precise framework for describing the hypothetical stages invoked in the theory of the chemical bond and reactivity.

The phenomenological apparatus of irreversible thermodynamics [38] also provides an attractive basis for an entropic representation of elementary molecular processes [2]. In this analysis we approached anew the problem of productions of the overall measures of information/entropy, which take into account both the modulus and phase components of complex wavefunctions. We introduced the relevant intensity and affinity conjugates of both the probability and phase fluxes, which together define a local production of the state information content. The nonvanishing local phase source has been identified, giving rise to the nonclassical contribution in the local production of the resultant entropy/information content. It has been argued that the criterion of the vanishing production of the gradient information identifies the stationary states of molecular QM as the system information equilibria. The local information-source has also been interpreted “thermodynamically”, by separating a classical summation over products of affinities (“perturbations”) and fluxes (“responses”) associated with the probability and phase/current degrees-of-freedom of molecular states. Since spontaneous flows driven by displacements in the given information affinity should act in a direction to restore the equilibrium, these elementary products should be negative; thus, decreasing the state information (determinicity) level, and hence increasing the state entropy (uncertainty, indeterminicity) content. This suggests a positive entropy production, and hence a negative information source.

To conclude this analysis, let us briefly comment on the resultant entropy concepts containing an explicit phase contribution [Eqs. (24) and (29)], the Shannon entropy of the electron probability distribution [Eq. (19)], and von Neumann’s (vN) ensemble-average entropy [45] contained in the density operator. The latter is defined as the trace involving the density operator ρ̂ of the statistical mixture in question,

SvN[ρ̂] = −tr(ρ̂ lnρ̂), 103

expressed in terms of its eigenvectors {|ψj〉} and eigenvalues (probabilities) {ηj}:

ρ̂|ψi〉 = ηi|ψi〉, ρ̂ = Σj |ψj〉ηj〈ψj|. 104

It generates the information entropy contained in the ensemble state probabilities:

SvN[ρ̂] = −Σj ηj lnηj. 105

This measure identically vanishes for the pure quantum state |ψ〉, when ρ^ψ = |ψ〉 〈ψ| and ηψ = 1: SvN[ρ^ψ] = 0. The (idempotent) density operator ρ^ψ then determines the Hermitian density matrix in position representation,

γ(r, r′) = 〈r|ρ̂ψ|r′〉 = ψ(r)ψ*(r′), γ(r, r) = p(r), 106

in terms of which the Shannon entropy S[p] contained in the probability density p(r) reads:

S[p] = S[ρ̂ψ] = −∫dr∫dr′ γ(r, r′)δ(r′ − r) lnγ(r′, r) ≡ tr(ρ̂ψŜ[γ]). 107

Indeed, in this pure-state case the above “ensemble” average measure reduces to the expectation value in state |ψ〉, S[p] = 〈ψ|S^ψclass.|ψ〉, of the classical (Hermitian) entropy operator [see Eq. (22)],

Ŝψclass. = (1/2)[Ŝψ + Ŝψ*] = ReŜψ = −lnp. 108

Therefore, in the familiar Shannon entropy of classical IT, which reconstructs the ensemble-average measure of von Neumann’s quantum entropy in the density matrix, the phase/current information terms of the complex entropies S[ψ] and S[ψ*] = S[ψ]* cancel out, as indeed expected of the expectation value of the Hermitian operator Ŝψclass.. One recalls that in QM one represents physical properties by the associated (linear) Hermitian operators. However, the information entropy is neither an observable, determined in an experiment, nor is it linear in the underlying probability argument. Therefore, attributing a non-Hermitian operator to the overall quantum-entropy content of the specified quantum state is an admissible, workable proposition, capable of a unique phase characterization of the entangled molecular subsystems [31].
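The cancellation described above can be checked directly. A minimal sketch for an arbitrary illustrative state ψ = √p exp(iϕ) with a bounded phase (our choice, kept on the principal branch of the complex logarithm): the real part of 〈Ŝψ〉 = 〈−2lnψ〉 reproduces the Shannon entropy S[p], while the phase enters only the imaginary part, −2∫pϕ dr:

```python
import numpy as np

x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]
p = np.exp(-x ** 2) / np.sqrt(np.pi)     # probability density (modulus part)
phi = 0.5 * np.sin(x)                    # bounded illustrative phase, |phi| <= 0.5
psi = np.sqrt(p) * np.exp(1j * phi)

H = np.sum(p * (-2.0) * np.log(psi)) * dx      # complex entropy <S_psi> = <-2 ln psi>
S_shannon = -np.sum(p * np.log(p)) * dx        # classical Shannon entropy S[p]
S_phase = -2.0 * np.sum(p * phi) * dx          # nonclassical phase entropy S[phi]
```

Since −2lnψ = −lnp − 2iϕ, `H.real` matches `S_shannon` and `H.imag` matches `S_phase` to rounding error.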

Footnotes

This paper belongs to Topical Collection International Conference on Systems and Processes in Physics, Chemistry and Biology (ICSPPCB-2018) in honor of Professor Pratim K. Chattaraj on his sixtieth birthday

The following notation is adopted: A denotes a scalar, A a row or column vector, A a square or rectangular matrix, and the hatted symbol Â stands for the quantum-mechanical operator of the physical property A. The logarithm of the Shannon information measure is taken to an arbitrary but fixed base: log = log2 corresponds to the information content measured in bits (binary digits), while log = ln expresses the amount of information in nats (natural units): 1 nat ≈ 1.44 bits.

References

  • 1.Prigogine I. From being to becoming: time and complexity in the physical sciences. San Francisco: Freeman; 1980. [Google Scholar]
  • 2.Nalewajski RF. Quantum information theory of molecular states. New York: Nova; 2016. [Google Scholar]
  • 3.Nalewajski RF. Information theory of molecular systems. Amsterdam: Elsevier; 2006. [Google Scholar]
  • 4.Nalewajski RF. Information origins of the chemical bond. New York: Nova; 2010. [Google Scholar]
  • 5.Nalewajski RF. Perspectives in electronic structure theory. Heidelberg: Springer; 2012. [Google Scholar]
  • 6.Fisher RA. Theory of statistical estimation. Proc Cambridge Phil Soc. 1925;22:700–725. doi: 10.1017/S0305004100009580. [DOI] [Google Scholar]
  • 7.Frieden BR. Physics from the Fisher information – a unification. Cambridge: Cambridge University Press; 2004. [Google Scholar]
  • 8.Shannon CE. The mathematical theory of communication. Bell System Tech J. 1948;27:379–423. doi: 10.1002/j.1538-7305.1948.tb01338.x. [DOI] [Google Scholar]
  • 9.Shannon CE, Weaver W. The mathematical theory of communication. Urbana: University of Illinois; 1949. [Google Scholar]
  • 10.Kullback S, Leibler RA. On information and sufficiency. Ann. Math. Stat. 1951;22:79–86. doi: 10.1214/aoms/1177729694. [DOI] [Google Scholar]
  • 11.Kullback S. Information theory and statistics. New York: Wiley; 1959. [Google Scholar]
  • 12.Abramson N. Information theory and coding. New York: McGraw-Hill; 1963. [Google Scholar]
  • 13.Pfeifer PE. Concepts of probability theory. New York: Dover; 1978. [Google Scholar]
  • 14.Nalewajski RF. Exploring molecular equilibria using quantum information measures. Ann Phys. 2013;525:256–268. doi: 10.1002/andp.201200230. [DOI] [Google Scholar]
  • 15.Nalewajski RF. On phase equilibria in molecules. J. Math. Chem. 2014;52:588–612. doi: 10.1007/s10910-013-0280-2. [DOI] [Google Scholar]
  • 16.Nalewajski RF. Quantum information approach to electronic equilibria: molecular fragments and elements of non-equilibrium thermodynamic description. J. Math. Chem. 2014;52:1921–1948. doi: 10.1007/s10910-014-0357-6. [DOI] [Google Scholar]
  • 17.Nalewajski RF. On phase/current components of entropy/information descriptors of molecular states. Mol. Phys. 2014;112:2587–2601. doi: 10.1080/00268976.2014.897394. [DOI] [Google Scholar]
  • 18.Nalewajski RF. On entropy/information continuity in molecular electronic states. Mol. Phys. 2016;114:1225–1235. doi: 10.1080/00268976.2015.1093182. [DOI] [Google Scholar]
  • 19.Nalewajski RF. Phase/current information descriptors and equilibrium states in molecules. Int. J. Quantum Chem. 2015;115:1274–1288. doi: 10.1002/qua.24750. [DOI] [Google Scholar]
  • 20.Nalewajski RF. Quantum information measures and molecular phase equilibria. In: Baswell AR, editor. Advances in mathematics research vol 19. New York: Nova; 2015. pp. 53–86. [Google Scholar]
  • 21.Nalewajski RF. Quantum information descriptors in position and momentum spaces. J. Math. Chem. 2015;53:1549–1575. doi: 10.1007/s10910-015-0505-7. [DOI] [Google Scholar]
  • 22.Hohenberg P, Kohn W. Inhomogeneous electron gas. Phys. Rev. 1964;136:B864–B871. doi: 10.1103/PhysRev.136.B864. [DOI] [Google Scholar]
  • 23.Kohn W, Sham LJ. Self-consistent equations including exchange and correlation effects. Phys. Rev. 1965;140A:1133–1138. doi: 10.1103/PhysRev.140.A1133. [DOI] [Google Scholar]
  • 24.Levy M. Universal variational functionals of electron densities, first-order density matrices, and natural spin-orbitals and solution of the v-representability problem. Proc Natl Acad Sci U S A. 1979;76:6062–6065. doi: 10.1073/pnas.76.12.6062. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 25.Nalewajski RF. Complex entropy and resultant information measures. J. Math. Chem. 2016;54:1777–1782. doi: 10.1007/s10910-016-0651-6. [DOI] [Google Scholar]
  • 26.Nalewajski RF. On entropy-continuity descriptors in molecular equilibrium states. J. Math. Chem. 2016;54:932–954. doi: 10.1007/s10910-016-0595-x. [DOI] [Google Scholar]
  • 27.Nalewajski RF. Phase description of reactive systems. In: Islam N, Kaya S, editors. Conceptual density functional theory and its application in the chemical domain. Waretown: Apple Academic; 2018. pp. 217–249. [Google Scholar]
  • 28.Nalewajski RF. Entropy continuity, electron diffusion and fragment entanglement in equilibrium states. In: Baswell AR, editor. Advances in mathematics research. New York: Nova; 2017. pp. 1–42. [Google Scholar]
  • 29.Nalewajski RF (2017) Chemical reactivity description in density-functional and information theories. In: Liu S (ed) Chemical concepts from density functional theory. Acta Phys-Chim Sin 33:2491–2509
  • 30.Primas H. Chemistry, quantum mechanics and reductionism. Berlin: Springer; 1981. [Google Scholar]
  • 31.Nalewajski RF. On entangled states of molecular fragments. Trends Phys Chem. 2016;16:71–85. [Google Scholar]
  • 32.von Weizsäcker CF. Zur theorie der kernmassen. Z. Phys. 1935;96:431–458. doi: 10.1007/BF01337700. [DOI] [Google Scholar]
  • 33.Nalewajski RF. Use of fisher information in quantum chemistry. Int. J. Quantum Chem. 2008;108:2230–2252. doi: 10.1002/qua.21752. [DOI] [Google Scholar]
  • 34.Nalewajski RF, Parr RG. Information theory, atoms-in-molecules and molecular similarity. Proc. Natl. Acad. Sci. U. S. A. 2000;97:8879–8882. doi: 10.1073/pnas.97.16.8879. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 35.Nalewajski RF. Hirschfeld analysis of molecular densities: subsystem probabilities and charge sensitivities. Phys. Chem. Chem. Phys. 2002;4:1710–1721. doi: 10.1039/b107158k. [DOI] [Google Scholar]
  • 36.Parr RG, Ayers PW, Nalewajski RF. What is an atom in a molecule? J. Phys. Chem. A. 2005;109:3957–3959. doi: 10.1021/jp0404596. [DOI] [PubMed] [Google Scholar]
  • 37.Hirshfeld FL. Bonded-atom fragments for describing molecular charge densities. Theoret Chim Acta (Berl) 1977;44:129–138. doi: 10.1007/BF00549096. [DOI] [Google Scholar]
  • 38.Callen HB. Thermodynamics: an introduction to the physical theories of equilibrium thermostatics and irreversible thermodynamics. New York: Wiley; 1960. [Google Scholar]
  • 39.Heidar-Zadeh F, Ayers PW, Verstraelen T, Vinogradov I, Vöhringer-Martinez E, Bultinck P. Information-theoretic approaches to atoms-in-molecules: Hirshfeld family of partitioning schemes. J. Phys. Chem. A. 2018;122:4219–4245. doi: 10.1021/acs.jpca.7b08966. [DOI] [PubMed] [Google Scholar]
  • 40.Liu S, Rong C, Lu T. Information conservation principle determines electrophilicity, nucleophilicity and regioselectivity. J. Phys. Chem. A. 2014;118:3698–3704. doi: 10.1021/jp5032702. [DOI] [PubMed] [Google Scholar]
  • 41.Zhou XY, Rong CY, Lu T, Zhou PP, Liu SB. Information functional theory: electronic properties as functionals of information for atoms and molecules. J. Phys. Chem. A. 2016;120:3634–3642. doi: 10.1021/acs.jpca.6b01197. [DOI] [PubMed] [Google Scholar]
  • 42.Nalewajski RF. Electron communications and chemical bonds. In: Wójcik M, Nakatsuji H, Kirtman B, Ozaki Y, editors. Frontiers of quantum chemistry. Singapore: Springer; 2017. pp. 315–351. [Google Scholar]
  • 43.Harriman JE. Orthonormal orbitals for the representation of an arbitrary density. Phys. Rev. A. 1981;24:680–682. doi: 10.1103/PhysRevA.24.680. [DOI] [Google Scholar]
  • 44.Zumbach G, Maschke K (1983) New approach to the calculation of density functionals. Phys Rev A 28:544–554 Erratum (1984) Phys Rev A 29:1585–1587
  • 45.von Neumann J. Mathematical foundations of quantum mechanics. Princeton: Princeton University Press; 1955. [Google Scholar]

Articles from Journal of Molecular Modeling are provided here courtesy of Springer
