Entropy. 2018 May 23;20(6):394. doi: 10.3390/e20060394

State Entropy and Differentiation Phenomenon

Masanari Asano 1,*, Irina Basieva 2, Emmanuel M Pothos 2, Andrei Khrennikov 3,4
PMCID: PMC7512914  PMID: 33265484

Abstract

In the formalism of quantum theory, a state of a system is represented by a density operator. Mathematically, a density operator can be decomposed into a weighted sum of (projection) operators representing an ensemble of pure states (a state distribution), but such a decomposition is not unique. Various pure-state distributions are mathematically described by the same density operator. These distributions are categorized into classical ones, obtained from the Schatten decomposition, and other, non-classical, ones. In this paper, we define a quantity called the state entropy. It can be considered as a generalization of the von Neumann entropy evaluating the diversity of states constituting a distribution. Further, we apply the state entropy to the analysis of non-classical states created at the intermediate stages of the process of quantum measurement. To do this, we employ the model of differentiation, where a system experiences step-by-step state transitions under the influence of environmental factors. This approach can be used for modeling various natural and mental phenomena: cell differentiation, evolution of biological populations, and decision making.

Keywords: density operator, state entropy, von Neumann entropy, quantum measurement, differentiation

1. Introduction

In quantum theory, a state of a system is represented by a density operator. A density operator, e.g., ρ, can be decomposed into a weighted sum of (projection) operators representing “pure states”. This linear combination represents a statistical distribution of pure states in an ensemble of systems. However, the same density operator ρ can be decomposed in various ways. Hence, numerous statistical state distributions are mathematically encoded by the same ρ, unless ρ coincides with a pure state.

One class of these statistical distributions, namely, those obtained from "Schatten decompositions" of ρ, plays a special role. We remark that, for a density operator with a degenerate spectrum, the Schatten decomposition is not unique. Any selection of orthogonal bases in the eigensubspaces of ρ generates some Schatten decomposition. Each Schatten decomposition corresponds to the statistical distribution of eigenstates of ρ. The crucial point is that these eigenstates can be distinguished on the basis of measurement of some physical quantity X, because these states are orthogonal to each other. The eigenvalues are interpreted as the frequency probabilities of the measurement outcomes. In this sense, the distribution corresponding to a concrete Schatten decomposition of the density operator ρ is conceptually equivalent to a "classical" or "standard" probability distribution.

On the other hand, other decompositions of the same state ρ are “non-classical” or “non-standard” and represent ensembles of pure states which may be not orthogonal to each other. In Section 2, we discuss these points in more detail.

The main topic of this paper is a quantity that evaluates structural features of the various statistical state distributions encoded in the same density operator ρ. It is well known that the von Neumann entropy [1,2], defined as $-\mathrm{Tr}(\rho\log\rho)$, evaluates how far ρ deviates from a pure state, i.e., the degree of mixture of pure states. In fact, $-\mathrm{Tr}(\rho\log\rho)$ can be rewritten as $-\sum_k\lambda_k\log\lambda_k$, where $\{\lambda_k\}$ are the eigenvalues of ρ. It equals zero if and only if ρ is a pure state. Note that the quantity $-\sum_k\lambda_k\log\lambda_k$ is the Shannon entropy of the classical probability distribution $\{\lambda_k\}$. Thus, the von Neumann entropy evaluates only the classical distribution encoded in ρ, and not the non-classical ones.

In this paper, we define a quantity that reflects more detailed information about the structure of statistical state distributions, especially non-classical ones. Our discussion is fundamental, but straightforward. First, in Section 3, we describe the "differentiation phenomenon" which an ensemble of pure states experiences under a quantum measurement of some physical observable, say X. Each pure state is stochastically differentiated into an eigenstate of X. If the pure states in the statistical ensemble are different, the expectation values of X estimated from each of them are also generally different. In Section 4, we focus on the dispersion of these expectation values and discuss its mathematical properties, which reflect structural features of the state distribution. Finally, in Section 5, we define a "state entropy" (see Equation (17)). This quantity evaluates the "diversity" of pure states constituting an ensemble. It grows with the number of pure states and decreases with the degree of similarity among them.

We also point to the interrelation between the state entropy and the von Neumann entropy. It can be briefly described as follows. If a state distribution encoded in ρ is classical, then its state entropy equals the von Neumann entropy. The state entropies of non-classical state distributions do not exceed the latter; see the inequality in Equation (18). Thus, the state entropy is a generalization of the von Neumann entropy, which is extensively used in different types of quantum entropies, e.g., the conditional, relative, and mutual entropies [3,4,5].

The state entropy evaluates non-classical statistical state distributions. To stress the significance of the notion of state entropy, we explain the theoretical context of state distributions. We note that classical state distributions are always identified after the completion of quantum measurements. Therefore, non-classical distributions may exist at stages before measurements are completed and, more generally, in the process of differentiation.

In Section 6, we focus on the model of differentiation that was discussed in Reference [6]. This model describes the accumulation of very small state transitions experienced by the system; each transition is mathematically represented by a map on the state space, i.e., by a "quantum channel" in the terminology of quantum information theory. The quantum channel, denoted by Λ and given by Equation (28), is determined by the "environmental elements" around the system. They interact weakly with the system, causing numerous small state transitions, step by step, as differentiations of states occur sequentially. This picture corresponds to ideal "open quantum system dynamics". To describe the process of differentiation in the system, we consider a more complicated model, assuming differentiation not only of the system state, but also in the elements of the environment. The differentiation in each environmental element is similar to the determination of a "pointer basis" in the theory of quantum decoherence proposed by Zurek [7]. In our approach, the Lindblad equation [8,9], which is the traditional way to describe open quantum system dynamics, is not employed directly.

We believe that the described model can be applicable to a variety of natural and mental phenomena (not only in the micro-world). The process of creation of a diversity of states in an ensemble of systems, which were originally prepared in the same pure state Ψ, through mutual interaction with environmental factors is universal. Originally, the formalism of quantum theory was established to describe microscopic phenomena, but now it is widely used in psychology, decision making, and finance (see [10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38]). It is also applied to model behavior of biological systems, especially the functioning of genetic and epigenetic systems (see [39,40,41,42,43,44,45,46]). We plan to explore the novel mathematical apparatus developed in this paper (based on the state entropy) for such applications elsewhere.

In psychology, there has been extensive interest in employing classical entropy for quantifying uncertainty, e.g., in decision making (entropy minimization was used to model decision biases in [47]), categorization (as a way to formalize intuitions in spontaneous grouping [48]), and learning [49,50]. We plan to apply the apparatus of the quantum state entropy to these problems.

As shown in Figure 1, the accumulation of transitions generated by channel Λ represents an ideal differentiation process realized in the system. Further, in this modeling, non-classical state distributions in the intermediate stages are identified (see Equations (29)–(31)). We analyze them by means of the state entropy (see Figure 2 and Figure 3).

Figure 1. Histograms of the population rates of states with $\frac{l-1}{20}<|\langle\psi_1|\Psi_{\{i_1,i_2,\dots,i_n\}}\rangle|^{2}\le\frac{l}{20}$ ($l=1,2,\dots,20$) for $n=0,10,100,500$ and $2000$. The parameters are set to $M=L=2$, $|\Psi\rangle=\sqrt{0.7}\,|\psi_1\rangle+\sqrt{0.3}\,|\psi_2\rangle$ ($P_1=0.7$, $P_2=0.3$), $\nu_{1|1}=0.5$ ($\nu_{2|1}=0.5$) and $\nu_{1|2}=0.45$ ($\nu_{2|2}=0.55$). If $\Psi_{\{i_1,i_2,\dots,i_n\}}\approx\psi_1$ ($\psi_2$), then $|\langle\psi_1|\Psi_{\{i_1,i_2,\dots,i_n\}}\rangle|^{2}$ takes a value near 1 (0). With increasing $n$, the state distribution approaches $\{\{\psi_1,\psi_2\},\{0.7,0.3\}\}$.

Figure 2. Behavior of the state entropy, the von Neumann entropy, and $-\log(\mathrm{Tr}(\rho^{2}))$ for the parameters $M=L=2$, $|\Psi\rangle=\sqrt{0.7}\,|\psi_1\rangle+\sqrt{0.3}\,|\psi_2\rangle$ ($P_1=0.7$, $P_2=0.3$), $\nu_{1|1}=0.5$ ($\nu_{2|1}=0.5$) and $\nu_{1|2}=0.45$ ($\nu_{2|2}=0.55$).

Figure 3. Difference between the von Neumann entropy and the state entropy.

2. State Representation by Density Operator

If a physical quantity X is measurable in a system, the frequency probabilities $\{P(x)\}$ for the observed values $\{x\}$ may be estimated. Then, the quantity X is a "stochastic variable" in terms of probability theory, and the distribution $\{P(x)\}$ is a "state of the system" which can be analyzed, e.g., by calculating the expectation value $E(X)$ or the dispersion $V(X)=E(X^{2})-(E(X))^{2}$, as is usual in statistics.

The mathematical framework of quantum theory includes probability theory, where classical concepts of stochastic variables and probability distribution are expanded using the notion of “operator”. Firstly, a physical quantity is defined in the form of

$X=\sum_{k=1}^{M}x_k\,|x_k\rangle\langle x_k|.$ (1)

This is a Hermitian operator in the Hilbert space $\mathcal{H}=\mathbb{C}^{M}$ with real eigenvalues $x_k\in\mathbb{R}$ ($k=1,2,\dots,M$) and eigenvectors $\{|x_k\rangle\}$. (A vector $|x\rangle\in\mathcal{H}$ whose norm is 1 is called a ket-vector, and $\langle x|$, the Hermitian conjugate of $|x\rangle$, i.e., $\langle x|=|x\rangle^{\dagger}$, is called a bra-vector.) The form of Equation (1) implies that, after a non-degenerate value $x_k$ is observed, the system under measurement has the definite (pure) state represented by the operator $|x_k\rangle\langle x_k|$. Note that the trace of the product of $X$ and $|x_k\rangle\langle x_k|$ is equal to $x_k$:

$\mathrm{Tr}(X|x_k\rangle\langle x_k|)=\langle x_k|X|x_k\rangle=x_k.$

For this calculation, the orthogonality of the vectors, i.e., $\langle x_k|x_{k'}\rangle=0$ if $k\neq k'$, is used. Next, using the pure states $\{|x_k\rangle\langle x_k|\}$, let us construct the operator

$\rho=\sum_{k=1}^{M}P(x_k)\,|x_k\rangle\langle x_k|,$ (2)

where $\{P(x_k)\}$ are the frequency probabilities of the observed values $\{x_k\}$; in fact, the trace of $X\rho$ is equal to the expected value $E(X)$:

$\mathrm{Tr}(X\rho)=E(X).$ (3)

Mathematically, ρ is a Hermitian matrix satisfying $\mathrm{Tr}(\rho)=1$ and $\langle x|\rho|x\rangle\ge 0$ for all $|x\rangle\in\mathcal{H}=\mathbb{C}^{M}$. Such an operator is called a density operator and is used for representing a statistical mixture of pure states (a mixed state). A density operator may be given in the form of the Schatten decomposition, i.e., represented as a diagonal matrix:

$\sum_{k=1}^{M}\lambda_k\,|\phi_k\rangle\langle\phi_k|,$ (4)

where $\{\lambda_k\ge 0\}$ are the eigenvalues of the matrix (the same as the probabilities $\{P(x_k)\}$ of ρ), and $\{|\phi_k\rangle\in\mathcal{H}=\mathbb{C}^{M}\}$ are the corresponding eigenvectors (the same as the $|x_k\rangle$ of ρ). From Equation (4), one obtains the picture of a statistical mixture of $\{|\phi_k\rangle\langle\phi_k|\}$. (This mixture is denoted by $\{\phi_k,\lambda_k\}$ hereafter.) As can be seen from the construction of ρ in Equation (2), giving a Schatten decomposition is conceptually equivalent to giving a probability distribution for the measurement of some physical quantity. In this sense, the state distribution $\{\phi_k,\lambda_k\}$ is "classical". We have to point out here that the decomposition of a density operator is, in general, not unique: By considering various linear combinations of $\{|\phi_k\rangle\}$, one can find a set of vectors $\{|\Psi_i\rangle,\ i=1,\dots,N\}$ which satisfies

$\sum_{k=1}^{M}\lambda_k\,|\phi_k\rangle\langle\phi_k|=\sum_{i=1}^{N}P_i\,|\Psi_i\rangle\langle\Psi_i|,\qquad \sum_{i=1}^{N}P_i=1.$ (5)

Note that $N\ge M$ and the vectors $\{|\Psi_i\rangle\in\mathcal{H}=\mathbb{C}^{M}\}$ need not be orthogonal to each other, that is, they need not be eigenstates of a single physical quantity: the state distribution $\{\Psi_i,P_i\}$ is "non-classical". There exist numerous state distributions corresponding to the same density operator, other than $\{\phi_k,\lambda_k\}$ and $\{\Psi_i,P_i\}$, and they are non-classical.
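To make this non-uniqueness concrete, the following short sketch (our own Python/NumPy illustration, not part of the original text; the weights 0.75/0.25 and the particular vectors are arbitrary choices) builds one and the same density operator both as a classical (Schatten-type) mixture of orthogonal states and as a mixture of two non-orthogonal pure states.

```python
import numpy as np

def proj(v):
    """Projector |v><v| onto a normalized vector v."""
    v = v / np.linalg.norm(v)
    return np.outer(v, v.conj())

# Orthonormal basis of H = C^2.
phi1 = np.array([1.0, 0.0])
phi2 = np.array([0.0, 1.0])

# Classical (Schatten) mixture: rho = 0.75 |phi1><phi1| + 0.25 |phi2><phi2|.
rho_classical = 0.75 * proj(phi1) + 0.25 * proj(phi2)

# Non-classical mixture of two non-orthogonal states chosen so that the same
# rho is reproduced: Psi_{1,2} = sqrt(0.75) phi1 +/- sqrt(0.25) phi2, each with weight 1/2.
Psi1 = np.sqrt(0.75) * phi1 + np.sqrt(0.25) * phi2
Psi2 = np.sqrt(0.75) * phi1 - np.sqrt(0.25) * phi2
rho_nonclassical = 0.5 * proj(Psi1) + 0.5 * proj(Psi2)

print(np.allclose(rho_classical, rho_nonclassical))   # True: same density operator
print(abs(np.vdot(Psi1, Psi2)))                       # 0.5: the states are not orthogonal
```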

3. Differentiation Phenomenon in Quantum Measurement Process

As shown in Equation (3), for the density operator

$\rho=\sum_{k=1}^{M}P(x_k)\,|x_k\rangle\langle x_k|,$

where $\{|x_k\rangle\langle x_k|\}$ are the eigenstates of X, the relation $\mathrm{Tr}(X\rho)=\sum_{k=1}^{M}P(x_k)\,x_k=E(X)$ is satisfied. In this section, noting the non-uniqueness of the decomposition of a density operator, we discuss a meaning of $\mathrm{Tr}(X\rho)$ that has not been discussed in the classical theory. Let us consider a different decomposition, $\rho=\sum_{i=1}^{N}P_i|\Psi_i\rangle\langle\Psi_i|$, that is, we assume the existence of a non-classical state distribution $\{\Psi_i,P_i\}$. Then, $\mathrm{Tr}(X\rho)$ is described as the statistical average of the averages $\{\langle X\rangle_{\Psi_i}=\mathrm{Tr}(X|\Psi_i\rangle\langle\Psi_i|)\}$ of the observable X with respect to the pure states $\{\Psi_i\}$:

$\mathrm{Tr}(X\rho)=\sum_{i=1}^{N}P_i\,\langle X\rangle_{\Psi_i}.$ (6)

Each term, e.g., $\langle X\rangle_{\Psi}$, in the above is expanded as

$\langle X\rangle_{\Psi}=\sum_{k=1}^{M}x_k\,|\langle\Psi|x_k\rangle|^{2}.$ (7)

(Here $\sum_{k=1}^{M}|\langle\Psi|x_k\rangle|^{2}=1$ is satisfied.) The squared inner product $|\langle\Psi|x_k\rangle|^{2}$ is frequently called a "transition probability". It is related to the problem of measurement that has been discussed in quantum theory. In the concept of quantum measurement, the existence of a measurement device is considered first, because it is assumed that some interaction between the device and the system realizes the measurement of a physical quantity. Due to this interaction, the initial state of the system $|\Psi\rangle\langle\Psi|$ is transferred to one of $\{|x_k\rangle\langle x_k|\}$, and the values $\{x_k\}$ can be read out from the device. If $\langle X\rangle_{\Psi}=\sum_{k=1}^{M}x_k|\langle\Psi|x_k\rangle|^{2}$ is the average of the outputs, the value of $|\langle\Psi|x_k\rangle|^{2}$ corresponds to the probability of the transition from $|\Psi\rangle\langle\Psi|$ to $|x_k\rangle\langle x_k|$.

We interpret the process of quantum measurement as a sort of "differentiation", in which a group of systems in one initial state is divided into groups having different states by means of external or environmental factors. The expectation value $\langle X\rangle_{\Psi_i}$ comes from one differentiation, denoted by $\Psi_i\to\{x_k\}$, and the value of $\mathrm{Tr}(X\rho)=\sum_{i=1}^{N}P_i\langle X\rangle_{\Psi_i}$ is calculated by supposing a statistical mixture of N kinds of differentiations, $\{\Psi_i\to\{x_k\}\}$ ($i=1,\dots,N$).
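Equations (6) and (7) can be checked numerically. The sketch below (our own illustration; the observable, the pure states, and the weights are arbitrary choices) computes each $\langle X\rangle_{\Psi_i}$ from the transition probabilities and verifies that their weighted average reproduces $\mathrm{Tr}(X\rho)$.

```python
import numpy as np

def proj(v):
    v = v / np.linalg.norm(v)
    return np.outer(v, v.conj())

# Observable X = sum_k x_k |x_k><x_k| on H = C^2; eigenvalues and eigenvectors
# are arbitrary choices for this illustration.
x_vals = np.array([1.0, -1.0])
x_vecs = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
X = sum(x * proj(v) for x, v in zip(x_vals, x_vecs))

def expectation(psi):
    """Equation (7): <X>_psi = sum_k x_k |<psi|x_k>|^2."""
    psi = psi / np.linalg.norm(psi)
    return sum(x * abs(np.vdot(psi, v))**2 for x, v in zip(x_vals, x_vecs))

# An arbitrary non-classical state distribution {Psi_i, P_i}.
Psis = [np.array([1.0, 1.0]) / np.sqrt(2), np.array([1.0, 0.0])]
Ps = [0.4, 0.6]
rho = sum(P * proj(psi) for P, psi in zip(Ps, Psis))

lhs = np.real(np.trace(X @ rho))                             # Tr(X rho)
rhs = sum(P * expectation(psi) for P, psi in zip(Ps, Psis))  # Equation (6)
print(np.isclose(lhs, rhs))   # True
```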

4. Characteristic Quantity of State Distribution

We assume that a definite state distribution denoted by $\{\Psi_i,P_i\}$ is given, and that the calculations of $\{\langle X\rangle_{\Psi_i}\}$ are possible. The average of $\{\langle X\rangle_{\Psi_i}\}$, i.e., $\sum_{i=1}^{N}P_i\langle X\rangle_{\Psi_i}=\mathrm{Tr}(X\rho)$, depends only on the density operator ρ, in which $\{\Psi_i,P_i\}$ is encoded. A statistical quantity reflecting more detailed information on the structure of $\{\Psi_i,P_i\}$ is the dispersion of $\{\langle X\rangle_{\Psi_i}\}$, formulated as

$V(\{\langle X\rangle_{\Psi_i},P_i\})=\sum_{i=1}^{N}P_i\langle X\rangle_{\Psi_i}^{2}-\Big(\sum_{i=1}^{N}P_i\langle X\rangle_{\Psi_i}\Big)^{2}=\sum_{i=1}^{N}P_i\langle X\rangle_{\Psi_i}^{2}-\mathrm{Tr}(X\rho)^{2}.$ (8)

Below, we prove the inequality

$V(\{x_i,P(x_i)\})\ \ge\ V(\{\langle X\rangle_{\Psi_i},P_i\}),$ (9)

where $V(\{x_i,P(x_i)\})$ is the dispersion of the observable X, i.e., its dispersion with respect to the probability distribution encoded in the Schatten decomposition (see Equation (2)), corresponding to the spectral decomposition of the observable X (see Equation (1)). Thus, the probability distribution corresponding to the spectral decomposition of X maximizes the dispersion over the decompositions in Equation (5). The inequality for dispersions can be interpreted via the theory of weak measurements. The quantities $\langle X\rangle_{\Psi_i}$ can be interpreted as weak values. In this framework, the inequality in Equation (9) simply means that the dispersion of a weak measurement is always majorized by the dispersion of the "maximally disturbing measurement" represented by a Hermitian operator. At the same time, we are aware that the interpretation of weak values is a complex foundational problem in itself.

To prove the inequality in Equation (9), let us consider the first term given by

$D(\{\langle X\rangle_{\Psi_i},P_i\})=\sum_{i=1}^{N}P_i\langle X\rangle_{\Psi_i}^{2}.$ (10)

Let us note the following inequality

$\sum_{k=1}^{M}\langle x_k|\rho|x_k\rangle\,(x_k)^{2}\ \ge\ D(\{\langle X\rangle_{\Psi_i},P_i\})\ \ge\ \mathrm{Tr}(X\rho)^{2},$ (11)

which follows from the convexity of $y=x^{2}$, because

$\sum_{i=1}^{N}P_i\langle X\rangle_{\Psi_i}^{2}\ \ge\ \Big(\mathrm{Tr}\Big(X\sum_{i=1}^{N}P_i|\Psi_i\rangle\langle\Psi_i|\Big)\Big)^{2}=\mathrm{Tr}(X\rho)^{2},$ (12)

and since

$\sum_{i=1}^{N}P_i\langle X\rangle_{\Psi_i}^{2}=\sum_{i=1}^{N}P_i\Big(\sum_{k=1}^{M}x_k\,|\langle x_k|\Psi_i\rangle|^{2}\Big)^{2},$

where $X=\sum_{k=1}^{M}x_k|x_k\rangle\langle x_k|$, one can see that

$\sum_{i=1}^{N}P_i\langle X\rangle_{\Psi_i}^{2}\ \le\ \sum_{i=1}^{N}\sum_{k=1}^{M}P_i\,|\langle x_k|\Psi_i\rangle|^{2}(x_k)^{2}=\sum_{k=1}^{M}\langle x_k|\rho|x_k\rangle\,(x_k)^{2}.$ (13)

Such an inequality can be derived with the use of other convex functions, not only $y=x^{2}$. Even if the dispersion V is defined as

$V(\{\langle X\rangle_{\Psi_i},P_i\})=\sum_{i=1}^{N}P_i\,f(\langle X\rangle_{\Psi_i})-f(\mathrm{Tr}(X\rho)),$ (14)

using a general convex function $f(x)$, the result holds true, that is, the inequality

$\sum_{k=1}^{M}\langle x_k|\rho|x_k\rangle\,f(x_k)-f(\mathrm{Tr}(X\rho))\ \ge\ V(\{\langle X\rangle_{\Psi_i},P_i\})\ \ge\ 0,$ (15)

is satisfied.

We redefine the first term D of Equation (10) as

$D(\{\langle X\rangle_{\Psi_i},P_i\})=\sum_{i=1}^{N}P_i\,f(\langle X\rangle_{\Psi_i}).$ (16)

As discussed in the next section, we believe that, under proper choices of X and $f(x)$, this D itself becomes a quantity that captures structural features of $\{\Psi_i,P_i\}$.
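As a numerical check of the inequalities in Equations (11) and (15), the following sketch (our own illustration with randomly generated test data; the dimensions and the choice $f(x)=x^2$ are arbitrary) compares the three quantities for a random observable and a random state distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

def proj(v):
    v = v / np.linalg.norm(v)
    return np.outer(v, v.conj())

# A random Hermitian observable X and a random state distribution {Psi_i, P_i}
# on C^3 -- arbitrary test data, not taken from the paper.
M, N = 3, 5
A = rng.normal(size=(M, M)) + 1j * rng.normal(size=(M, M))
X = (A + A.conj().T) / 2
x_vals, x_vecs = np.linalg.eigh(X)

Psis = [rng.normal(size=M) + 1j * rng.normal(size=M) for _ in range(N)]
Ps = rng.dirichlet(np.ones(N))
rho = sum(P * proj(psi) for P, psi in zip(Ps, Psis))

f = lambda x: x**2    # any convex f works; this choice reproduces Equations (10)-(13)

exp_vals = np.array([np.real(np.trace(X @ proj(psi))) for psi in Psis])   # <X>_{Psi_i}
upper = sum(np.real(x_vecs[:, k].conj() @ rho @ x_vecs[:, k]) * f(x_vals[k])
            for k in range(M))                     # sum_k <x_k|rho|x_k> f(x_k)
middle = float(Ps @ f(exp_vals))                   # D = sum_i P_i f(<X>_{Psi_i}), Equation (16)
lower = f(np.real(np.trace(X @ rho)))              # f(Tr(X rho))

print(upper >= middle >= lower)                    # True: Equations (11) and (15)
```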

5. State Entropy

In this section, we consider the D of Equation (16) in the case $X=\rho$ and $f(x)=-\log x$:

$D(\{\langle\rho\rangle_{\Psi_i},P_i\})=-\sum_{i=1}^{N}P_i\,\log\langle\rho\rangle_{\Psi_i}.$ (17)

Here, we fix the state distribution $\{\Psi_i,P_i\}$ for the density operator ρ, and $y=-\log x$ is our choice of a convex function. What does the above D tell us about $\{\Psi_i,P_i\}$? To discuss this question, we first focus on the term $\mathrm{Tr}(\rho|\Psi_i\rangle\langle\Psi_i|)=\langle\rho\rangle_{\Psi_i}$. Since

$\langle\rho\rangle_{\Psi_i}=P_i+\sum_{j\neq i}P_j\,|\langle\Psi_i|\Psi_j\rangle|^{2},$

the inequality

$1\ \ge\ \langle\rho\rangle_{\Psi_i}\ \ge\ P_i$

is satisfied. One can see that $\langle\rho\rangle_{\Psi_i}=P_i$ if all the vectors $\{\Psi_j\}_{j\neq i}$ are orthogonal to $\Psi_i$, and $\langle\rho\rangle_{\Psi_i}=1$ if all $\{\Psi_j\}_{j\neq i}$ are parallel to $\Psi_i$. Based on this, we interpret $\langle\rho\rangle_{\Psi_i}$ as a degree of "similarity" between $|\Psi_i\rangle\langle\Psi_i|$ and ρ. This interpretation of the quantity $\langle\rho\rangle_{\Psi_i}$ as a degree of similarity can also be illustrated by representing the operators $|\Psi_i\rangle\langle\Psi_i|$ and ρ as vectors in the Hilbert space of Hilbert–Schmidt operators endowed with the scalar product $\langle A|B\rangle=\mathrm{Tr}\,A^{\dagger}B$. We start with the remark that $\langle A|B\rangle=\cos\theta_{AB}\,\|A\|_{2}\|B\|_{2}$, where $\|\cdot\|_{2}$ is the Hilbert–Schmidt norm; we also remark that, for a self-adjoint operator A, $\|A\|_{2}=\sqrt{\mathrm{Tr}\,A^{2}}$. In particular, the norm of any pure state, i.e., of any one-dimensional projector, is equal to one. We have

$\big\langle\,|\Psi_i\rangle\langle\Psi_i|\,\big|\,\rho\,\big\rangle=\mathrm{Tr}\sum_{j}P_j\,|\Psi_i\rangle\langle\Psi_i|\Psi_j\rangle\langle\Psi_j|=\langle\rho\rangle_{\Psi_i}.$

Hence,

$\langle\rho\rangle_{\Psi_i}=\cos\theta\,\sqrt{\mathrm{Tr}\,\rho^{2}},$

where θ is the angle between the vectors $|\Psi_i\rangle\langle\Psi_i|$ and ρ. The scaling coefficient $\sqrt{\mathrm{Tr}\,\rho^{2}}$ is the square root of the purity $\mathrm{Tr}\,\rho^{2}$ of the state ρ.

Further, noting that $y=-\log x$ is a monotonically decreasing function, we interpret $-\log\langle\rho\rangle_{\Psi_i}=-\log\cos\theta-\log\sqrt{\mathrm{Tr}\,\rho^{2}}$ as a degree of orthogonality between the vectors $|\Psi_i\rangle\langle\Psi_i|$ and ρ. We note that the following inequality is satisfied: $-\log P_i\ \ge\ -\log\langle\rho\rangle_{\Psi_i}\ \ge\ 0$.
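The similarity formula and the bounds $P_i\le\langle\rho\rangle_{\Psi_i}\le 1$ can be verified directly; the sketch below (our own illustration with random test data) computes $\langle\rho\rangle_{\Psi_i}$ both as $\mathrm{Tr}(\rho|\Psi_i\rangle\langle\Psi_i|)$ and as $P_i+\sum_{j\neq i}P_j|\langle\Psi_i|\Psi_j\rangle|^2$.

```python
import numpy as np

rng = np.random.default_rng(2)

def proj(v):
    v = v / np.linalg.norm(v)
    return np.outer(v, v.conj())

# A random non-classical distribution {Psi_i, P_i} on C^3 (arbitrary test data).
M, N = 3, 4
Psis = [v / np.linalg.norm(v)
        for v in (rng.normal(size=M) + 1j * rng.normal(size=M) for _ in range(N))]
Ps = rng.dirichlet(np.ones(N))
rho = sum(P * proj(psi) for P, psi in zip(Ps, Psis))

for i in range(N):
    direct = np.real(np.trace(rho @ proj(Psis[i])))            # <rho>_{Psi_i}
    similarity = Ps[i] + sum(Ps[j] * abs(np.vdot(Psis[i], Psis[j]))**2
                             for j in range(N) if j != i)      # P_i + sum_{j!=i} P_j |<Psi_i|Psi_j>|^2
    assert np.isclose(direct, similarity)
    assert Ps[i] - 1e-12 <= direct <= 1 + 1e-12                # P_i <= <rho>_{Psi_i} <= 1
print("similarity formula and bounds hold")
```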

In general, any convex and monotonically decreasing function is allowed as $f(x)$. The average of the orthogonality $-\log\langle\rho\rangle_{\Psi_i}$, i.e., $\sum_{i=1}^{N}P_i(-\log\langle\rho\rangle_{\Psi_i})$, corresponds to the D of Equation (17). Generally, the value of D increases with the number of states and decreases with the degree of similarity among them. That is why we call the value D the "state diversity" or "state entropy".

The following inequality shows the significance of state entropy D:

$\sum_{k=1}^{M}\lambda_k(-\log\lambda_k)\ \ge\ D(\{\langle\rho\rangle_{\Psi_i},P_i\})\ \ge\ -\log(\mathrm{Tr}(\rho^{2})).$ (18)

It can be derived using the convexity of $y=-\log x$, in a similar way as the derivation of Equation (11). In the above form, $\{\lambda_k\}$ are the eigenvalues of $\rho=\sum_{k=1}^{M}\lambda_k|\phi_k\rangle\langle\phi_k|$. The term $\sum_{k=1}^{M}\lambda_k(-\log\lambda_k)$ on the left-hand side corresponds to the von Neumann entropy, given by $-\mathrm{Tr}(\rho\log\rho)$. Further, the quantity $\mathrm{Tr}(\rho^{2})$ on the right-hand side is also well-known in quantum theory. The von Neumann entropy $-\mathrm{Tr}(\rho\log\rho)$ and $\mathrm{Tr}(\rho^{2})$ are frequently used to evaluate the degree of "mixing" in ρ: if ρ is pure, then $-\mathrm{Tr}(\rho\log\rho)=0$ and $\mathrm{Tr}(\rho^{2})=1$. If ρ is a mixed state, $-\mathrm{Tr}(\rho\log\rho)>0$ and $\mathrm{Tr}(\rho^{2})<1$; in particular, when $\lambda_1=\lambda_2=\dots=\lambda_M=1/M$, $-\mathrm{Tr}(\rho\log\rho)$ takes its maximum value $\log M$, and $\mathrm{Tr}(\rho^{2})$ takes its minimum value $1/M$. Mathematically, these two quantities satisfy the relation $-\mathrm{Tr}(\rho\log\rho)\ge -\log(\mathrm{Tr}(\rho^{2}))$. The inequality in Equation (18) implies that the intermediate values between these two correspond to other kinds of state entropy, which can be estimated for various non-classical state distributions reducing to ρ. In other words, the well-known $-\mathrm{Tr}(\rho\log\rho)$ and $-\log(\mathrm{Tr}(\rho^{2}))$ are newly interpreted as the maximum and minimum values of the state entropy.
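The inequality in Equation (18) can be illustrated numerically. The following sketch (our own illustration; the random state distribution is arbitrary test data) computes the state entropy D, the von Neumann entropy, and $-\log\mathrm{Tr}(\rho^2)$, and also checks that D coincides with the von Neumann entropy for the Schatten decomposition.

```python
import numpy as np

rng = np.random.default_rng(3)

def proj(v):
    v = v / np.linalg.norm(v)
    return np.outer(v, v.conj())

def state_entropy(Psis, Ps, rho):
    """Equation (17): D = -sum_i P_i log Tr(rho |Psi_i><Psi_i|)."""
    return -sum(P * np.log(np.real(np.trace(rho @ proj(psi)))) for P, psi in zip(Ps, Psis))

# An arbitrary non-classical distribution {Psi_i, P_i} on C^3 used as test data.
M, N = 3, 5
Psis = [rng.normal(size=M) + 1j * rng.normal(size=M) for _ in range(N)]
Ps = rng.dirichlet(np.ones(N))
rho = sum(P * proj(psi) for P, psi in zip(Ps, Psis))

lam, vecs = np.linalg.eigh(rho)
von_neumann = -sum(l * np.log(l) for l in lam if l > 1e-12)    # -Tr(rho log rho)
log_purity = -np.log(np.real(np.trace(rho @ rho)))             # -log Tr(rho^2)
D = state_entropy(Psis, Ps, rho)

# Equation (18): von Neumann entropy >= D >= -log Tr(rho^2).
print(von_neumann >= D >= log_purity)                          # True

# For the Schatten decomposition itself, D coincides with the von Neumann entropy.
D_schatten = state_entropy([vecs[:, k] for k in range(M)], lam, rho)
print(np.isclose(D_schatten, von_neumann))                     # True
```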

Note that the state entropy D is different from the generalized quantum entropic measures that have been proposed so far. This point is discussed in Appendix A.

6. Model of Differentiation and Calculation of State Entropy

As mentioned in Section 2, a Schatten decomposition of a density operator such as Equation (2) represents a probabilistic distribution of orthogonal pure states. Such an ensemble of states is postulated to be the resulting state of the system after measurement of some physical quantity, whose eigenstates are orthogonal. On the other hand, using another decomposition of the density operator, a mixture of non-orthogonal pure states may be obtained, and we call such a mixture non-classical. In Section 3, we pointed out that the essence of quantum measurement is state differentiation caused by external or environmental factors. If the state distribution corresponding to a Schatten decomposition is the goal of differentiation, various non-classical ones will appear at intermediate stages before the goal is reached. Below, we model this mechanism as proposed in [6]. This model mathematically explains what state distributions may occur in the differentiation process. Our aim in this section is to evaluate their structural features by using the state entropy defined in Section 5.

Let us consider a typical state transition caused by a quantum measurement, which is denoted by

$\Psi\ \to\ \{\psi_k,P_k\}.$

Here Ψ denotes an initial state of the system, represented by $|\Psi\rangle\langle\Psi|$, and $\{\psi_k,P_k\}$ denotes a distribution in which the states $\{|\psi_k\rangle\langle\psi_k|\}$ occur with probabilities $\{P_k\}$. The $\{|\psi_k\rangle\}$ are eigenstates of some physical quantity defined on the Hilbert space $\mathcal{H}=\mathbb{C}^{M}$, and the initial vector $|\Psi\rangle$ is expanded as

$|\Psi\rangle=\sum_{k=1}^{M}\sqrt{P_k}\,|\psi_k\rangle,$

where $\sqrt{P_k}$ denotes a complex number satisfying $|\sqrt{P_k}|^{2}=P_k$, that is,

$|\Psi\rangle\langle\Psi|=\sum_{k=1}^{M}P_k\,|\psi_k\rangle\langle\psi_k|+\sum_{k\neq k'}\sqrt{P_k}\sqrt{P_{k'}}^{\,*}\,|\psi_k\rangle\langle\psi_{k'}|.$ (19)

The first term $\sum_{k=1}^{M}P_k|\psi_k\rangle\langle\psi_k|$ corresponds to the distribution $\{\psi_k,P_k\}$; therefore, the vanishing of the second term, a process called "decoherence" in quantum theory, means the accomplishment of the measurement. The relation between Ψ and $\{\psi_k,P_k\}$ is represented as

$\sum_{k=1}^{M}M_k|\Psi\rangle\langle\Psi|M_k=\sum_{k=1}^{M}|\langle\psi_k|\Psi\rangle|^{2}\,|\psi_k\rangle\langle\psi_k|=\sum_{k=1}^{M}P_k\,|\psi_k\rangle\langle\psi_k|,$ (20)

with the use of the projection operators $M_k=|\psi_k\rangle\langle\psi_k|$. (The transition probability $|\langle\psi_k|\Psi\rangle|^{2}$ is equal to $P_k$.)

If the above transition is interpreted as a sort of differentiation, its development, i.e., which state distributions occur between Ψ and $\{\psi_k,P_k\}$, becomes a crucial concern. The model of differentiation proposed in [6] presents the picture that the initial state Ψ is differentiated into $\{\psi_k,P_k\}$ step by step, through many state transitions. Each state transition is described with the use of a map from state to state, denoted by Λ. Such a map is called a "quantum channel" in quantum information theory. A chain of state transitions given as

$\rho(0)=|\Psi\rangle\langle\Psi|\ \to\ \rho(1)=\Lambda(\rho(0))\ \to\ \rho(2)=\Lambda(\rho(1))\ \to\ \cdots\ \to\ \rho(n)=\Lambda(\rho(n-1))\ \to\ \cdots,$

is regarded as a process of differentiation, if

$\lim_{n\to\infty}\rho(n)=\sum_{k=1}^{M}P_k\,|\psi_k\rangle\langle\psi_k|,$ (21)

is satisfied. A channel Λ is defined based on the following picture: There exist numerous environmental elements around the system. Initially, the states of the system and of these elements are given independently. Let $|\Phi\rangle\langle\Phi|$ be the initial state of one element, defined on a space $\mathcal{K}_1=\mathbb{C}^{L}$. The initial compound state of the system and the element, on the space $\mathcal{H}\otimes\mathcal{K}_1$, is factorized:

$|\Psi\rangle\langle\Psi|\otimes|\Phi\rangle\langle\Phi|.$

At the next step, the states of the system and the element become non-separable. Such a compound state is generally written as

$U\,|\Psi\rangle\langle\Psi|\otimes|\Phi\rangle\langle\Phi|\,U^{\dagger},$

using a unitary operator U on $\mathcal{H}\otimes\mathcal{K}_1$. The unitary transformation U specifies the correlation generated between the system and the element, and, in this modeling, the following form is assumed:

$U=\sum_{k=1}^{M}|\psi_k\rangle\langle\psi_k|\otimes u_k,$ (22)

where $u_k$ is a unitary on $\mathcal{K}_1$. Actually, by this U, the vector $|\Psi\rangle\otimes|\Phi\rangle$ is transformed to

$U\,|\Psi\rangle\otimes|\Phi\rangle=\sum_{k=1}^{M}\sqrt{P_k}\,|\psi_k\rangle\otimes|\Phi_k\rangle,$

where $|\Phi_k\rangle=u_k|\Phi\rangle$. Then, the states of the system and the element are "entangled", since the above form cannot be factorized into a product of two vectors independently defined on $\mathcal{H}$ and $\mathcal{K}_1$, if $|\Phi_k\rangle\neq|\Phi_{k'}\rangle$ for some $k\neq k'$. The compound state at the third step is described as

$\sum_{j=1}^{L}(I\otimes\bar{M}_j)\,U|\Psi\rangle\langle\Psi|\otimes|\Phi\rangle\langle\Phi|U^{\dagger}\,(I\otimes\bar{M}_j).$ (23)

$\{\bar{M}_j\}$ are projection operators corresponding to a basis set of $\mathcal{K}_1=\mathbb{C}^{L}$, say $\{|\phi_j\rangle\}$. As can be seen from Equation (20), a state transition given by a projection operator mathematically represents the accomplishment of differentiation. The operation of $\{\bar{M}_j\}$ means that the state of the element is eventually differentiated into one of $\{|\phi_j\rangle\langle\phi_j|\}$. Note that the states of the system and the element are correlated at the second step. Thus, the state of the system is affected by this differentiation. Actually, Equation (23) may be rewritten as

$\sum_{j=1}^{L}E_j|\Psi\rangle\langle\Psi|E_j^{\dagger}\otimes|\phi_j\rangle\langle\phi_j|,$ (24)

by introducing the operator,

$E_j=\sum_{k=1}^{M}\langle\phi_j|\Phi_k\rangle\,|\psi_k\rangle\langle\psi_k|=\sum_{k=1}^{M}\nu_{j|k}\,|\psi_k\rangle\langle\psi_k|.$ (25)

The above form implies that the state of the element of the environment gets transformed to $|\phi_j\rangle\langle\phi_j|$ with probability

$P_j=\mathrm{Tr}\big(E_j|\Psi\rangle\langle\Psi|E_j^{\dagger}\otimes|\phi_j\rangle\langle\phi_j|\big)=\langle\Psi|E_j^{\dagger}E_j|\Psi\rangle,$ (26)

and at the same time the state of the system transits to

$|\Psi_j\rangle\langle\Psi_j|=\frac{1}{P_j}\,E_j|\Psi\rangle\langle\Psi|E_j^{\dagger}.$ (27)

The operator $E_j$ introduced in Equation (25) is called a Kraus operator and satisfies $\sum_{j=1}^{L}E_j^{\dagger}E_j=I$. (In general, a set of Hermitian positive operators $\{F_i\}$ with $\sum_{i=1}^{N}F_i=I$ is called a positive-operator valued measure (POVM).) With the use of $\{E_j\}$, a quantum channel Λ is defined:

$\Lambda(\,\cdot\,)=\sum_{j=1}^{L}E_j\,(\,\cdot\,)\,E_j^{\dagger}.$ (28)

$\Lambda(|\Psi\rangle\langle\Psi|)=\sum_{j=1}^{L}E_j|\Psi\rangle\langle\Psi|E_j^{\dagger}$ is the density operator obtained from the partial trace of the compound state, $\mathrm{Tr}_{\mathcal{K}}\big(\sum_{j=1}^{L}E_j|\Psi\rangle\langle\Psi|E_j^{\dagger}\otimes|\phi_j\rangle\langle\phi_j|\big)$.
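As an illustration of Equations (22)–(28), the sketch below (our own construction; the dimensions, the random unitaries $u_k$, and the initial element state $|\Phi\rangle$ are arbitrary choices) builds the Kraus operators $E_j$, checks the completeness relation $\sum_j E_j^{\dagger}E_j=I$, and verifies that the channel Λ agrees with the unitary interaction followed by a partial trace over the element.

```python
import numpy as np

rng = np.random.default_rng(4)

def random_unitary(d, rng):
    """Some d x d unitary, from the QR decomposition of a random complex matrix."""
    z = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    return np.linalg.qr(z)[0]

M, L = 2, 3                                      # dimensions of H and of one element K_1
psis = [np.eye(M)[:, k] for k in range(M)]       # basis {psi_k} of the system
phis = [np.eye(L)[:, j] for j in range(L)]       # readout basis {phi_j} of the element
Phi = random_unitary(L, rng)[:, 0]               # initial element state |Phi> (normalized)
us = [random_unitary(L, rng) for _ in range(M)]  # unitaries u_k of Equation (22)

# Kraus operators of Equation (25): E_j = sum_k <phi_j | u_k Phi> |psi_k><psi_k|.
E = [sum((phis[j].conj() @ (us[k] @ Phi)) * np.outer(psis[k], psis[k]) for k in range(M))
     for j in range(L)]

# Completeness relation: sum_j E_j^dagger E_j = I.
print(np.allclose(sum(Ej.conj().T @ Ej for Ej in E), np.eye(M)))       # True

# The channel of Equation (28) agrees with "unitary interaction, then partial trace".
U = sum(np.kron(np.outer(psis[k], psis[k]), us[k]) for k in range(M))  # Equation (22)
Psi = (psis[0] + psis[1]) / np.sqrt(2)                                 # an arbitrary system state
rho_sys = np.outer(Psi, Psi)
joint = U @ np.kron(rho_sys, np.outer(Phi, Phi.conj())) @ U.conj().T
partial = joint.reshape(M, L, M, L).trace(axis1=1, axis2=3)            # Tr over K_1
Lambda_rho = sum(Ej @ rho_sys @ Ej.conj().T for Ej in E)
print(np.allclose(partial, Lambda_rho))                                # True
```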

The other environmental elements are defined on Hilbert spaces denoted by $\mathcal{K}_2,\mathcal{K}_3,\dots$. If they interact with the system in a similar way,

$\rho(n)=\Lambda(\rho(n-1))=\cdots=\Lambda(\Lambda(\cdots\Lambda(|\Psi\rangle\langle\Psi|)\cdots)),$

is defined as the density operator of the system that is obtained after interacting with n environmental elements. From the definition of Λ (see Equation (28)), this ρ(n) is decomposed as

$\rho(n)=\sum_{\{j_1,j_2,\dots,j_n\}}P_{\{j_1,j_2,\dots,j_n\}}\,|\Psi_{\{j_1,j_2,\dots,j_n\}}\rangle\langle\Psi_{\{j_1,j_2,\dots,j_n\}}|,$ (29)

where

$P_{\{j_1,j_2,\dots,j_n\}}=\langle\Psi|E_{\{j_1,j_2,\dots,j_n\}}^{\dagger}E_{\{j_1,j_2,\dots,j_n\}}|\Psi\rangle,$ (30)

and

$|\Psi_{\{j_1,j_2,\dots,j_n\}}\rangle=\frac{1}{\sqrt{P_{\{j_1,j_2,\dots,j_n\}}}}\,E_{\{j_1,j_2,\dots,j_n\}}|\Psi\rangle.$ (31)

(The notation $E_{\{j_1,j_2,\dots,j_n\}}$ means $E_{j_n}\cdots E_{j_2}E_{j_1}$.) $P_{\{j_1,j_2,\dots,j_n\}}$ is the probability that the states of the n environmental elements eventually become $\{|\phi_{j_1}\rangle,|\phi_{j_2}\rangle,\dots,|\phi_{j_n}\rangle\}$, and $|\Psi_{\{j_1,j_2,\dots,j_n\}}\rangle\langle\Psi_{\{j_1,j_2,\dots,j_n\}}|$ is the pure state of the system in this event. It should be noted here that the density operator $\rho(n)$ can be expanded as

$\rho(n)=\sum_{k=1}^{M}P_k\,|\psi_k\rangle\langle\psi_k|+\sum_{k\neq k'}\sqrt{P_k}\sqrt{P_{k'}}^{\,*}\,\big(\langle\Phi_{k'}|\Phi_k\rangle\big)^{n}\,|\psi_k\rangle\langle\psi_{k'}|,$ (32)

by using Equation (19) and the property of

$\Lambda(|\psi_k\rangle\langle\psi_{k'}|)=\sum_{j=1}^{L}\langle\Phi_{k'}|\phi_j\rangle\langle\phi_j|\Phi_k\rangle\,|\psi_k\rangle\langle\psi_{k'}|=\langle\Phi_{k'}|\Phi_k\rangle\,|\psi_k\rangle\langle\psi_{k'}|.$

Since $|\langle\Phi_{k'}|\Phi_k\rangle|<1$ for $k\neq k'$, the condition of Equation (21), i.e., $\lim_{n\to\infty}\rho(n)=\sum_{k=1}^{M}P_k|\psi_k\rangle\langle\psi_k|$, is clearly satisfied. Thus, the state distribution $\{\Psi_{\{j_1,j_2,\dots,j_n\}},P_{\{j_1,j_2,\dots,j_n\}}\}$, which is encoded in $\rho(n)$, is identified at an intermediate stage in the differentiation process $\Psi\to\{\psi_k,P_k\}$.

Figure 1 shows the result of a computational simulation with $M=L=2$, $|\Psi\rangle=\sqrt{0.7}\,|\psi_1\rangle+\sqrt{0.3}\,|\psi_2\rangle$ ($P_1=0.7$, $P_2=0.3$), $\nu_{1|1}=0.5$ ($\nu_{2|1}=0.5$) and $\nu_{1|2}=0.45$ ($\nu_{2|2}=0.55$). The histograms of the population rates of states with $\frac{l-1}{20}<|\langle\psi_1|\Psi_{\{j_1,j_2,\dots,j_n\}}\rangle|^{2}\le\frac{l}{20}$ ($l=1,2,\dots,20$) are calculated for $n=0,10,100,500$ and $2000$. One can see that, with increasing n, the state distribution approaches the goal of differentiation, i.e., $\{\{\psi_1,\psi_2\},\{0.7,0.3\}\}$.

Figure 2 shows the behavior of the state entropy D, the von Neumann entropy, and $-\log(\mathrm{Tr}(\rho^{2}))$ for the distribution $\{\Psi_{\{i_1,i_2,\dots,i_n\}},P_{\{i_1,i_2,\dots,i_n\}}\}$, calculated with the same setting of parameters. One can directly see that the inequality of Equation (18) is satisfied at every n. Note that the state entropy D takes values close to the von Neumann entropy at very large n. In fact, as shown in Figure 3, the difference between the von Neumann entropy and the state entropy is noticeable mostly at earlier stages. These results imply that the state distributions appearing in the differentiation process are, in general, non-classical.
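The qualitative behavior reported in Figures 1–3 can be reproduced with a short computation. The sketch below is our own reconstruction, not the authors' simulation code; it reads the caption values $\nu_{j|k}$ as squared overlaps $|\langle\phi_j|\Phi_k\rangle|^2$ and takes real positive amplitudes, both of which are assumptions on our part. Because every $E_j$ is diagonal in $\{\psi_1,\psi_2\}$, a branch is fully characterized by the number of elements that settled on $\phi_1$, so the sum over branches is exact; n is limited to 500 here simply to stay within double-precision range.

```python
import numpy as np
from math import comb

# Parameters of Figures 1-3 (M = L = 2).  The caption values nu_{j|k} are read here
# as squared overlaps; the amplitudes <phi_j|Phi_k> = sqrt(nu_{j|k}) below are
# therefore our assumption, chosen so that sum_j |<phi_j|Phi_k>|^2 = 1.
P = np.array([0.7, 0.3])                        # P_1, P_2
c = np.sqrt(P)                                  # |Psi> = c_1 |psi_1> + c_2 |psi_2>
a = np.sqrt(np.array([[0.5, 0.45],              # a[j, k] = <phi_{j+1}|Phi_{k+1}>
                      [0.5, 0.55]]))

def entropies(n):
    """State entropy D, von Neumann entropy and -log Tr(rho^2) after n steps."""
    overlap = a[:, 0] @ a[:, 1]                            # <Phi_2|Phi_1>
    rho = np.array([[P[0], c[0] * c[1] * overlap**n],      # Equation (32)
                    [c[0] * c[1] * overlap**n, P[1]]])
    lam = np.linalg.eigvalsh(rho)
    vn = -sum(l * np.log(l) for l in lam if l > 1e-15)     # von Neumann entropy
    log_purity = -np.log(np.trace(rho @ rho))              # -log Tr(rho^2)
    # Every E_j is diagonal in {psi_1, psi_2}, so a branch of n interactions is
    # fixed by the number m of elements that settled on phi_1.
    D = 0.0
    for m in range(n + 1):
        amp = c * a[0, :]**m * a[1, :]**(n - m)            # components of E_branch |Psi>
        p_branch = float(comb(n, m)) * np.sum(amp**2)      # Equation (30) x multiplicity
        psi = amp / np.sqrt(np.sum(amp**2))                # Equation (31)
        D += -p_branch * np.log(psi @ rho @ psi)           # Equation (17)
    return D, vn, log_purity

# n is kept <= 500 to stay within double-precision range (cf. n up to 2000 in Figure 1).
for n in [0, 10, 100, 500]:
    D, vn, lp = entropies(n)
    assert lp - 1e-9 <= D <= vn + 1e-9                     # Equation (18)
    print(f"n={n:4d}  D={D:.4f}  S_vN={vn:.4f}  -log Tr(rho^2)={lp:.4f}")
```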

7. Conclusions

The state entropy is a truly non-classical quantity because it depends not only on statistical probabilities, but also on similarities among states. The differentiation phenomenon is also non-classical, because it is interpreted as dynamics of the probabilities and similarities. Definition of the state entropy and modeling of the differentiation process are impossible in the framework of classical probability theory.

We believe that evaluation of an ensemble of systems by the state entropy fits empirical reasoning: No matter how many systems are in the ensemble, we may not recognize high diversity if we know that their states are not very different. Further, we believe that, in various areas of nature, the dynamics of character change in a population of individuals is very much like the differentiation phenomenon. This strengthens the prospects of the quantum-like formalism.

Acknowledgments

I.B. was supported by a Marie Curie Fellowship at City University of London, H2020-MSCA-IF-2015, grant N 696331.

Appendix A. State Entropy and other Quantum Entropies

In this paper, we propose the state entropy,

$D(\{\langle\rho\rangle_{\Psi_i},P_i\})=\sum_{i=1}^{N}P_i\big(-\log(\mathrm{Tr}(\rho|\Psi_i\rangle\langle\Psi_i|))\big).$

More generally, it is defined as

$D(\{\langle\rho\rangle_{\Psi_i},P_i\})=\sum_{i=1}^{N}P_i\,f(\mathrm{Tr}(\rho|\Psi_i\rangle\langle\Psi_i|)),$

by using a convex and monotonically decreasing function $f(x)$. Mathematically, $D(\{\langle\rho\rangle_{\Psi_i},P_i\})$ depends on the way of decomposition of ρ. As discussed in Section 5, for $\rho=\sum_{i=1}^{N}P_i|\Psi_i\rangle\langle\Psi_i|$, the quantity $f(\mathrm{Tr}(\rho|\Psi_i\rangle\langle\Psi_i|))=f(\langle\rho\rangle_{\Psi_i})$ is interpreted as the degree of orthogonality between $|\Psi_i\rangle\langle\Psi_i|$ and ρ. In this sense, D evaluates a sort of diversity in the state distribution $\{\Psi_i,P_i\}$, and it takes its maximal value, equal to $-\mathrm{Tr}(\rho\log\rho)$, for the Schatten decomposition (see Equation (18)).

On the other hand, there are many mathematical generalizations of the von Neumann entropy. As examples, the quantum version of the Rényi entropy [51],

$R_{\alpha}(\rho)=\frac{1}{1-\alpha}\log\big(\mathrm{Tr}(\rho^{\alpha})\big),$ (A1)

and the quantum version of the Tsallis entropy [52],

$T_{\alpha}(\rho)=\frac{1}{1-\alpha}\big(\mathrm{Tr}(\rho^{\alpha})-1\big),$ (A2)

are well-known. These entropies approach the von Neumann entropy $S(\rho)=-\mathrm{Tr}(\rho\log\rho)$ in the limit $\alpha\to 1$. The index α, which is called the entropic parameter, is nonnegative, with $\alpha\neq 1$. Further, such generalized entropies are uniformly represented in the form of the quantum version of the Salicrú entropy [53,54], which is given by

$H_{(h,\phi)}(\rho)=h\big(\mathrm{Tr}\,\phi(\rho)\big),$ (A3)

where the functions $h:\mathbb{R}\to\mathbb{R}$ and $\phi:[0,1]\to\mathbb{R}$ satisfy either of the following conditions: (i) h is increasing and ϕ is concave; or (ii) h is decreasing and ϕ is convex. For the Rényi entropy, $h(x)=\frac{\log(x)}{1-\alpha}$ and $\phi(x)=x^{\alpha}$, and for the Tsallis entropy, $h(x)=\frac{x-1}{1-\alpha}$ and $\phi(x)=x^{\alpha}$. Of course, the von Neumann entropy is also recovered, with $h(x)=x$ and $\phi(x)=-x\log x$.

Here, we have to point out that H(h,ϕ)(ρ) is practically calculated as

$H_{(h,\phi)}(\rho)=h\Big(\sum_{k=1}^{M}\phi(\lambda_k)\Big),$

by using the eigenvalues of ρ; that is, any quantum entropic measure that reduces to the form $H_{(h,\phi)}(\rho)$ does not depend on the way of decomposition of ρ. The state entropy is different in this respect. Actually, it is clear that $D(\{\langle\rho\rangle_{\Psi_i},P_i\})$ is not recovered in the form of $H_{(h,\phi)}(\rho)$.
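The decomposition dependence can be seen in a small numerical check (our own illustration, reusing the two decompositions of the sketch in Section 2): the Rényi and Tsallis entropies, being functions of the eigenvalues of ρ, are fixed by ρ alone, while the state entropy D takes different values for the classical and non-classical decompositions of the same ρ.

```python
import numpy as np

def proj(v):
    v = v / np.linalg.norm(v)
    return np.outer(v, v.conj())

def state_entropy(Psis, Ps, rho):
    """D of Equation (17)."""
    return -sum(P * np.log(np.real(np.trace(rho @ proj(psi)))) for P, psi in zip(Ps, Psis))

# One density operator written as two different mixtures (cf. the example in Section 2).
phi1, phi2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
Psi1 = np.sqrt(0.75) * phi1 + np.sqrt(0.25) * phi2
Psi2 = np.sqrt(0.75) * phi1 - np.sqrt(0.25) * phi2
rho = 0.75 * proj(phi1) + 0.25 * proj(phi2)          # = 0.5 proj(Psi1) + 0.5 proj(Psi2)

# Renyi and Tsallis entropies (Equations (A1), (A2)) depend on the eigenvalues only,
# hence not on which decomposition of rho is considered.
lam = np.linalg.eigvalsh(rho)
alpha = 2.0
renyi = np.log(np.sum(lam**alpha)) / (1 - alpha)
tsallis = (np.sum(lam**alpha) - 1) / (1 - alpha)

# The state entropy D does depend on the decomposition.
D_classical = state_entropy([phi1, phi2], [0.75, 0.25], rho)
D_nonclassical = state_entropy([Psi1, Psi2], [0.5, 0.5], rho)
print(f"Renyi={renyi:.4f}  Tsallis={tsallis:.4f}")
print(f"D_classical={D_classical:.4f}  D_nonclassical={D_nonclassical:.4f}")  # they differ
```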

Author Contributions

Conceptualization, M.A., I.B., E.M.P. and A.K.; Methodology, M.A. and A.K.; Validation, I.B., E.M.P. and A.K.; Writing-Original Draft Preparation, M.A.; Writing-Review & Editing, I.B., E.M.P. and A.K.

Conflicts of Interest

The authors declare no conflict of interest.

References

  • 1.Von Neumann J. Thermodynamik quantenmechanischer Gesamtheiten. Gott. Nach. 1927;1:273–291. [Google Scholar]
  • 2.Von Neumann J. Mathematische Grundlagen der Quantenmechanik. Springer; Berlin, Germany: 1932. [Google Scholar]
  • 3.Horodecki M., Oppenheim J., Winter A. Partial quantum information. Nature. 2005;436:673–676. doi: 10.1038/nature03909. [DOI] [PubMed] [Google Scholar]
  • 4.Umegaki H. Conditional expectations in an operator algebra IV (entropy and information) Kodai Math. Sem. Rep. 1962;14:59–85. doi: 10.2996/kmj/1138844604. [DOI] [Google Scholar]
  • 5.Ohya M. Fundamentals of Quantum Mutual Entropy and Capacity. Open Syst. Inf. Dyn. 1999;6:69–78. doi: 10.1023/A:1009676318267. [DOI] [Google Scholar]
  • 6.Asano M., Basieva I., Khrennikov A., Yamato I. A model of differentiation in quantum bioinformatics. Prog. Biophys. Mol. Biol. 2017;130:88–98. doi: 10.1016/j.pbiomolbio.2017.05.013. [DOI] [PubMed] [Google Scholar]
  • 7.Zurek W.H. Decoherence and the Transition from Quantum to Classical. Phys. Today. 1991;44:36–44. doi: 10.1063/1.881293. [DOI] [Google Scholar]
  • 8.Lindblad G. On the generators of quantum dynamical semigroups. Commun. Math. Phys. 1976;48:119. doi: 10.1007/BF01608499. [DOI] [Google Scholar]
  • 9.Gorini V., Kossakowski A., Sudarshan E.C.G. Completely positive semigroups of N-level systems. J. Math. Phys. 1976;17:821. doi: 10.1063/1.522979. [DOI] [Google Scholar]
  • 10.Khrennikov A. Classical and quantum mechanics on information spaces with applications to cognitive, psychological, social and anomalous phenomena. Found. Phys. 1999;29:1065–1098. doi: 10.1023/A:1018885632116. [DOI] [Google Scholar]
  • 11.Khrennikov A. Quantum-like formalism for cognitive measurements. Biosystems. 2003;70:211–233. doi: 10.1016/S0303-2647(03)00041-8. [DOI] [PubMed] [Google Scholar]
  • 12.Khrennikov A. On quantum-like probabilistic structure of mental information. Open Syst. Inf. Dyn. 2014;11:267–275. doi: 10.1023/B:OPSY.0000047570.68941.9d. [DOI] [Google Scholar]
  • 13.Khrennikov A. Information Dynamics in Cognitive, Psychological, Social, and Anomalous Phenomena. Kluwer; Dordreht, The Netherlands: 2004. (Ser.: Fundamental Theories of Physics). [Google Scholar]
  • 14.Busemeyer J.B., Wang Z., Townsend J.T. Quantum dynamics of human decision making. J. Math. Psychol. 2006;50:220–241. doi: 10.1016/j.jmp.2006.01.003. [DOI] [Google Scholar]
  • 15.Haven E. Private information and the ‘information function’: A survey of possible uses. Theory Decis. 2008;64:193–228. doi: 10.1007/s11238-007-9054-2. [DOI] [Google Scholar]
  • 16.Yukalov V.I., Sornette D. Processing Information in Quantum Decision Theory. Entropy. 2009;11:1073–1120. doi: 10.3390/e11041073. [DOI] [Google Scholar]
  • 17.Khrennikov A. Ubiquitous Quantum Structure: From Psychology to Finances. Springer; Berlin/Heidelberg, Germany: New York, NY, USA: 2010. [Google Scholar]
  • 18.Asano M., Masanori O., Tanaka Y., Khrennikov A., Basieva I. Quantum-like model of brain’s functioning: Decision making from decoherence. J. Theor. Biol. 2011;281:56–64. doi: 10.1016/j.jtbi.2011.04.022. [DOI] [PubMed] [Google Scholar]
  • 19.Busemeyer J.R., Pothos E.M., Franco R., Trueblood J. A quantum theoretical explanation for probability judgment errors. Psychol. Rev. 2011;118:193–218. doi: 10.1037/a0022542. [DOI] [PubMed] [Google Scholar]
  • 20.Asano M., Ohya M., Khrennikov A. Quantum-Like Model for Decision Making Process in Two Players Game—A Non-Kolmogorovian Model. Found. Phys. 2011;41:538–548. doi: 10.1007/s10701-010-9454-y. [DOI] [Google Scholar]
  • 21.Asano M., Ohya M., Tanaka Y., Khrennikov A., Basieva I. Dynamics of entropy in quantum-like model of decision making. AIP Conf. Proc. 2011;63:1327. [Google Scholar]
  • 22.Bagarello F. Quantum Dynamics for Classical Systems: With Applications of the Number Operator. Volume 90. Wiley; New York, NY, USA: 2012. p. 015203. [Google Scholar]
  • 23.Busemeyer J.R., Bruza P.D. Quantum Models of Cognition and Decision. Cambridge Press; Cambridge, UK: 2012. [Google Scholar]
  • 24.Asano M., Basieva I., Khrennikov A., Ohya M., Tanaka Y. Quantum-like dynamics of decision-making. Phys. A Stat. Mech. Appl. 2010;391:2083–2099. doi: 10.1016/j.physa.2011.11.042. [DOI] [Google Scholar]
  • 25.De Barros A.J. Quantum-like model of behavioral response computation using neural oscillators. Biosystems. 2012;110:171–182. doi: 10.1016/j.biosystems.2012.10.002. [DOI] [PubMed] [Google Scholar]
  • 26.Asano M., Basieva I., Khrennikov A., Ohya M., Tanaka Y. Quantum-like generalization of the Bayesian updating scheme for objective and subjective mental uncertainties. J. Math. Psychol. 2012;56:166–175. doi: 10.1016/j.jmp.2012.02.003. [DOI] [Google Scholar]
  • 27.De Barros A.J., Oas G. Negative probabilities and counter-factual reasoning in quantum cognition. Phys. Scr. 2014;T163:014008. doi: 10.1088/0031-8949/2014/T163/014008. [DOI] [Google Scholar]
  • 28.Wang Z., Busemeyer J.R. A quantum question order model supported by empirical tests of an a priori and precise prediction. Top. Cogn. Sci. 2013;5:689–710. doi: 10.1111/tops.12040. [DOI] [PubMed] [Google Scholar]
  • 29.Dzhafarov E.N., Kujala J.V. On selective influences, marginal selectivity, and Bell/CHSH inequalities. Top. Cogn. Sci. 2014;6:121–128. doi: 10.1111/tops.12060. [DOI] [PubMed] [Google Scholar]
  • 30.Wang Z., Solloway T., Shiffrin R.M., Busemeyer J.R. Context effects produced by question orders reveal quantum nature of human judgments. Proc. Natl. Acad. Sci. USA. 2014;111:9431–9436. doi: 10.1073/pnas.1407756111. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 31.Khrennikov A. Quantum-like modeling of cognition. Front. Phys. 2015;3:77. doi: 10.3389/fphy.2015.00077. [DOI] [Google Scholar]
  • 32.Boyer-Kassem T., Duchene S., Guerci E. Testing quantum-like models of judgment for question order effect. Math. Soc. Sci. 2016;80:33–46. doi: 10.1016/j.mathsocsci.2016.01.001. [DOI] [Google Scholar]
  • 33.Asano M., Basieva I., Khrennikov A., Ohya M., Tanaka Y. A Quantum-like Model of Selection Behavior. J. Math. Psychol. 2016 doi: 10.1016/j.jmp.2016.07.006. [DOI] [Google Scholar]
  • 34.Yukalov V.I., Sornette D. Quantum Probabilities as Behavioral Probabilities. Entropy. 2017;19:112. doi: 10.3390/e19030112. [DOI] [Google Scholar]
  • 35.Igamberdiev A.U., Shklovskiy-Kordi N.E. The quantum basis of spatiotemporality in perception and consciousnes. Prog. Biophys. Mol. Biol. 2017;130:15–25. doi: 10.1016/j.pbiomolbio.2017.02.008. [DOI] [PubMed] [Google Scholar]
  • 36.De Barros J.A., Holik F., Krause D. Contextuality and indistinguishability. Entropy. 2017;19:435. doi: 10.3390/e19090435. [DOI] [Google Scholar]
  • 37.Bagarello F., Di Salvo R., Gargano F., Oliveri F. (H,ρ)-induced dynamics and the quantum game of life. Appl. Math. Mod. 2017;43:15–32. doi: 10.1016/j.apm.2016.10.043. [DOI] [Google Scholar]
  • 38.Takahashi K.S.-J., Makoto N. A note on the roles of quantum and mechanical models in social biophysics. Prog. Biophys. Mol. Biol. 2017;130 Pt A:103–105. doi: 10.1016/j.pbiomolbio.2017.06.003. [DOI] [PubMed] [Google Scholar]
  • 39.Asano M., Basieva I., Khrennikov A., Ohya M., Tanaka Y., Yamato I. Quantum-like model of diauxie in Escherichia coli: Operational description of precultivation effect. J. Theor. Biol. 2012;314:130–137. doi: 10.1016/j.jtbi.2012.08.022. [DOI] [PubMed] [Google Scholar]
  • 40.Accardi L., Ohya M. Compound channels, transition expectations, and liftings. Appl. Math. Optim. 1999;39:33–59. doi: 10.1007/s002459900097. [DOI] [Google Scholar]
  • 41.Asano M., Basieva I., Khrennikov A., Ohya M., Tanaka Y., Yamato I. A model of epigenetic evolution based on theory of open quantum systems. Syst. Synth. Biol. 2013;7:161. doi: 10.1007/s11693-013-9109-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 42.Asano M., Hashimoto T., Khrennikov A., Ohya M., Tanaka A. Violation of contextual generalization of the Leggett-Garg inequality for recognition of ambiguous figures. Phys. Scr. 2014;2014:T163. doi: 10.1088/0031-8949/2014/T163/014006. [DOI] [Google Scholar]
  • 43.Asano M., Basieva I., Khrennikov A., Ohya M., Tanaka Y., Yamato I. Quantum Information Biology: From Information Interpretation of Quantum Mechanics to Applications in Molecular Biology and Cognitive Psychology. Found. Phys. 2015;45:1362. doi: 10.1007/s10701-015-9929-y. [DOI] [Google Scholar]
  • 44.Asano M., Khrennikov A., Ohya M., Tanaka Y., Yamato I. Three-body system metaphor for the two-slit experiment and Escherichia coli lactose-glucose metabolism. Philos. Trans. R. Soc. A. 2016 doi: 10.1098/rsta.2015.0243. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 45.Ohya M., Volovich I. Mathematical Foundations of Quantum Information and Computation and its Applications to Nano- and Bio-Systems. Springer; Berlin, Germany: 2011. [Google Scholar]
  • 46.Asano M., Khrennikov A., Ohya M., Tanaka Y., Yamato I. Quantum Adaptivity in Biology: From Genetics to Cognition. Springer; Berlin, Germany: 2015. [Google Scholar]
  • 47.Oaksford M., Chater N. A Rational Analysis of the Selection Task as Optimal Data Selection. Psychol. Rev. 1994;101:608–631. doi: 10.1037/0033-295X.101.4.608. [DOI] [Google Scholar]
  • 48.Pothos E.M., Chater N. A simplicity principle in unsupervised human categorization. Cogn. Sci. 2002;26:303–343. doi: 10.1207/s15516709cog2603_6. [DOI] [Google Scholar]
  • 49.Miller G.A. Free Recall of Redundant Strings of Letters. J. Exp. Psychol. 1958;56:485–491. doi: 10.1037/h0044933. [DOI] [PubMed] [Google Scholar]
  • 50.Jamieson R.K., Mewhort D.J.K. The influence of grammatical, local, and organizational redundancy on implicit learning: An analysis using information theory. J. Exp. Psychol. Learn. Mem. Cogn. 2005;31:9–23. doi: 10.1037/0278-7393.31.1.9. [DOI] [PubMed] [Google Scholar]
  • 51.Rényi A. On measures of entropy and information; Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability; Berkeley, CA, USA. 20–30 July 1960; p. 547. [Google Scholar]
  • 52.Tsallis C. Possible generalization of Boltzmann–Gibbs statistics. J. Stat. Phys. 1988;52:479. doi: 10.1007/BF01016429. [DOI] [Google Scholar]
  • 53.Salicrú M., Menéndez M.L., Morales D., Pardo L. Asymptotic distribution of (h,ϕ)-entropies. Commun. Stat. Theory Methods. 1993;22:2015. doi: 10.1080/03610929308831131. [DOI] [Google Scholar]
  • 54.Bosyk G.M., Zozor S., Holik F., Portesi M., Lamberti P.W. A family of generalized quantum entropies: Definition and properties. Quantum Inf. Proc. 2016;15:3393–3420. doi: 10.1007/s11128-016-1329-5. [DOI] [Google Scholar]
