Entropy. 2019 May 7;21(5):477. doi: 10.3390/e21050477

Fluctuation Theorem of Information Exchange within an Ensemble of Paths Conditioned on Correlated-Microstates

Lee Jinwoo
PMCID: PMC7514966  PMID: 33267191

Abstract

Fluctuation theorems are a class of equalities that express universal properties of the probability distribution of a fluctuating path functional such as heat, work or entropy production over an ensemble of trajectories during a non-equilibrium process with a well-defined initial distribution. Jinwoo and Tanaka (Jinwoo, L.; Tanaka, H. Sci. Rep. 2015, 5, 7832) have shown that work fluctuation theorems hold even within an ensemble of paths to each state, making it clear that entropy and free energy of each microstate encode heat and work, respectively, within the conditioned set. Here we show that information, characterized by the point-wise mutual information for each correlated state between two subsystems in a heat bath, encodes the entropy production of the subsystems and the heat bath during a coupling process. To this end, we extend the fluctuation theorem of information exchange (Sagawa, T.; Ueda, M. Phys. Rev. Lett. 2012, 109, 180602) by showing that the fluctuation theorem holds even within an ensemble of paths that reach a correlated state during dynamic co-evolution of two subsystems.

Keywords: local non-equilibrium thermodynamics, fluctuation theorem, mutual information, entropy production, local mutual information, thermodynamics of information, stochastic thermodynamics

1. Introduction

Thermal fluctuations play an important role in the functioning of molecular machines: fluctuations mediate the exchange of energy between molecules and the environment, enabling molecules to overcome free energy barriers and to stabilize in low free energy regions. They make positions and velocities random variables, and thus make path functionals such as heat and work fluctuating quantities. In the past two decades, a class of relations called fluctuation theorems have shown that there are universal laws that regulate fluctuating quantities during a process that drives a system far from equilibrium. The Jarzynski equality, for example, links work to the change of equilibrium free energy [1], and the Crooks fluctuation theorem relates the probability of work to the dissipation of work [2], to mention a few. There are many variations on these basic relations. Seifert has extended the second law to the level of individual trajectories [3], and Hatano and Sasa have considered transitions between steady states [4]. Experiments at the single-molecule level have verified the fluctuation theorems, providing critical insights into the behavior of bio-molecules [5,6,7,8,9,10,11,12,13].

Information is an essential subtopic of fluctuation theorems [14,15,16]. Beginning with pioneering studies on feedback-controlled systems [17,18], unifying formulations of information thermodynamics have been established [19,20,21,22,23]. In particular, Sagawa and Ueda have introduced information to the realm of fluctuation theorems [24]. They have established a fluctuation theorem of information exchange, unifying non-equilibrium processes of measurement and feedback control [25]. They have considered a situation where a system, say X, evolves in a manner that depends on state y of another system Y, whose state is fixed during the evolution of X. In this setup, they have shown that establishing a correlation between the two subsystems is accompanied by entropy production. Very recently, we have relaxed the constraint that Sagawa and Ueda assumed, and proved that the same form of the fluctuation theorem of information exchange holds even when both subsystems X and Y co-evolve in time [26].

In the context of fluctuation theorems, external control $\lambda_t$ defines a process by varying the parameter in a predetermined manner during $0 \le t \le \tau$. One repeats the process according to initial probability distribution $P_0$, and a system generates in response an ensemble of microscopic trajectories $\{x_t\}$. Jinwoo and Tanaka [27,28] have shown that the Jarzynski equality and the Crooks fluctuation theorem hold even within an ensemble of trajectories conditioned on a fixed microstate at final time $\tau$, where the local form of non-equilibrium free energy replaces the role of equilibrium free energy in the equations, making it clear that the free energy of microstate $x_\tau$ encodes the amount of work supplied to reach $x_\tau$ during process $\lambda_t$. Here local means that a quantity refers to a single microstate $x_\tau$ at time $\tau$, with the set of trajectories reaching it considered as an ensemble.

In this paper, we apply this conceptual framework of considering a single microstate as an ensemble of trajectories to the fluctuation theorem of information exchange (see Figure 1a). We show that the mutual information of a pair of correlated microstates encodes the amount of entropy production within the ensemble of paths that reach the correlated states. This local version of the fluctuation theorem of information exchange provides much more detailed information for each pair of correlated microstates compared to the results in [25,26]. In the existing approaches that consider the ensemble of all paths, the point-wise mutual information does not provide specific details on correlated microstates, but in this new approach of focusing on a subset of the ensemble, local mutual information provides detailed knowledge of particular correlated states.

Figure 1. Ensemble of conditioned paths and dynamic information exchange: (a) $\Gamma$ and $\Gamma_{x_\tau,y_\tau}$ denote, respectively, the set of all trajectories during process $\lambda_t$ for $0 \le t \le \tau$ and the set of paths that reach $(x_\tau,y_\tau)$ at time $\tau$. Red curves schematically represent some members of $\Gamma_{x_\tau,y_\tau}$. (b) A magnified view of a single trajectory from panel (a), showing the dynamic coupling of $(x_\tau,y_\tau)$ during process $\lambda_t$. The point-wise mutual information $I_t(x_t,y_t)$ may vary, not necessarily monotonically.

We organize the paper as follows: In Section 2, we briefly review some fluctuation theorems that we have mentioned. In Section 3, we prove the main theorem and its corollary. In Section 4, we provide illustrative examples, and in Section 5, we discuss the implication of the results.

2. Conditioned Nonequilibrium Work Relations and Sagawa–Ueda Fluctuation Theorem

We consider a system in contact with a heat bath of inverse temperature $\beta := 1/(k_B T)$, where $k_B$ is the Boltzmann constant and $T$ is the temperature of the heat bath. External parameter $\lambda_t$ drives the system away from equilibrium during $0 \le t \le \tau$. We assume that the initial probability distribution is the equilibrium one at control parameter $\lambda_0$. Let $\Gamma$ be the set of all microscopic trajectories, and $\Gamma_{x_\tau}$ be that of paths conditioned on $x_\tau$ at time $\tau$. Then, the Jarzynski equality [1] and the end-point conditioned version [27,28] of it read as follows:

$F_{eq}(\lambda_\tau) = F_{eq}(\lambda_0) - \frac{1}{\beta}\ln\left\langle e^{-\beta W}\right\rangle_{\Gamma}$ and (1)
$F(x_\tau,\tau) = F_{eq}(\lambda_0) - \frac{1}{\beta}\ln\left\langle e^{-\beta W}\right\rangle_{\Gamma_{x_\tau}}$, (2)

respectively, where $\langle\cdot\rangle_\Gamma$ denotes the average over all trajectories in $\Gamma$ and $\langle\cdot\rangle_{\Gamma_{x_\tau}}$ denotes the average over trajectories reaching $x_\tau$ at time $\tau$. Here $W$ is the work done on the system through $\lambda_t$, $F_{eq}(\lambda_t)$ is the equilibrium free energy at control parameter $\lambda_t$, and $F(x_\tau,\tau)$ is the local non-equilibrium free energy of $x_\tau$ at time $\tau$. Work measurement over a specific ensemble of paths gives us equilibrium free energy as a function of $\lambda_\tau$ through Equation (1), and local non-equilibrium free energy as a microstate function of $x_\tau$ at time $\tau$ through Equation (2). The following fluctuation theorem links Equations (1) and (2):

$\left\langle e^{-\beta F(x_\tau,\tau)}\right\rangle_{x_\tau} = e^{-\beta F_{eq}(\lambda_\tau)}$, (3)

where $\langle\cdot\rangle_{x_\tau}$ denotes the average over all microstates $x_\tau$ at time $\tau$ [27,28]. Defining the reverse process by $\lambda^\dagger_t := \lambda_{\tau-t}$ for $0 \le t \le \tau$, the Crooks fluctuation theorem [2] and the end-point conditioned version [27,28] of it read as follows:

$\frac{P_\Gamma(W)}{P^\dagger_\Gamma(-W)} = \exp\left(\frac{W - \Delta F_{eq}}{k_B T}\right)$ and (4)
$\frac{P_{\Gamma_{x_\tau}}(W)}{P^\dagger_{\Gamma_{x_\tau}}(-W)} = \exp\left(\frac{W - \Delta F}{k_B T}\right)$, (5)

respectively, where $P_\Gamma(W)$ and $P_{\Gamma_{x_\tau}}(W)$ are probability distributions of work $W$ normalized over all paths in $\Gamma$ and $\Gamma_{x_\tau}$, respectively. Here $P^\dagger$ indicates the corresponding probabilities for the reverse process. For Equation (4), the initial probability distribution of the reverse process is the equilibrium one at control parameter $\lambda_\tau$. On the other hand, for Equation (5), the initial probability distribution for the reverse process should be the non-equilibrium probability distribution $p(x_\tau,\tau)$ of the forward process at control parameter $\lambda_\tau$. By identifying the value of $W$ at which $P_\Gamma(W) = P^\dagger_\Gamma(-W)$, one obtains $\Delta F_{eq} := F_{eq}(\lambda_\tau) - F_{eq}(\lambda_0)$, the difference in equilibrium free energy between $\lambda_0$ and $\lambda_\tau$, through Equation (4) [9]. A similar identification may provide $\Delta F := F(x_\tau,\tau) - F_{eq}(\lambda_0)$ through Equation (5).
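To make the end-point conditioned averages concrete, the following minimal Python sketch estimates $F(x_\tau,\tau)$ from work samples via Equation (2) and checks Equation (3). The interface is ours, not from [27,28]: `W` and `x_tau` stand for measured work values and observed final microstates from repeated runs of a process, and all names are illustrative. Note that such exponential estimators are biased for small sample sizes.

```python
import numpy as np

def local_free_energy(W, x_tau, beta, F_eq0):
    """Estimate F(x, tau) for each final microstate x via Equation (2):
    F(x, tau) = F_eq(lambda_0) - (1/beta) ln < exp(-beta W) >_{Gamma_x}."""
    W, x_tau = np.asarray(W), np.asarray(x_tau)
    return {x: F_eq0 - np.log(np.exp(-beta * W[x_tau == x]).mean()) / beta
            for x in np.unique(x_tau)}

def check_equation_3(F_local, x_tau, beta):
    """Average exp(-beta F(x, tau)) over the empirical distribution of x_tau;
    by Equation (3) the result should equal F_eq(lambda_tau)."""
    x_tau = np.asarray(x_tau)
    p = {x: np.mean(x_tau == x) for x in F_local}
    return -np.log(sum(p[x] * np.exp(-beta * F_local[x]) for x in F_local)) / beta
```

Feeding the same samples to both functions recovers the equilibrium free energy $F_{eq}(\lambda_\tau)$ from the local ones, which is precisely the content of Equation (3).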

Now we turn to the Sagawa–Ueda fluctuation theorem of information exchange [25]. Specifically, we discuss the generalized version [26] of it. To this end, we consider two subsystems X and Y in the heat bath of inverse temperature β. During process λt, they interact and co-evolve with each other. Then, the fluctuation theorem of information exchange reads as follows:

$\left\langle e^{-\sigma + \Delta I}\right\rangle_\Gamma = 1$, (6)

where brackets indicate the ensemble average over all paths of the combined subsystems, and σ is the sum of entropy production of system X, system Y, and the heat bath, and ΔI is the change in mutual information between X and Y. We note that in the original version of the Sagawa–Ueda fluctuation theorem, only system X is in contact with the heat bath and Y does not evolve during the process [25,26]. In this paper, we prove an end-point conditioned version of Equation (6):

$I_\tau(x_\tau,y_\tau) = -\ln\left\langle e^{-(\sigma + I_0)}\right\rangle_{x_\tau,y_\tau}$, (7)

where brackets indicate the ensemble average over all paths to $x_\tau$ and $y_\tau$ at time $\tau$, and $I_t$ ($0 \le t \le \tau$) is the local form of mutual information between microstates of X and Y at time $t$ (see Figure 1b). If there is no initial correlation, i.e., $I_0 = 0$, Equation (7) clearly indicates that local mutual information $I_\tau$ as a function of correlated microstates $(x_\tau,y_\tau)$ encodes the entropy production $\sigma$ within the end-point conditioned ensemble of paths. In the same vein, we may interpret initial correlation $I_0$ as entropy production encoded during the preparation of the initial condition.

3. Results

3.1. Theoretical Framework

Let X and Y be finite classical stochastic systems in a heat bath of inverse temperature $\beta$. We allow external parameter $\lambda_t$ to drive one or both subsystems away from equilibrium during $0 \le t \le \tau$ [29,30,31]. We assume that classical stochastic dynamics describes the time evolution of X and Y during process $\lambda_t$ along trajectories $\{x_t\}$ and $\{y_t\}$, respectively, where $x_t$ ($y_t$) denotes a specific microstate of X (Y) at time $t$ for $0 \le t \le \tau$ on each trajectory. Since trajectories fluctuate, we repeat process $\lambda_t$ with initial joint probability distribution $P_0(x,y)$ over all microstates $(x,y)$ of systems X and Y. The subsystems then generate a joint probability distribution $P_t(x,y)$ for $0 \le t \le \tau$. Let $P_t(x) := \int P_t(x,y)\,dy$ and $P_t(y) := \int P_t(x,y)\,dx$ be the corresponding marginal probability distributions. We assume

$P_0(x,y) \neq 0 \quad \text{for all } (x,y)$, (8)

so that we have $P_t(x,y) \neq 0$, $P_t(x) \neq 0$, and $P_t(y) \neq 0$ for all $x$ and $y$ during $0 \le t \le \tau$. Now we consider the entropy production $\sigma$ of system X along $\{x_t\}$, of system Y along $\{y_t\}$, and of the heat bath through dissipated heat $Q_b$ during process $\lambda_t$ for $0 \le t \le \tau$ as follows:

$\sigma := \Delta s + \beta Q_b$, (9)

where

$\Delta s := \Delta s_x + \Delta s_y, \quad \Delta s_x := -\ln P_\tau(x_\tau) + \ln P_0(x_0), \quad \Delta s_y := -\ln P_\tau(y_\tau) + \ln P_0(y_0)$. (10)

We remark that Equation (10) is different from the change of stochastic entropy of the combined super-system composed of X and Y, which reads $-\ln P_\tau(x_\tau,y_\tau) + \ln P_0(x_0,y_0)$ and reduces to Equation (10) if processes $\{x_t\}$ and $\{y_t\}$ are independent. The discrepancy leaves room for the correlation of Equation (11) below [25]. Here the stochastic entropy $s[P_t(\circ)] := -\ln P_t(\circ)$ of microstate $\circ$ at time $t$ is the uncertainty of $\circ$ at time $t$: the more uncertain the occurrence of microstate $\circ$, the greater its stochastic entropy. We also note that in [25], system X was in contact with the heat reservoir but system Y was not, nor did system Y evolve; thus their entropy production reads $\sigma_{su} := \Delta s_x + \beta Q_b$.

Now we assume that, during process $\lambda_t$, system X exchanges information with system Y. By this, we mean that trajectory $\{x_t\}$ of system X evolves depending on trajectory $\{y_t\}$ of system Y (see Figure 1b). Then, the local form of mutual information $I_t$ at time $t$ between $x_t$ and $y_t$ is the reduction of uncertainty of $x_t$ due to given $y_t$ [25]:

$I_t(x_t,y_t) := s[P_t(x_t)] - s[P_t(x_t|y_t)] = \ln\frac{P_t(x_t,y_t)}{P_t(x_t)P_t(y_t)}$, (11)

where $P_t(x_t|y_t)$ is the conditional probability distribution of $x_t$ given $y_t$. The more information is shared between $x_t$ and $y_t$ for their occurrence, the larger the value of $I_t(x_t,y_t)$. We note that if $x_t$ and $y_t$ are independent at time $t$, then $I_t(x_t,y_t)$ vanishes. The average of $I_t(x_t,y_t)$ with respect to $P_t(x_t,y_t)$ over all microstates is the mutual information between the two subsystems, which is greater than or equal to zero [32].
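As a concrete illustration of Equation (11), the short Python sketch below computes the local (point-wise) mutual information for every pair $(x,y)$ from a joint distribution given as a matrix; the example distribution is arbitrary and chosen only for illustration.

```python
import numpy as np

def local_mutual_information(P):
    """Point-wise mutual information I(x, y) = ln[ P(x,y) / (P(x) P(y)) ]
    for every pair (x, y), given a joint distribution P as a 2-D array."""
    Px = P.sum(axis=1, keepdims=True)   # marginal of X (rows)
    Py = P.sum(axis=0, keepdims=True)   # marginal of Y (columns)
    return np.log(P / (Px * Py))

# Example: an arbitrary correlated joint distribution.
P = np.array([[0.3, 0.1],
              [0.1, 0.5]])
I = local_mutual_information(P)
print(I)                      # local values can be negative
print((P * I).sum())          # the average is the mutual information, >= 0
```

The printed average is the ordinary mutual information, which is non-negative even though individual local values may be negative.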

3.2. Proof of the Fluctuation Theorem of Information Exchange Conditioned on Correlated Microstates

Now we are ready to prove the fluctuation theorem of information exchange conditioned on correlated microstates. We define the reverse process $\lambda^\dagger_t := \lambda_{\tau-t}$ for $0 \le t \le \tau$, where the external parameter is time-reversed [33,34]. The initial probability distribution $P^\dagger_0(x,y)$ for the reverse process should be the final probability distribution of the forward process, $P_\tau(x,y)$, so that we have

$P^\dagger_0(x) = \int P^\dagger_0(x,y)\,dy = \int P_\tau(x,y)\,dy = P_\tau(x), \quad P^\dagger_0(y) = \int P^\dagger_0(x,y)\,dx = \int P_\tau(x,y)\,dx = P_\tau(y)$. (12)

Then, by Equation (8), we have $P^\dagger_t(x,y) \neq 0$, $P^\dagger_t(x) \neq 0$, and $P^\dagger_t(y) \neq 0$ for all $x$ and $y$ during $0 \le t \le \tau$. For each pair of trajectories $\{x_t\}$ and $\{y_t\}$ for $0 \le t \le \tau$, we define the time-reversed conjugates as follows:

$\{x_t\}^\dagger := \{x^*_{\tau-t}\}, \quad \{y_t\}^\dagger := \{y^*_{\tau-t}\}$, (13)

where $*$ denotes momentum reversal. Let $\Gamma$ be the set of all pairs of trajectories $\{x_t\}$ and $\{y_t\}$, and $\Gamma_{x_\tau,y_\tau}$ be that of trajectories conditioned on correlated microstates $(x_\tau,y_\tau)$ at time $\tau$. Due to the time-reversal symmetry of the underlying microscopic dynamics, the set $\Gamma^\dagger$ of all time-reversed trajectories is identical to $\Gamma$, and the set $\Gamma^\dagger_{x^\dagger_0,y^\dagger_0}$ of time-reversed trajectories conditioned on $(x^\dagger_0,y^\dagger_0)$ is identical to $\Gamma_{x_\tau,y_\tau}$. Thus we may use the same notation for both forward and backward pairs. We note that the path probabilities $P_\Gamma$ and $P_{\Gamma_{x_\tau,y_\tau}}$ are normalized over all paths in $\Gamma$ and $\Gamma_{x_\tau,y_\tau}$, respectively (see Figure 1a). With this notation, the microscopic reversibility condition, which enables us to connect the probabilities of forward and reverse paths to the dissipated heat, reads as follows [2,35,36,37]:

$\frac{P_\Gamma(\{x_t\},\{y_t\}|x_0,y_0)}{P^\dagger_\Gamma(\{x_t\}^\dagger,\{y_t\}^\dagger|x^\dagger_0,y^\dagger_0)} = e^{\beta Q_b}$, (14)

where $P_\Gamma(\{x_t\},\{y_t\}|x_0,y_0)$ is the conditional joint probability distribution of paths $\{x_t\}$ and $\{y_t\}$ conditioned on initial microstates $x_0$ and $y_0$, and $P^\dagger_\Gamma(\{x_t\}^\dagger,\{y_t\}^\dagger|x^\dagger_0,y^\dagger_0)$ is that for the reverse process. Now we restrict our attention to those paths that are in $\Gamma_{x_\tau,y_\tau}$, and divide both the numerator and the denominator of the left-hand side of Equation (14) by $P_\tau(x_\tau,y_\tau)$. Since $P_\tau(x_\tau,y_\tau)$ is identical to $P^\dagger_0(x^\dagger_0,y^\dagger_0)$, Equation (14) becomes as follows:

$\frac{P_{\Gamma_{x_\tau,y_\tau}}(\{x_t\},\{y_t\}|x_0,y_0)}{P^\dagger_{\Gamma_{x_\tau,y_\tau}}(\{x_t\}^\dagger,\{y_t\}^\dagger|x^\dagger_0,y^\dagger_0)} = e^{\beta Q_b}$, (15)

since the probability of paths is now normalized over Γxτ,yτ. Then we have the following:

$\frac{P_{\Gamma_{x_\tau,y_\tau}}(\{x_t\},\{y_t\})}{P^\dagger_{\Gamma_{x_\tau,y_\tau}}(\{x_t\}^\dagger,\{y_t\}^\dagger)} = \frac{P_{\Gamma_{x_\tau,y_\tau}}(\{x_t\},\{y_t\}|x_0,y_0)}{P^\dagger_{\Gamma_{x_\tau,y_\tau}}(\{x_t\}^\dagger,\{y_t\}^\dagger|x^\dagger_0,y^\dagger_0)} \cdot \frac{P_0(x_0,y_0)}{P^\dagger_0(x^\dagger_0,y^\dagger_0)}$ (16)
$= \frac{P_{\Gamma_{x_\tau,y_\tau}}(\{x_t\},\{y_t\}|x_0,y_0)}{P^\dagger_{\Gamma_{x_\tau,y_\tau}}(\{x_t\}^\dagger,\{y_t\}^\dagger|x^\dagger_0,y^\dagger_0)} \cdot \frac{P_0(x_0,y_0)}{P_0(x_0)P_0(y_0)} \cdot \frac{P^\dagger_0(x^\dagger_0)P^\dagger_0(y^\dagger_0)}{P^\dagger_0(x^\dagger_0,y^\dagger_0)} \times \frac{P_0(x_0)}{P^\dagger_0(x^\dagger_0)} \cdot \frac{P_0(y_0)}{P^\dagger_0(y^\dagger_0)}$ (17)
$= \exp\{\beta Q_b - I_\tau(x_\tau,y_\tau) + I_0(x_0,y_0) + \Delta s_x + \Delta s_y\}$ (18)
$= \exp\{\sigma - I_\tau(x_\tau,y_\tau) + I_0(x_0,y_0)\}$. (19)

To obtain Equation (17) from Equation (16), we multiply Equation (16) by $\frac{P_0(x_0)P_0(y_0)}{P_0(x_0)P_0(y_0)}$ and $\frac{P^\dagger_0(x^\dagger_0)P^\dagger_0(y^\dagger_0)}{P^\dagger_0(x^\dagger_0)P^\dagger_0(y^\dagger_0)}$, which are 1. We obtain Equation (18) by applying Equations (10)–(12) and (15) to Equation (17). Finally, we use Equation (9) to obtain Equation (19) from Equation (18). Now we multiply both sides of Equation (19) by $e^{-(\sigma + I_0)}$ and $P^\dagger_{\Gamma_{x_\tau,y_\tau}}(\{x_t\}^\dagger,\{y_t\}^\dagger)$, and integrate over all paths in $\Gamma_{x_\tau,y_\tau}$ to obtain the fluctuation theorem of information exchange conditioned on correlated microstates:

$\left\langle e^{-(\sigma+I_0)}\right\rangle_{x_\tau,y_\tau} := \int_{\{x_t\},\{y_t\}\in\Gamma_{x_\tau,y_\tau}} e^{-(\sigma+I_0)}\,P_{\Gamma_{x_\tau,y_\tau}}(\{x_t\},\{y_t\})\,d\{x_t\}\,d\{y_t\} = \int_{\{x_t\},\{y_t\}\in\Gamma_{x_\tau,y_\tau}} e^{-I_\tau(x_\tau,y_\tau)}\,P^\dagger_{\Gamma_{x_\tau,y_\tau}}(\{x_t\}^\dagger,\{y_t\}^\dagger)\,d\{x_t\}^\dagger\,d\{y_t\}^\dagger = e^{-I_\tau(x_\tau,y_\tau)}\int_{\{x_t\},\{y_t\}\in\Gamma_{x_\tau,y_\tau}} P^\dagger_{\Gamma_{x_\tau,y_\tau}}(\{x_t\}^\dagger,\{y_t\}^\dagger)\,d\{x_t\}^\dagger\,d\{y_t\}^\dagger = e^{-I_\tau(x_\tau,y_\tau)}$. (20)

Here we use the fact that $e^{-I_\tau(x_\tau,y_\tau)}$ is constant for all paths in $\Gamma_{x_\tau,y_\tau}$, that probability distribution $P^\dagger_{\Gamma_{x_\tau,y_\tau}}$ is normalized over all paths in $\Gamma_{x_\tau,y_\tau}$, and that $d\{x_t\} = d\{x_t\}^\dagger$ and $d\{y_t\} = d\{y_t\}^\dagger$ due to the time-reversal symmetry [38]. Equation (20) clearly shows that just as local free energy encodes work [27] and local entropy encodes heat [28], the local form of mutual information between correlated microstates $(x_\tau,y_\tau)$ encodes the entropy production within the ensemble of paths that reach each microstate. The following corollary provides more information on entropy production in terms of energetic costs.
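Equation (20) can also be checked numerically on a small model. The following Python sketch simulates a hypothetical pair of coupled two-state systems under single-flip Metropolis dynamics, whose per-step detailed balance is a discrete-time analogue of the microscopic reversibility condition (14). It propagates the joint distribution exactly for the marginals in Equation (10) and compares $\langle e^{-(\sigma+I_0)}\rangle_{x_\tau,y_\tau}$ with $e^{-I_\tau(x_\tau,y_\tau)}$ for each final pair; the energy function, protocol, and parameters are invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
beta, steps = 1.0, 5
lams = np.linspace(0.0, 1.0, steps + 1)

def energy(lam):
    # Hypothetical driven energy landscape with an x-y coupling term;
    # joint state s = 2*x + y with x, y in {0, 1}.
    return np.array([lam * x + 0.5 * y - 1.2 * lam * x * y
                     for x in (0, 1) for y in (0, 1)])

def transition(E):
    # Single-flip Metropolis matrix T[s', s]; detailed balance w.r.t. E.
    T = np.zeros((4, 4))
    for s in range(4):
        x, y = divmod(s, 2)
        for s2 in (2 * (1 - x) + y, 2 * x + (1 - y)):  # flip x or flip y
            T[s2, s] = 0.5 * min(1.0, np.exp(-beta * (E[s2] - E[s])))
        T[s, s] = 1.0 - T[:, s].sum()
    return T

def marginals(P):
    return np.array([P[0] + P[1], P[2] + P[3]]), np.array([P[0] + P[2], P[1] + P[3]])

P0 = np.array([0.35, 0.15, 0.10, 0.40])       # correlated initial distribution
Ps = [P0]                                     # exact joint distribution P_t
for k in range(steps):
    Ps.append(transition(energy(lams[k + 1])) @ Ps[-1])

n = 200_000
s = rng.choice(4, size=n, p=P0)
s0, Qb = s.copy(), np.zeros(n)
for k in range(steps):
    E = energy(lams[k + 1])                   # work step: lambda changes, state fixed
    cum = np.cumsum(transition(E), axis=0)
    cum[-1] = 1.0                             # guard against roundoff
    s_new = (rng.random(n)[None, :] < cum[:, s]).argmax(axis=0)
    Qb += E[s] - E[s_new]                     # heat released to the bath during the jump
    s = s_new

Px0, Py0 = marginals(P0)
Pxt, Pyt = marginals(Ps[-1])
x0, y0 = np.divmod(s0, 2)
xt, yt = np.divmod(s, 2)
dsx = np.log(Px0[x0]) - np.log(Pxt[xt])       # Delta s_x, Equation (10)
dsy = np.log(Py0[y0]) - np.log(Pyt[yt])       # Delta s_y, Equation (10)
I0 = np.log(P0[s0] / (Px0[x0] * Py0[y0]))     # local mutual information at t = 0
sigma = dsx + dsy + beta * Qb                 # Equation (9)

for sf in range(4):
    lhs = np.exp(-(sigma + I0))[s == sf].mean()
    xf, yf = divmod(sf, 2)
    rhs = (Pxt[xf] * Pyt[yf]) / Ps[-1][sf]    # e^{-I_tau(x_tau, y_tau)}
    print(f"(x,y)=({xf},{yf}): <e^-(sigma+I0)> = {lhs:.3f}, e^-I_tau = {rhs:.3f}")
```

The two printed columns agree within sampling error for every final pair, which is the content of Equation (20) for this toy model.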

3.3. Corollary

To discuss entropy production in terms of energetic costs, we define local free energy Fx of xt and Fy of yt at control parameter λt as follows:

$F_x(x_t,t) := E_x(x_t,t) - k_B T\, s[P_t(x_t)], \quad F_y(y_t,t) := E_y(y_t,t) - k_B T\, s[P_t(y_t)]$, (21)

where $T$ is the temperature of the heat bath, $k_B$ is the Boltzmann constant, $E_x$ and $E_y$ are internal energies of systems X and Y, respectively, and $s[P_t(\circ)] := -\ln P_t(\circ)$ is stochastic entropy [2,3]. Work done on either one or both systems through process $\lambda_t$ is expressed by the first law of thermodynamics as follows:

$W := \Delta E + Q_b$, (22)

where $\Delta E$ is the change in internal energy of the total system composed of X and Y. If we assume that systems X and Y are weakly coupled, in that the interaction energy between X and Y is negligible compared to the internal energies of X and Y, we may have

$\Delta E := \Delta E_x + \Delta E_y$, (23)

where $\Delta E_x := E_x(x_\tau,\tau) - E_x(x_0,0)$ and $\Delta E_y := E_y(y_\tau,\tau) - E_y(y_0,0)$ [39]. We rewrite Equation (18) by adding and subtracting the changes in internal energy $\Delta E_x$ of X and $\Delta E_y$ of Y as follows:

$\frac{P_{\Gamma_{x_\tau,y_\tau}}(\{x_t\},\{y_t\})}{P^\dagger_{\Gamma_{x_\tau,y_\tau}}(\{x_t\}^\dagger,\{y_t\}^\dagger)} = \exp\{\beta(Q_b + \Delta E_x + \Delta E_y) - \beta\Delta E_x + \Delta s_x - \beta\Delta E_y + \Delta s_y\} \times \exp\{-I_\tau(x_\tau,y_\tau) + I_0(x_0,y_0)\}$ (24)
$= \exp\{\beta(W - \Delta F_x - \Delta F_y) - I_\tau(x_\tau,y_\tau) + I_0(x_0,y_0)\}$, (25)

where we have applied Equations (21)–(23) consecutively to Equation (24) to obtain Equation (25). Here $\Delta F_x := F_x(x_\tau,\tau) - F_x(x_0,0)$ and $\Delta F_y := F_y(y_\tau,\tau) - F_y(y_0,0)$. Now we multiply both sides of Equation (25) by $e^{-\beta(W - \Delta F_x - \Delta F_y) - I_0}$ and $P^\dagger_{\Gamma_{x_\tau,y_\tau}}(\{x_t\}^\dagger,\{y_t\}^\dagger)$, and integrate over all paths in $\Gamma_{x_\tau,y_\tau}$ to obtain the following:

$\left\langle e^{-\beta(W - \Delta F_x - \Delta F_y) - I_0}\right\rangle_{x_\tau,y_\tau} := \int_{\{x_t\},\{y_t\}\in\Gamma_{x_\tau,y_\tau}} e^{-\beta(W - \Delta F_x - \Delta F_y) - I_0}\,P_{\Gamma_{x_\tau,y_\tau}}(\{x_t\},\{y_t\})\,d\{x_t\}\,d\{y_t\} = \int_{\{x_t\},\{y_t\}\in\Gamma_{x_\tau,y_\tau}} e^{-I_\tau(x_\tau,y_\tau)}\,P^\dagger_{\Gamma_{x_\tau,y_\tau}}(\{x_t\}^\dagger,\{y_t\}^\dagger)\,d\{x_t\}^\dagger\,d\{y_t\}^\dagger = e^{-I_\tau(x_\tau,y_\tau)}$, (26)

which generalizes known relations in the literature [24,39,40,41,42,43]. We note that Equation (26) holds under the weak-coupling assumption between systems X and Y during process $\lambda_t$, and that $\Delta F_x + \Delta F_y$ in Equation (26) is a difference in non-equilibrium free energy, which is different from the change in equilibrium free energy that appears in similar relations in the literature [24,40,41,42,43]. If there is no initial correlation, i.e., $I_0 = 0$, Equation (26) indicates that local mutual information $I_\tau$ as a state function of correlated microstates $(x_\tau,y_\tau)$ encodes the entropy production $\beta(W - \Delta F_x - \Delta F_y)$ within the ensemble of paths in $\Gamma_{x_\tau,y_\tau}$. In the same vein, we may interpret initial correlation $I_0$ as entropy production encoded during the preparation of the initial condition.

In [25], the authors showed that the entropy of X can be decreased without any heat flow due to a negative change in mutual information, under the assumption that one of the two systems does not evolve in time. Equation (20) implies that a negative change in mutual information can decrease the entropy of X and that of Y simultaneously without any heat flow, through the following:

$\left\langle \Delta s_x + \Delta s_y \right\rangle_{x_\tau,y_\tau} \geq \Delta I_\tau(x_\tau,y_\tau)$, (27)

provided $\langle Q_b \rangle_{x_\tau,y_\tau} = 0$. Here $\Delta I_\tau(x_\tau,y_\tau) := I_\tau(x_\tau,y_\tau) - \langle I_0(x_0,y_0) \rangle_{x_\tau,y_\tau}$, where the average of $I_0$ is taken over the initial points $(x_0,y_0)$ of the paths in $\Gamma_{x_\tau,y_\tau}$. In terms of energetics, Equation (26) implies that a negative change in mutual information can increase the free energy of X and that of Y simultaneously without any external supply of energy, through the following:

$-\Delta I_\tau(x_\tau,y_\tau) \geq \beta\left\langle \Delta F_x + \Delta F_y \right\rangle_{x_\tau,y_\tau}$, (28)

provided $\langle W \rangle_{x_\tau,y_\tau} = 0$.
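Both inequalities follow from Equations (20) and (26) via Jensen's inequality, $\langle e^{-A}\rangle \geq e^{-\langle A\rangle}$; we spell out the short steps here since the main text applies them implicitly. From Equation (20),

$e^{-I_\tau(x_\tau,y_\tau)} = \left\langle e^{-(\sigma+I_0)}\right\rangle_{x_\tau,y_\tau} \geq e^{-\langle \sigma + I_0 \rangle_{x_\tau,y_\tau}} \;\Rightarrow\; \langle \sigma \rangle_{x_\tau,y_\tau} + \langle I_0 \rangle_{x_\tau,y_\tau} \geq I_\tau(x_\tau,y_\tau),$

and substituting $\sigma = \Delta s_x + \Delta s_y + \beta Q_b$ with $\langle Q_b \rangle_{x_\tau,y_\tau} = 0$ yields Equation (27). Applying the same step to Equation (26) gives $\beta\langle W - \Delta F_x - \Delta F_y \rangle_{x_\tau,y_\tau} \geq I_\tau(x_\tau,y_\tau) - \langle I_0 \rangle_{x_\tau,y_\tau} = \Delta I_\tau(x_\tau,y_\tau)$, which reduces to Equation (28) when $\langle W \rangle_{x_\tau,y_\tau} = 0$.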

4. Examples

4.1. A Simple One

Let X and Y be two systems that weakly interact with each other, in contact with a heat bath of inverse temperature $\beta$. We may think of X and Y, for example, as bio-molecules that interact with each other, or of X as a device that measures the state of another system and Y as the measured system. We consider a dynamic coupling process as follows: initially, X and Y are separately in equilibrium such that the initial correlation $I_0(x_0,y_0)$ is zero for all $x_0$ and $y_0$. At time $t=0$, system X starts a (weak) interaction with system Y that lasts until time $t=\tau$. During the coupling process, external parameter $\lambda_t$ for $0 \le t \le \tau$ may exchange work with either one or both systems (see Figure 1b). Since each process fluctuates, we repeat the process many times to obtain probability distribution $P_t(x,y)$ for $0 \le t \le \tau$. We allow both systems to co-evolve interactively, and thus $I_t(x_t,y_t)$ may vary, not necessarily monotonically. Let us assume that the final probability distribution $P_\tau(x_\tau,y_\tau)$ is as shown in Table 1.

Table 1. The joint probability distribution of x and y at final time τ. Here we assume that both systems X and Y have three states: 0, 1, and 2.

X\Y	0	1	2
0	1/6	1/9	1/18
1	1/18	1/6	1/9
2	1/9	1/18	1/6

Then, a few representative local mutual information values read as follows:

$I_\tau(x_\tau{=}0, y_\tau{=}0) = \ln\frac{1/6}{(1/3)\cdot(1/3)} = \ln(3/2), \quad I_\tau(x_\tau{=}0, y_\tau{=}1) = \ln\frac{1/9}{(1/3)\cdot(1/3)} = 0, \quad I_\tau(x_\tau{=}0, y_\tau{=}2) = \ln\frac{1/18}{(1/3)\cdot(1/3)} = \ln(1/2)$. (29)

By Jensen’s inequality [32], Equation (20) implies

$\langle \sigma \rangle_{x_\tau,y_\tau} \geq I_\tau(x_\tau,y_\tau)$. (30)

Thus coupling $x_\tau{=}0, y_\tau{=}0$ accompanies, on average, entropy production of at least $\ln(3/2)$, which is greater than 0. Coupling $x_\tau{=}0, y_\tau{=}1$ may produce no entropy on average. Coupling $x_\tau{=}0, y_\tau{=}2$ may, on average, produce negative entropy as low as $\ln(1/2) = -\ln 2$. These three individual inequalities provide more detailed information than $\langle\sigma\rangle_\Gamma \geq \langle I_\tau(x_\tau,y_\tau)\rangle_\Gamma \approx 0.0872$, which is what is currently available from [25,26].
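The numbers in Equation (29), and the ensemble average quoted above, can be reproduced directly from Table 1 with a few lines of Python (the computation is the same point-wise form of Equation (11) sketched in Section 3.1):

```python
import numpy as np

# Joint distribution of Table 1; rows are x, columns are y.
P = np.array([[1/6, 1/9, 1/18],
              [1/18, 1/6, 1/9],
              [1/9, 1/18, 1/6]])
Px = P.sum(axis=1, keepdims=True)   # row marginals: each 1/3
Py = P.sum(axis=0, keepdims=True)   # column marginals: each 1/3
I = np.log(P / (Px * Py))           # point-wise mutual information, Equation (11)

print(I[0, 0], np.log(3/2))         # ln(3/2), Equation (29)
print(I[0, 1])                      # 0
print(I[0, 2], np.log(1/2))         # ln(1/2)
print((P * I).sum())                # average mutual information ~ 0.0872
```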

4.2. A “Tape-Driven” Biochemical Machine

In [44], McGrath et al. proposed a physically realizable device that exploits or creates mutual information, depending on system parameters. The system is composed of an enzyme E in a chemical bath, interacting with a tape that is decorated with a series of pairs of molecules (see Figure 2a). Each pair is composed of a substrate molecule X (or its phosphorylated form X*) and an activator Y of the enzyme (or $\bar{Y}$, which denotes the absence of Y). The binding of molecule Y to E converts the enzyme into its active mode E*, which catalyzes phosphate exchange between ATP and X:

$X + \text{ATP} + E^* \rightleftharpoons E^*\text{-}X\text{-ADP-P}_i \rightleftharpoons E^* + X^* + \text{ADP}$. (31)

Figure 2. Analysis of a “tape-driven” biochemical machine: (a) a schematic illustration of enzyme E and pairs of X (X*) and Y ($\bar{Y}$) in the chemical bath containing ATP and ADP. (b) The graph of $\langle I_t\rangle_\Gamma$ as a function of time $t$, showing the non-monotonicity of $\langle I_t\rangle_\Gamma$. (c) The graph of $I_t(X,Y)$, which decreases monotonically and corresponds to trajectories that harness mutual information to work against the chemical bath. (d) The graph of $I_t(X^*,Y)$, which increases monotonically and corresponds to paths that create mutual information between X* and Y.

The tape is prepared in a correlated manner through a single parameter Ψ:

$p_0(\bar{Y}|X^*) = p_0(Y|X) = \Psi, \quad p_0(Y|X^*) = p_0(\bar{Y}|X) = 1 - \Psi$. (32)
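For intuition, if one additionally assumes equal marginal abundances of X and X* on the tape (an assumption made here only for illustration; the general setup in [44] need not satisfy it), the initial local mutual information of each tape pair follows from Equations (11) and (32) in closed form, as the short sketch below computes.

```python
import numpy as np

def tape_local_info(psi, p_xstar=0.5):
    """Point-wise mutual information of each tape pair under Equation (32),
    assuming (for illustration only) a tape marginal P(X*) = p_xstar."""
    # Joint distribution over rows (X, X*) and columns (Y, Ybar).
    P = np.array([[(1 - p_xstar) * psi,     (1 - p_xstar) * (1 - psi)],
                  [p_xstar * (1 - psi),     p_xstar * psi]])
    Px = P.sum(axis=1, keepdims=True)
    Py = P.sum(axis=0, keepdims=True)
    return np.log(P / (Px * Py))

# With p_xstar = 0.5 the Y marginal is uniform and, e.g., I0(X, Y) = ln(2 psi),
# so pairs are positively correlated for psi > 0.5 and anti-correlated below.
print(tape_local_info(0.69)[0, 0], np.log(2 * 0.69))
```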

If $\Psi < 0.5$, pairs of Y and X* are abundant, so that the interaction of enzyme E with molecule Y activates the enzyme, driving the catalytic reaction of Equation (31) from right to left and producing ATP from ADP. If the bath is prepared such that [ATP] > [ADP], the reaction corresponds to work done on the chemical bath against the concentration gradient. Note that this interaction converts X* to X, which reduces the initial correlation between X* and Y, resulting in the conversion of mutual information into work. If E interacts with a pair of $\bar{Y}$ and X, which is also abundant for $\Psi < 0.5$, the enzyme remains inactive due to the absence of Y, preventing the reaction of Equation (31) from proceeding from left to right; this acts as a ratchet that blocks the conversion of X and ATP to X* and ADP, which might otherwise happen due to the concentration gradient of the bath.

On the other hand, if $\Psi > 0.5$, pairs of Y and X are abundant, which allows the enzyme to convert X into X* using the pressure of the chemical bath, creating correlation between Y and X*. If E interacts with a pair of $\bar{Y}$ and X*, which is also abundant for $\Psi > 0.5$, the enzyme is again inactive, preventing the de-phosphorylation of X* and keeping the created correlation. In this regime, the net effect is the conversion of work (due to the chemical gradient of the bath) into mutual information. The concentrations of ATP and ADP in the chemical bath are adjusted via $\alpha \in (-1,1)$ such that

$[\text{ATP}] = 1 + \alpha \quad\text{and}\quad [\text{ADP}] = 1 - \alpha$ (33)

relative to a reference concentration C0. For the analysis of various regimes of different parameters, we refer the reader to [44].

In this example, we concentrate on the case with $\alpha = 0.99$ and $\Psi = 0.69$, to which Ref. [44] pays special attention. They analyzed the dynamics of mutual information $\langle I_t\rangle_\Gamma$ during $10^{-2} \le t \le 10^2$. Due to the high initial correlation, the enzyme converts the mutual information between X* and Y into work against the pressure of the chemical bath with [ATP] > [ADP]. As the reactions proceed, the correlation $\langle I_t\rangle_\Gamma$ drops until it reaches its minimum of zero. Then the reaction is eventually inverted, and the bath begins to do work, creating mutual information between X* and Y, as shown in Figure 2b.

We split the ensemble $\Gamma^t$ of paths into $\Gamma^t_{X,Y}$, composed of trajectories reaching $(X,Y)$ at each time $t$, and $\Gamma^t_{X^*,Y}$, composed of those reaching $(X^*,Y)$ at time $t$. Then we calculate $I_t(X,Y)$ and $I_t(X^*,Y)$ using the analytic form of the probability distributions derived in [44]. Figure 2c,d show $I_t(X,Y)$ and $I_t(X^*,Y)$, respectively, as functions of time $t$. During the whole process, $I_t(X,Y)$ decreases monotonically. For $10^{-2} \le t \le 10^{1/3}$ it remains positive, and after that it becomes negative, which is possible for local mutual information. Trajectories in $\Gamma^t_{X,Y}$ harness the mutual information between X* and Y, converting X* to X and ADP to ATP against the chemical bath. In contrast, $I_t(X^*,Y)$ increases monotonically: the members of $\Gamma^t_{X^*,Y}$ create mutual information between X* and Y by converting X to X* using the excess ATP in the chemical bath. The effect accumulates, and $I_t(X^*,Y)$ turns from negative to positive after $t > 10^{1/3}$.

5. Conclusions

We have proved the fluctuation theorem of information exchange conditioned on correlated microstates, Equation (20), and its corollary, Equation (26). These theorems make it clear that local mutual information, as a state function of correlated states, encodes the entropy production within an ensemble of paths that reach those correlated states. Equation (20) also reproduces a lower bound on entropy production, Equation (30), within a subset of path ensembles, which provides more detailed information than the fluctuation theorem involving the ensemble of all paths. Equation (26) gives the exact relationship between work, non-equilibrium free energy, and mutual information. This end-point conditioned version of the theorem also provides more detailed information on the energetics of coupling than current approaches in the literature. This robust framework may be useful for analyzing the thermodynamics of dynamic molecular information processes [44,45,46] and dynamic allosteric transitions [47,48].

Funding

L.J. was supported by the National Research Foundation of Korea Grant funded by the Korean Government (NRF-2010-0006733, NRF-2012R1A1A2042932, NRF-2016R1D1A1B02011106), and in part by Kwangwoon University Research Grant in 2017.

Conflicts of Interest

The author declares no conflict of interest.

References

  • 1.Jarzynski C. Nonequilibrium equality for free energy differences. Phys. Rev. Lett. 1997;78:2690–2693. doi: 10.1103/PhysRevLett.78.2690. [DOI] [Google Scholar]
  • 2.Crooks G.E. Entropy production fluctuation theorem and the nonequilibrium work relation for free energy differences. Phys. Rev. E. 1999;60:2721–2726. doi: 10.1103/PhysRevE.60.2721. [DOI] [PubMed] [Google Scholar]
  • 3.Seifert U. Entropy production along a stochastic trajectory and an integral fluctuation theorem. Phys. Rev. Lett. 2005;95:040602. doi: 10.1103/PhysRevLett.95.040602. [DOI] [PubMed] [Google Scholar]
  • 4.Hatano T., Sasa S.I. Steady-state thermodynamics of Langevin systems. Phys. Rev. Lett. 2001;86:3463–3466. doi: 10.1103/PhysRevLett.86.3463. [DOI] [PubMed] [Google Scholar]
  • 5.Hummer G., Szabo A. Free energy reconstruction from nonequilibrium single-molecule pulling experiments. Proc. Natl. Acad. Sci. USA. 2001;98:3658–3661. doi: 10.1073/pnas.071034098. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 6.Liphardt J., Onoa B., Smith S.B., Tinoco I., Bustamante C. Reversible unfolding of single RNA molecules by mechanical force. Science. 2001;292:733–737. doi: 10.1126/science.1058498. [DOI] [PubMed] [Google Scholar]
  • 7.Liphardt J., Dumont S., Smith S., Tinoco I., Jr., Bustamante C. Equilibrium information from nonequilibrium measurements in an experimental test of Jarzynski’s equality. Science. 2002;296:1832–1835. doi: 10.1126/science.1071152. [DOI] [PubMed] [Google Scholar]
  • 8.Trepagnier E.H., Jarzynski C., Ritort F., Crooks G.E., Bustamante C.J., Liphardt J. Experimental test of Hatano and Sasa’s nonequilibrium steady-state equality. Proc. Natl. Acad. Sci. USA. 2004;101:15038–15041. doi: 10.1073/pnas.0406405101. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 9.Collin D., Ritort F., Jarzynski C., Smith S.B., Tinoco I., Bustamante C. Verification of the Crooks fluctuation theorem and recovery of RNA folding free energies. Nature. 2005;437:231–234. doi: 10.1038/nature04061. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 10.Alemany A., Mossa A., Junier I., Ritort F. Experimental free-energy measurements of kinetic molecular states using fluctuation theorems. Nat. Phys. 2012;8:688–694. doi: 10.1038/nphys2375. [DOI] [Google Scholar]
  • 11.Holubec V., Ryabov A. Cycling Tames Power Fluctuations near Optimum Efficiency. Phys. Rev. Lett. 2018;121:120601. doi: 10.1103/PhysRevLett.121.120601. [DOI] [PubMed] [Google Scholar]
  • 12.Šiler M., Ornigotti L., Brzobohatý O., Jákl P., Ryabov A., Holubec V., Zemánek P., Filip R. Diffusing up the Hill: Dynamics and Equipartition in Highly Unstable Systems. Phys. Rev. Lett. 2018;121:230601. doi: 10.1103/PhysRevLett.121.230601. [DOI] [PubMed] [Google Scholar]
  • 13.Ciliberto S. Experiments in Stochastic Thermodynamics: Short History and Perspectives. Phys. Rev. X. 2017;7:021051. doi: 10.1103/PhysRevX.7.021051. [DOI] [Google Scholar]
  • 14.Strasberg P., Schaller G., Brandes T., Esposito M. Quantum and Information Thermodynamics: A Unifying Framework Based on Repeated Interactions. Phys. Rev. X. 2017;7:021003. doi: 10.1103/PhysRevX.7.021003. [DOI] [Google Scholar]
  • 15.Demirel Y. Information in biological systems and the fluctuation theorem. Entropy. 2014;16:1931–1948. doi: 10.3390/e16041931. [DOI] [Google Scholar]
  • 16.Demirel Y. Nonequilibrium Thermodynamics: Transport and Rate Processes in Physical, Chemical and Biological Systems. 4th ed. Elsevier; Amsterdam, The Netherlands: 2019. [Google Scholar]
  • 17.Lloyd S. Use of mutual information to decrease entropy: Implications for the second law of thermodynamics. Phys. Rev. A. 1989;39:5378–5386. doi: 10.1103/PhysRevA.39.5378. [DOI] [PubMed] [Google Scholar]
  • 18.Cao F.J., Feito M. Thermodynamics of feedback controlled systems. Phys. Rev. E. 2009;79:041118. doi: 10.1103/PhysRevE.79.041118. [DOI] [PubMed] [Google Scholar]
  • 19.Gaspard P. Multivariate fluctuation relations for currents. New J. Phys. 2013;15:115014. doi: 10.1088/1367-2630/15/11/115014. [DOI] [Google Scholar]
  • 20.Barato A.C., Seifert U. Unifying Three Perspectives on Information Processing in Stochastic Thermodynamics. Phys. Rev. Lett. 2014;112:090601. doi: 10.1103/PhysRevLett.112.090601. [DOI] [PubMed] [Google Scholar]
  • 21.Barato A.C., Seifert U. Stochastic thermodynamics with information reservoirs. Phys. Rev. E. 2014;90:042150. doi: 10.1103/PhysRevE.90.042150. [DOI] [PubMed] [Google Scholar]
  • 22.Horowitz J.M., Esposito M. Thermodynamics with continuous information flow. Phys. Rev. X. 2014;4:031015. doi: 10.1103/PhysRevX.4.031015. [DOI] [Google Scholar]
  • 23.Rosinberg M.L., Horowitz J.M. Continuous information flow fluctuations. EPL Europhys. Lett. 2016;116:10007. doi: 10.1209/0295-5075/116/10007. [DOI] [Google Scholar]
  • 24.Sagawa T., Ueda M. Generalized Jarzynski equality under nonequilibrium feedback control. Phys. Rev. Lett. 2010;104:090602. doi: 10.1103/PhysRevLett.104.090602. [DOI] [PubMed] [Google Scholar]
  • 25.Sagawa T., Ueda M. Fluctuation theorem with information exchange: Role of correlations in stochastic thermodynamics. Phys. Rev. Lett. 2012;109:180602. doi: 10.1103/PhysRevLett.109.180602. [DOI] [PubMed] [Google Scholar]
  • 26.Jinwoo L. Fluctuation Theorem of Information Exchange between Subsystems that Co-Evolve in Time. Symmetry. 2019;11:433. doi: 10.3390/sym11030433. [DOI] [Google Scholar]
  • 27.Jinwoo L., Tanaka H. Local non-equilibrium thermodynamics. Sci. Rep. 2015;5:7832. doi: 10.1038/srep07832. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28.Jinwoo L., Tanaka H. Trajectory-ensemble-based nonequilibrium thermodynamics. arXiv. 2014 arXiv:1403.1662v1. [Google Scholar]
  • 29.Jarzynski C. Equalities and inequalities: Irreversibility and the second law of thermodynamics at the nanoscale. Annu. Rev. Condens. Matter Phys. 2011;2:329–351. doi: 10.1146/annurev-conmatphys-062910-140506. [DOI] [Google Scholar]
  • 30.Seifert U. Stochastic thermodynamics, fluctuation theorems and molecular machines. Rep. Prog. Phys. 2012;75:126001. doi: 10.1088/0034-4885/75/12/126001. [DOI] [PubMed] [Google Scholar]
  • 31.Spinney R., Ford I. Nonequilibrium Statistical Physics of Small Systems. Wiley-VCH Verlag GmbH & Co. KGaA; Weinheim, Germany: 2013. Fluctuation Relations: A Pedagogical Overview; pp. 3–56. [Google Scholar]
  • 32.Cover T.M., Thomas J.A. Elements of Information Theory. John Wiley & Sons; Hoboken, NJ, USA: 2012. [Google Scholar]
  • 33.Ponmurugan M. Generalized detailed fluctuation theorem under nonequilibrium feedback control. Phys. Rev. E. 2010;82:031129. doi: 10.1103/PhysRevE.82.031129. [DOI] [PubMed] [Google Scholar]
  • 34.Horowitz J.M., Vaikuntanathan S. Nonequilibrium detailed fluctuation theorem for repeated discrete feedback. Phys. Rev. E. 2010;82:061120. doi: 10.1103/PhysRevE.82.061120. [DOI] [PubMed] [Google Scholar]
  • 35.Kurchan J. Fluctuation theorem for stochastic dynamics. J. Phys. A Math. Gen. 1998;31:3719. doi: 10.1088/0305-4470/31/16/003. [DOI] [Google Scholar]
  • 36.Maes C. The fluctuation theorem as a Gibbs property. J. Stat. Phys. 1999;95:367–392. doi: 10.1023/A:1004541830999. [DOI] [Google Scholar]
  • 37.Jarzynski C. Hamiltonian derivation of a detailed fluctuation theorem. J. Stat. Phys. 2000;98:77–102. doi: 10.1023/A:1018670721277. [DOI] [Google Scholar]
  • 38.Goldstein H., Poole C., Safko J. Classical Mechanics. 3rd ed. Pearson; London, UK: 2001. [Google Scholar]
  • 39.Parrondo J.M., Horowitz J.M., Sagawa T. Thermodynamics of information. Nat. Phys. 2015;11:131–139. doi: 10.1038/nphys3230. [DOI] [Google Scholar]
  • 40.Kawai R., Parrondo J.M.R., den Broeck C.V. Dissipation: The phase-space perspective. Phys. Rev. Lett. 2007;98:080602. doi: 10.1103/PhysRevLett.98.080602. [DOI] [PubMed] [Google Scholar]
  • 41.Takara K., Hasegawa H.H., Driebe D.J. Generalization of the second law for a transition between nonequilibrium states. Phys. Lett. A. 2010;375:88–92. doi: 10.1016/j.physleta.2010.11.002. [DOI] [Google Scholar]
  • 42.Hasegawa H.H., Ishikawa J., Takara K., Driebe D.J. Generalization of the second law for a nonequilibrium initial state. Phys. Lett. A. 2010;374:1001–1004. doi: 10.1016/j.physleta.2009.12.042. [DOI] [Google Scholar]
  • 43.Esposito M., Van den Broeck C. Second law and Landauer principle far from equilibrium. Europhys. Lett. 2011;95:40004. doi: 10.1209/0295-5075/95/40004. [DOI] [Google Scholar]
  • 44.McGrath T., Jones N.S., ten Wolde P.R., Ouldridge T.E. Biochemical Machines for the Interconversion of Mutual Information and Work. Phys. Rev. Lett. 2017;118:028101. doi: 10.1103/PhysRevLett.118.028101. [DOI] [PubMed] [Google Scholar]
  • 45.Becker N.B., Mugler A., ten Wolde P.R. Optimal Prediction by Cellular Signaling Networks. Phys. Rev. Lett. 2015;115:258103. doi: 10.1103/PhysRevLett.115.258103. [DOI] [PubMed] [Google Scholar]
  • 46.Ouldridge T.E., Govern C.C., ten Wolde P.R. Thermodynamics of Computational Copying in Biochemical Systems. Phys. Rev. X. 2017;7:021004. doi: 10.1103/PhysRevX.7.021004. [DOI] [Google Scholar]
  • 47.Tsai C.J., Nussinov R. A unified view of “how allostery works”. PLoS Comput. Biol. 2014;10:e1003394. doi: 10.1371/journal.pcbi.1003394. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 48.Cuendet M.A., Weinstein H., LeVine M.V. The allostery landscape: Quantifying thermodynamic couplings in biomolecular systems. J. Chem. Theory Comput. 2016;12:5758–5767. doi: 10.1021/acs.jctc.6b00841. [DOI] [PMC free article] [PubMed] [Google Scholar]
