Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences. 2018 May 28;376(2123):20180107. doi: 10.1098/rsta.2018.0107

Quantum theory of the classical: quantum jumps, Born’s Rule and objective classical reality via quantum Darwinism

Wojciech Hubert Zurek
PMCID: PMC5990654  PMID: 29807905

Abstract

The emergence of the classical world from the quantum substrate of our Universe is a long-standing conundrum. In this paper, I describe three insights into the transition from quantum to classical that are based on the recognition of the role of the environment. I begin with the derivation of preferred sets of states that help to define what exists—our everyday classical reality. They emerge as a result of the breaking of the unitary symmetry of the Hilbert space which happens when the unitarity of quantum evolutions encounters nonlinearities inherent in the process of amplification—of replicating information. This derivation is accomplished without the usual tools of decoherence, and accounts for the appearance of quantum jumps and the emergence of preferred pointer states consistent with those obtained via environment-induced superselection, or einselection. The pointer states obtained in this way determine what can happen—define events—without appealing to Born’s Rule for probabilities. Therefore, $p_k=|\psi_k|^2$ can now be deduced from the entanglement-assisted invariance, or envariance—a symmetry of entangled quantum states. With probabilities at hand, one also gains new insights into the foundations of quantum statistical physics. Moreover, one can now analyse the information flows responsible for decoherence. These information flows explain how the perception of objective classical reality arises from the quantum substrate: the effective amplification that they represent accounts for the objective existence of the einselected states of macroscopic quantum systems through the redundancy of pointer state records in their environment—through quantum Darwinism.

This article is part of a discussion meeting issue ‘Foundations of quantum mechanics and their impact on contemporary society’.

Keywords: quantum Darwinism, decoherence, quantum jumps, probabilities, Born’s Rule

1. Introduction and preview

This survey article is not a comprehensive review. It is, nevertheless, a brief review of several interrelated developments that can be collectively described as the ‘quantum theory of classical reality’.

Two mini-reviews in Nature Physics [1] and Physics Today [2] are also available. A more detailed review is given in [3]. It is by now somewhat out of date, as several relevant results have been obtained since 2007 when it was written. Moreover, a book that will cover this same ground, as well as the theory of decoherence and other related subjects, is (slowly) being written [4]. Nevertheless, it is hoped that readers may appreciate, in the interim, an update as well as the more informal presentation style of this overview.

The ‘relative state interpretation’ set out 60 years ago by Hugh Everett III [5,6] is a convenient starting point for our discussion. Within its context, one can re-evaluate the basic axioms of quantum theory (as extracted, for example, from Dirac [7]). The Everettian view of the Universe is a good way to motivate exploring the effect of the environment on the state of the system. (Of course, a complementary motivation based on a non-dogmatic reading of Bohr [8] is also possible.)

The basic idea we shall pursue here is to accept a relative state explanation of the ‘collapse of the wavepacket’ by recognizing, with Everett, that observers perceive the state of the ‘rest of the Universe’ relative to their own state, or—to be more precise—relative to the state of their records. This allows quantum theory to be universally valid. (This does not mean that one has to accept a ‘many worlds’ ontology; see [3] for discussion.)

Much of the heat in various debates on the foundations of quantum theory seems to be generated by the expectation that a single idea should provide a complete solution. When this does not happen—when there is progress, but there are still unresolved issues—the possibility that an idea responsible for this progress may be a step in the right direction—but that more than one idea, one step, is needed—is often dismissed. As we shall see, developing the quantum theory of our classical everyday reality requires the solution of several problems and calls for several ideas. In order to avoid circularities, they need to be introduced in the right order.

(a). Preferred pointer states from einselection

Everett explains the perception of the collapse. However, his relative state approach raises three questions absent in Bohr’s Copenhagen interpretation [8] that relied on the independent existence of an ab initio classical domain. Thus, in a completely quantum Universe, one is forced to seek sets of preferred, effectively classical but ultimately quantum, states that can define what exists—branches of the universal state vector—and that allow observers to keep reliable records. Without such a preferred basis, relative states are just ‘too relative’, and the relative state approach suffers from basis ambiguity [9].

Decoherence selects preferred pointer states [9–11], so this issue was in fact resolved some time ago. The principal consequence of environment-induced decoherence is that, in open quantum systems—systems interacting with their environments—only certain quantum states retain stability in spite of the immersion of the system in the environment: superpositions are unstable, and quickly decay into mixtures of the einselected, stable pointer states [1–4,9–19]. This is einselection—a nickname for environment-induced superselection. Thus, while the significance of the environment in suppressing quantum behaviour was pointed out by Dieter Zeh as early as 1970 [20], the role of einselection in the emergence of these preferred pointer states in the transition from quantum to classical has only become fully appreciated since 1981 [21].
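The suppression of superpositions that einselection relies on can be made concrete in a few lines of numpy. The sketch below is not from the paper: it is an assumed minimal model in which a system qubit imprints its state on several environment qubits, and the coherence between the two pointer states is damped by the overlap of the resulting environment branches.

```python
import numpy as np

rng = np.random.default_rng(0)

# System qubit in a superposition a|0> + b|1>.
a, b = 0.6, 0.8
n_env = 8

# Each environment qubit starts in |0>; conditional on the system being |1>,
# environment qubit k is rotated to cos(th_k)|0> + sin(th_k)|1>.
thetas = rng.uniform(0.3, 1.2, n_env)

# Build the two environment branch states |E0> and |E1>.
branch0 = np.array([1.0])
branch1 = np.array([1.0])
for th in thetas:
    branch0 = np.kron(branch0, np.array([1.0, 0.0]))
    branch1 = np.kron(branch1, np.array([np.cos(th), np.sin(th)]))

# Joint state a|0>|E0> + b|1>|E1>; reduced density matrix of the system
# is obtained by tracing out the environment.
rho = np.zeros((2, 2))
rho[0, 0] = a**2 * (branch0 @ branch0)
rho[1, 1] = b**2 * (branch1 @ branch1)
rho[0, 1] = a * b * (branch0 @ branch1)   # suppressed by the overlap <E0|E1>
rho[1, 0] = rho[0, 1]

decoherence_factor = np.prod(np.cos(thetas))
print(rho[0, 1], a * b * decoherence_factor)
```

The off-diagonal element is multiplied by the decoherence factor $\prod_k\cos\theta_k$, which shrinks exponentially with the number of environment qubits, while the diagonal (pointer-state) terms are untouched.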

(b). Born’s Rule from envariance

Einselection can account for preferred sets of states, and hence for Everettian ‘branches’. But this is achieved at a very high price—the usual practice of decoherence is based on averaging (as it involves reduced density matrices defined by a partial trace). This means that one is using Born’s Rule to relate amplitudes to probabilities. But, as emphasized by Everett, Born’s Rule should not be postulated in an approach that is based on purely unitary quantum dynamics. The assumption of the universal validity of quantum theory raises the issue of the origin of Born’s Rule, pk=|ψk|2, which—following the original conjecture [22]—is simply postulated in textbook discussions.

Here we shall see that Born’s Rule can be derived from entanglement-assisted invariance, or envariance—from the symmetry of entangled quantum states. Envariance is a purely quantum symmetry, as it is critically dependent on the telltale quantum feature—entanglement. Envariance sheds new light on the origin of probabilities relevant for the everyday world we live in, e.g. for statistical physics and thermodynamics. Moreover, the fundamental derivation of objective probabilities allows one to discuss information flows in our quantum Universe, and hence understand how the perception of classical reality emerges from the quantum substrate.

(c). Classical reality via quantum Darwinism

Even preferred quantum states defined by einselection are still ultimately quantum. Therefore, they cannot be found out by initially ignorant observers through direct measurement without getting disrupted (reprepared). Yet, the states of macroscopic systems in our everyday world seem to exist objectively—they can be found out by anyone without getting disrupted. This ability to find out an unknown state is in fact an operational definition of ‘objective existence’. So, if we are to explain the emergence of everyday objective classical reality, we need to identify the quantum origin of objective existence.

We shall do that by dramatically upgrading the role of the environment: in decoherence theory, the environment is the collection of degrees of freedom where quantum coherence (and hence phase information) is lost. However, in ‘real life’, the role of the environment is in effect that of a witness (e.g. [17,23]) to the state of the system of interest, and a communication channel through which the information reaches us, the observers. This mechanism for the emergence of the classical objective reality is the subject of the theory of quantum Darwinism.

2. Quantum postulates and relative states

We start from well-defined, solid ground: the list of quantum postulates that are explicit in Dirac [7], and at least implicit in most quantum textbooks.

The first two deal with the mathematics of quantum theory:

  • (i) The state of a quantum system is represented by a vector in its Hilbert space $\mathcal{H}$.

  • (ii) Evolutions are unitary (e.g. generated by the Schrödinger equation).

These two postulates provide an essentially complete summary of the mathematical structure of quantum physics. They are often [24,25] supplemented by a composition postulate:

  • (o) The state of a composite quantum system is represented by a vector in the tensor product of the Hilbert spaces of its components.

Physicists sometimes differ in assessing how much of postulate (o) follows from (i). We shall not be distracted by this issue, and move on to where the real problems are. Readers can follow their personal taste in supplementing (i) and (ii) with whatever portion of (o) they deem necessary. It is, nevertheless, useful to list (o) explicitly to emphasize the role of the tensor structure it posits: it is crucial for entanglement, the quantum phenomenon we will depend on.

Using (o), (i) and (ii), suitable Hamiltonians, etc., one can calculate. Yet, such quantum calculations are only a mathematical exercise—without additional postulates, one can predict nothing of experimental consequence from their results. What is so far missing is physics: a way to establish a correspondence between abstract state vectors in $\mathcal{H}$ and laboratory experiments (and/or everyday experience) is needed to relate quantum mathematics to our world.

Establishing this correspondence starts with the next postulate:

  • (iii) Immediate repetition of a measurement yields the same outcome.

Immediate repeatability is an idealization (it is hard to devise such non-demolition measurements, but it can be done). Yet postulate (iii) is uncontroversial. The notion of a ‘state’ is based on predictability, and the most rudimentary prediction is a confirmation that the state is what it is known to be. This key ingredient of quantum physics goes beyond the mathematics of postulates (o)–(ii). It enters through the repeatability postulate (iii). Moreover, a classical equivalent of (iii) is taken for granted (an unknown classical state can be discovered without getting disrupted), so repeatability does not clash with our classical intuition.

Postulate (iii) is the last uncontroversial postulate on the textbook list. This collection comprises our quantum core postulates—our credo, the foundation of the quantum theory of the classical.

In contrast to classical physics (where unknown states can be found out by an initially ignorant observer), the very next quantum axiom limits the predictive attributes of the state compared with what they were in the classical domain:

  • (iv) Measurement outcomes are limited to an orthonormal set of states (eigenstates of the measured observable). In any given run of a measurement, an outcome is just one such state.

This collapse postulate is controversial. To begin with, in a completely quantum Universe, it is inconsistent with the first two postulates: starting from a general pure state $|\psi_{\mathcal S}\rangle=\sum_k a_k|s_k\rangle$ of the system $\mathcal S$ (postulate (i)), and an initial state $|A_0\rangle$ of the apparatus $\mathcal A$, and assuming unitary evolution (postulate (ii)), one is led to a superposition of outcomes:

$$|\psi_{\mathcal S}\rangle|A_0\rangle=\Bigl(\sum_k a_k|s_k\rangle\Bigr)|A_0\rangle\;\Rightarrow\;\sum_k a_k|s_k\rangle|A_k\rangle, \tag{2.1}$$

which is in contradiction with, at least, a literal interpretation of the ‘collapse’ anticipated by axiom (iv). This conclusion follows for an apparatus that works as intended in tests (i.e. $|s_k\rangle|A_0\rangle\Rightarrow|s_k\rangle|A_k\rangle$) from the linearity of quantum evolutions, which is in turn implied by the unitarity of postulate (ii).
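This contradiction can be checked directly. Below is a minimal numpy sketch (not from the paper; the pre-measurement unitary is assumed, for illustration, to be a CNOT with the apparatus as target): linearity turns a superposition of system states into an entangled superposition of outcome records rather than a single record.

```python
import numpy as np

# Pre-measurement unitary (a CNOT): |s_k>|A_0> -> |s_k>|A_k> for k = 0, 1.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

s = np.array([0.6, 0.8])      # system: 0.6|s0> + 0.8|s1>
A0 = np.array([1.0, 0.0])     # apparatus ready state |A0>
joint = np.kron(s, A0)        # product state |psi_S>|A0>

out = CNOT @ joint            # linearity forces a superposition of outcomes
# out = 0.6|s0>|A0> + 0.8|s1>|A1>: an entangled state, not a single record
print(out)
```

The resulting vector has support on both outcome branches, which is exactly the tension with a literal reading of the collapse postulate (iv).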

Everett settled (or at least bypassed) the ‘collapse’ part of the problem with (iv)—an observer perceives the state of the rest of the Universe relative to his/her records. This is the essence of the relative state interpretation.

However, from the standpoint of our quest for classical reality, perhaps the most significant and disturbing implication of (iv) is that quantum states do not exist—at least not in the objective sense which we are used to in the classical world. The outcome of the measurement is typically not the pre-existing state of the system, but one of the eigenstates of the measured observable.

Thus, whatever a quantum state is, ‘objective existence’ independent of what is known about it is clearly not one of its attributes. This malleability of quantum states clashes with the classical idea of what the state should be. Some even go as far as to claim that quantum states are simply a description of the information that an observer has, and have essentially nothing to do with ‘existence’.

I believe this denial of existence under any circumstances is going too far—after all, there are situations when a state can be found out, and the repeatability postulated by (iii) recognizes that its existence can be confirmed. But, clearly, (iv) limits the ‘quantum existence’ of states to situations that are ‘under the jurisdiction’ of postulate (iii) (or slightly more general situations where the pre-existing density matrix of the system commutes with the measured observable).

Collapse postulate (iv) complicates interpreting the quantum formalism, as has been appreciated since Bohr and von Neumann [8,26]. Therefore, at least before Everett, it was often cited as an indication of the ultimate insolubility of the ‘quantum measurement problem’. Yet, (iv) is hard to argue with—it captures what happens in laboratory measurements.

To resolve the clash between the mathematical structure of quantum theory and our perception of what happens in the laboratory, in real-world measurements, one can accept—with Bohr—the primacy of our experience. The inconsistency of (iv) with the mathematical core of the quantum formalism—the superpositions of (i) and the unitarity of (ii)—can then be blamed on the nature of the apparatus. According to the Copenhagen interpretation, the apparatus is classical, and, therefore, not subject to the quantum principle of superposition (which follows from (i)). Measurements straddle the quantum–classical border, so they need not abide by the unitarity of (ii). Therefore, collapse can happen in the ‘lawless’ quantum–classical no man’s land.

This quantum–classical duality posited by Bohr challenges the unification instinct of physicists. One way of viewing decoherence is to regard einselection as a mechanism that accounts for effective classicality by suspending the validity of the quantum principle of superposition in a subsystem while upholding it for the composite system that includes the environment [11,17].

Everett’s alternative to Bohr’s approach was to abandon the literal collapse and recognize that, once the observer is included in the wave function, one can consistently interpret the consequences of such correlations. The right-hand side of equation (2.1) contains all the possible outcomes, so the observer who records outcome no. 17 perceives the branch of the Universe that is consistent with that event reflected in his records. This view of the collapse is also consistent with the repeatability of postulate (iii); remeasurement by the same observer using the same (non-demolition) device yields the same outcome.

Nevertheless, this relative state view of the quantum Universe suffers from a basic problem: the principle of superposition (the consequence of axiom (i)) implies that the state of the system or of the apparatus after the measurement can be written in infinitely many unitarily equivalent basis sets in the Hilbert spaces of the apparatus (or of the observer’s memory):

$$\sum_k a_k|s_k\rangle|A_k\rangle=\sum_k b_k|r_k\rangle|B_k\rangle=\cdots \tag{2.2}$$

This is basis ambiguity [9]. It appears as soon as—with Everett—one eliminates axiom (iv). The bases employed above are typically non-orthogonal, but in the Everettian relative state setting there is nothing that would preclude them, or that would favour, for example, the Schmidt basis of $\mathcal S$ and $\mathcal A$ (the orthonormal basis that is unique, provided that the absolute values of the Schmidt coefficients in such a Schmidt decomposition of an entangled bipartite state differ).
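Basis ambiguity is easy to exhibit numerically. The sketch below (an illustration assumed here, not taken from the paper) writes the same entangled state in two different product bases, and then uses a singular value decomposition to recover the unique Schmidt coefficients when the amplitudes differ.

```python
import numpy as np

# Bell state (1/sqrt2)(|0>|0> + |1>|1>) in the computational basis...
psi = (np.kron([1, 0], [1, 0]) + np.kron([0, 1], [0, 1])) / np.sqrt(2)

# ...is also (1/sqrt2)(|+>|+> + |->|->) in the rotated (Hadamard) basis:
plus  = np.array([1,  1]) / np.sqrt(2)
minus = np.array([1, -1]) / np.sqrt(2)
psi_rotated = (np.kron(plus, plus) + np.kron(minus, minus)) / np.sqrt(2)
assert np.allclose(psi, psi_rotated)   # same state, two different expansions

# With unequal amplitudes the Schmidt decomposition is unique (up to phases);
# the SVD of the coefficient matrix recovers the Schmidt coefficients.
psi2 = 0.6 * np.kron([1, 0], [1, 0]) + 0.8 * np.kron([0, 1], [0, 1])
U, s_coef, Vh = np.linalg.svd(psi2.reshape(2, 2))
print(s_coef)   # Schmidt coefficients 0.8, 0.6
```

Equal amplitudes (the Bell state) make the Schmidt basis degenerate, so nothing in the formalism alone singles out one expansion; unequal amplitudes restore uniqueness, as the SVD shows.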

In our everyday reality, we do not seem to be plagued by such basis ambiguity problems. So, in our Universe there is something that (in spite of (i) and the egalitarian superposition principle it implies) picks out preferred states, and makes them effectively classical. Axiom (iv) anticipates this.

Consequently, before there is an (apparent) collapse in the sense of Everett, a set of preferred states—one of which is selected by (or, at the very least, consistent with) the observer’s records—must be chosen. There is nothing in the writings of Everett that would even hint that he was aware of basis ambiguity and the questions it leads to.

The next question concerns probabilities: How likely is it that, after I measure, my state will be, say, $|A_{17}\rangle$? Everett was keenly aware of this issue, and even believed that he had solved it by deriving Born’s Rule. In retrospect, it is clear that the argument he proposed—as well as the arguments proposed by his followers, including DeWitt [24,25,27], Graham [25] and Geroch [28], who noted the failure of Everett’s original approach and attempted to fix the problem—did not accomplish as much as was hoped for, and did not amount to a derivation of Born’s Rule (see [29–31] for influential critical assessments).

In textbook versions of the quantum postulates, probabilities are assigned by another (Born’s Rule) axiom:

  • (v) The probability $p_k$ of an outcome $|s_k\rangle$ in a measurement of a quantum system that was previously prepared in the state $|\psi\rangle$ is given by $|\langle s_k|\psi\rangle|^2$.

Born’s Rule fits very well with Bohr’s approach to the quantum–classical transition (e.g. with postulate (iv)). However, Born’s Rule is at odds with the spirit of the relative state approach, or any approach that attempts (as we do) to deduce perception of the classical everyday reality starting from the quantum laws that govern our Universe. This does not mean that there is a mathematical inconsistency here: one can certainly use Born’s Rule (as the formula $p_k=|\langle s_k|\psi\rangle|^2$ is known) along with the relative state approach in averaging to get expectation values and the reduced density matrix.

Indeed, until the derivation of Born’s Rule in the framework of decoherence was proposed, decoherence practice relied on probabilities given by $p_k=|\langle s_k|\psi\rangle|^2$. They enter whenever one assigns a physical interpretation to reduced density matrices, a key tool of decoherence theory. Everett’s point was not that Born’s Rule is wrong but, rather, that it should be derived from the other quantum postulates, and we shall show how to do that.

3. Quantum origin of quantum jumps

To restate briefly the three problems identified above, we need to derive the essence of the collapse postulate (iv) and Born’s Rule (v) from our credo—the core quantum postulates (o)–(iii). Moreover, even when we accept the relative state origin of ‘single outcomes’ and ‘collapse’, we still need to justify the emergence of the preferred basis that is the essence of (iv).

This issue (which in our summary of textbook axiomatics of quantum theory is part of the collapse postulate) is so important that it is often captured by a separate postulate which declares that ‘observables are Hermitian’. This, in effect, means that the outcomes of measurements should correspond to orthogonal states in the Hilbert space. Furthermore, we should do it without appealing to Born’s Rule—without decoherence, or at least without its usual tools such as reduced density matrices that rely on Born’s Rule. Once we have preferred states, we will also have a set of candidate events. Once we have events, we shall be able to pose questions about their probabilities.

The preferred basis problem was settled by environment-induced superselection (einselection), usually regarded as a principal consequence of decoherence. This is discussed elsewhere [9,10]. Preferred pointer states and einselection are usually justified by appealing to decoherence. Therefore, they come at a price that would have been unacceptable to Everett: decoherence and einselection employ reduced density matrices and trace, and so their predictions are based on averaging, and thus on probabilities—on Born’s Rule.

Here we present an alternative strategy for arriving at preferred states that—while not at odds with decoherence—does not rely on the Born’s Rule-dependent tools of decoherence. Our overview of the origin of quantum jumps is brief; we direct the reader to references where the individual steps of this strategy are discussed in more detail. In short, we describe how one should go about doing the necessary physics, but we only sketch what needs to be done: the requisite steps are carried out in the references we provide, so our discussion is meant as a guide to the literature and not a substitute for it.

Decoherence done ‘in the usual way’ (which, by the way, is a step in the right direction, in understanding the practical, and even many of the fundamental, aspects of the quantum–classical transition!) is not a good starting point in addressing the more fundamental aspects of the origins of the classical.

In particular, decoherence is not a good starting point for the derivation of Born’s Rule. We have already noted the problem with this strategy: it courts circularity. It employs Born’s Rule to arrive at the pointer states by using the reduced density matrix which is obtained through trace—i.e. averaging, which is where Born’s Rule is implicitly invoked (e.g. [32]). So, using decoherence to derive Born’s Rule is at best a consistency check.

While I am most familiar with my own transgressions in this matter [33], this circularity also afflicts other approaches, including the proposal based on decision theory [3436], as noted also by Forrester [37] and Dawid & Thebault [38] among others. Therefore, one has to start the task from a different end.

To get anywhere—e.g. to define ‘events’ essential in the introduction of probabilities—we need to show how the mathematical structure of quantum theory (postulates (o), (i) and (ii)—Hilbert space and unitarity) supplemented by the uncontroversial postulate (iii) (immediate repeatability, hence predictability) leads to preferred sets of states.

(a). Quantum jumps from quantum core postulates

Surprisingly enough, deducing preferred states from our ‘quantum credo’ turns out to be simple. The possibility of repeated confirmation of an outcome is all that is needed to establish an effectively classical domain within the quantum Universe and to define events such as measurement outcomes.

One can accomplish this with minimal assumptions (‘quantum core’ postulates (o)–(iii) on the above list) as described in [39,40]. Here we review the basic steps. We assume that $|v\rangle$ and $|w\rangle$ are among the possible repeatably accessible outcome states of the system $\mathcal S$:

$$|v\rangle|A_0\rangle\Rightarrow|v\rangle|A_v\rangle \tag{3.1a}$$

and

$$|w\rangle|A_0\rangle\Rightarrow|w\rangle|A_w\rangle. \tag{3.1b}$$

So far, we have employed postulates (i) and (iii). The measurement, when repeated, would yield the same outcome, as the pre-measurement states have not changed. Thus, postulate (iii) is indeed satisfied.

We now assume the process described by equations (3.1) is fully quantum, so postulate (ii)—unitarity of evolutions—must also apply. Unitarity implies that the overlap of the states before and after must be the same. Hence:

$$\langle v|w\rangle=\langle v|w\rangle\langle A_v|A_w\rangle. \tag{3.2}$$

Our conclusions follow from this simple equation. There are two possibilities that depend on the overlap 〈v|w〉.

Suppose first that $\langle v|w\rangle\neq 0$. One is then forced to conclude that the measurement was unsuccessful, as the state of $\mathcal A$ was unaffected by the process above. That is, the transfer of information from $\mathcal S$ to $\mathcal A$ must have failed completely, as in this case $\langle A_v|A_w\rangle=1$ must hold. In particular, the apparatus can bear no imprint that distinguishes between states $|v\rangle$ and $|w\rangle$ that are not orthogonal.

The other possibility, $\langle v|w\rangle=0$, allows for an arbitrary $\langle A_v|A_w\rangle$, including a perfect record, $\langle A_v|A_w\rangle=0$. Thus, outcome states must be orthogonal if—in accord with postulate (iii)—they are to survive intact a successful information transfer, in general, or a quantum measurement, in particular, so that immediate remeasurement can yield the same result.

The same derivation can be carried out for a system $\mathcal S$ with a Hilbert space of dimension $N$, starting with a state vector $|\psi_{\mathcal S}\rangle=\sum_{k=1}^{N}a_k|s_k\rangle$, where (as before) the $\{|s_k\rangle\}$ a priori need to be only linearly independent.

The simple reasoning above leads to a surprisingly decisive conclusion: orthogonality of the outcome states of the system is absolutely essential for them to imprint even a minute difference on the state of any other system while retaining their identity. The overlap $\langle v|w\rangle$ must vanish exactly for $\langle A_v|A_w\rangle$ to differ from unity.
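One can verify this conclusion numerically: a candidate ‘copying’ map $|v\rangle|A_0\rangle\Rightarrow|v\rangle|A_v\rangle$, $|w\rangle|A_0\rangle\Rightarrow|w\rangle|A_w\rangle$ preserves inner products (and so can be part of a unitary) only when the outcome states are orthogonal or the records are identical. The particular states below are illustrative choices, not taken from the paper.

```python
import numpy as np

def ket(*amps):
    """Normalized state vector from a list of amplitudes."""
    v = np.array(amps, dtype=float)
    return v / np.linalg.norm(v)

A0, Av, Aw = ket(1, 0), ket(1, 0), ket(0, 1)   # perfect record: <Av|Aw> = 0

results = []
for v, w in [(ket(1, 0), ket(0, 1)),    # orthogonal outcome states
             (ket(1, 0), ket(1, 1))]:   # non-orthogonal: <v|w> = 1/sqrt(2)
    before = np.kron(v, A0) @ np.kron(w, A0)   # <v|w><A0|A0>
    after  = np.kron(v, Av) @ np.kron(w, Aw)   # <v|w><Av|Aw>
    results.append(bool(np.isclose(before, after)))

print(results)   # [True, False]: copying non-orthogonal states breaks unitarity
```

For the orthogonal pair both inner products are zero, so the map is consistent with unitarity; for the non-orthogonal pair the overlap shrinks from $1/\sqrt{2}$ to $0$, so no unitary evolution can implement a perfect record of those states, in line with equation (3.2).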

Imperfect or accidental information transfers (e.g. to the environment in the course of decoherence) can also define preferred sets of states, provided that the crucial non-demolition demand of postulate (iii) is imposed on the unitary evolution responsible for the information flow.

The above derivation can be extended so that it applies not just to measured quantum systems (where non-demolition is a tall order) but also to measuring devices (where repeatability is essential) [39,40]. The extension is somewhat more demanding technically, as one needs to allow for mixed states and for decoherence in a model of a presumably macroscopic apparatus, but the conclusion is the same: records maintained by the apparatus, or repeatably accessible states of macroscopic but ultimately quantum systems, must correspond to orthogonal subspaces of their Hilbert space.

It is important to emphasize that we are not asking for clearly distinguishable records (i.e. we are not demanding orthogonality of the states of the apparatus, $\langle A_v|A_w\rangle=0$). Indeed, in the macroscopic case [40] one does not even ask for the state of the system to remain unchanged, but only for the outcomes of consecutive measurements to be identical (i.e. the evidence of repeatability is in the outcomes). Still, even under these rather weak assumptions, one is forced to conclude that quantum states can exert distinguishable influences and remain unperturbed only when they are orthogonal. To arrive at this conclusion we used only postulate (i)—the fact that when two vectors in the Hilbert space are identical, the physical states they correspond to must also be identical.

(b). Discussion

The emergence of orthogonal outcome states is established above on the foundation of very basic (and very quantum) assumptions. It leads one to conclude that observables are indeed associated with Hermitian operators.

Hermitian observables are usually introduced in a very different manner—they are the (postulated!) quantum versions of the familiar classical quantities. This emphasizes the physical significance of their spectra (especially when they correspond to conserved quantities). Their orthogonal eigenstates emerge from the mathematics, once their Hermitian nature is assumed. Here we have deduced their Hermiticity by proving orthogonality of their eigenstates—possible outcomes—from the quantum core postulates by focusing on the effect of information transfer on the measured system.

The restriction to an orthogonal set of outcomes yields a preferred basis: the essence of the collapse axiom (iv) need not be postulated! It follows from the uncontroversial quantum core postulates (o)–(iii).

We note that the preferred basis arrived at in this manner essentially coincides with the basis obtained a long time ago via einselection [9,10]. It is just that here we have arrived at this familiar result without implicit appeal to Born’s Rule, which is essential if we want to take the next step, and derive postulate (v).

We have relied on unitarity, so we did not derive the actual collapse of the wavepacket to a single outcome—a single event. Collapse is non-unitary, so one cannot deduce it starting from the quantum core that includes postulate (ii). However, we have accounted for one of the key collapse attributes: the necessity of a symmetry breaking—of the choice of a single orthonormal set of states from amongst various possible basis sets, each of which can equally well span the Hilbert space of the system—follows from the core quantum postulates. This sets the stage for collapse—for quantum jumps.

As we have already briefly noted, this reasoning can be extended [40] to the case when the repeatably copied states belong to a macroscopic, decohering system (e.g. an apparatus pointer). In that case a microstate can be perturbed by copying (or by the environment). What matters then is not the ‘non-demolition’ of the microstate of the pointer, but the persistence of the record its macrostate (corresponding to a whole collection of microstates) represents. To formulate this demand precisely, one can rely on the repeatability of copies: for instance, even though the microstates of the pointer change upon readout due to the interaction with the environment, its macrostate should still represent the same measurement outcome—it should still contain the same ‘actionable information’ [40]. This more general discussion also addresses other issues (e.g. the connection between repeatability, distinguishability and positive operator-valued measures (POVMs), raised as FAQ 4 in the frequently asked questions in §6) that arise in realistic settings (see figure 1 for an illustration of the key idea).

Figure 1.

The fundamental (pre-quantum) connection between distinguishability and repeatability of measurements. The two circles represent two states of the measured system. They correspond to two outcomes—e.g. two properties of the underlying states (represented by two cross-hatchings). A measurement that can result in either outcome—that can produce a record correlated with these two properties—can be repeatable only when the two corresponding states (the two circles) do not overlap (case illustrated at the top). Repeatability is impossible without distinguishability: when two states overlap (case illustrated at the bottom), repetition of the measurement can always result in a system switching the state (and, thus, defying repeatability). In the quantum setting this pre-quantum connection between repeatability and distinguishability leads to the derivation of orthogonality of repeatable measurement outcomes (and the two cross-hatchings can be thought of as two linear polarizations of a photon—orthogonal on the top, but not below), but the basic intuition demanding distinguishability as a prerequisite for repeatability does not rely on the quantum formalism.

4. Probabilities from entanglement

The derivation of events allows, and even forces, one to enquire about their probabilities or—more specifically—about the relation between the probabilities of measurement outcomes and the initial pre-measurement state. As noted earlier, several past attempts at the derivation of Born’s Rule turned out to be circular. Here we present the key ideas behind a circularity-free approach.

We emphasize that our derivation of events does not rely on Born’s Rule. In particular, we have not attached any physical interpretation to the values of scalar products; our conclusions rest only on whether a given scalar product is 0, 1 or neither.

We now briefly review the envariant derivation of Born’s Rule based on the symmetry of entangled quantum states—on entanglement-assisted invariance or envariance. The study of envariance as a physical basis of Born’s Rule started with [17,41,42], and is now the focus of several other papers (e.g. [43–45]). The key idea is illustrated in figure 2.

Figure 2.

(Opposite.) Envariance—entanglement-assisted invariance—is a symmetry of entangled states. Envariance allows one to demonstrate Born’s Rule [17,41,42] using a combination of an old intuition of Laplace [47] about invariance and the origins of probability with the quantum symmetries of entanglement. (a) Laplace’s principle of indifference (illustrated with playing cards) aims to establish symmetry using invariance under swaps. A player who doesn’t know the face values of the cards is indifferent—does not care—if they are swapped before he gets the one on the left. For Laplace, this indifference was evidence of a (subjective) symmetry: it implied equal likelihood—equal probabilities of the invariantly swappable alternatives. For the two cards, subjective probability p = 1/2 would be inferred by someone who doesn’t know their face value, but knows that one of them is a spade. When probabilities of a set of elementary events are provably equal, one can compute the probabilities of composite events and thus develop a theory of probability. Even the additivity of probabilities can be established (e.g. [48]). This is in contrast to Kolmogorov’s measure-theoretic axioms (which include additivity of probabilities). Above all, Kolmogorov’s theory does not assign probabilities to elementary events (physical or otherwise), while the envariant approach yields probabilities when the symmetries of elementary events under swaps are known. (b) The problem with Laplace’s principle of indifference is its subjectivity. The actual physical state of the system (the two cards) is altered by the swap. A related problem is that the assessment of indifference is based on ignorance. It was argued, e.g. by supporters of the relative frequency approach (regarded by many as a more ‘objective’ foundation of probability), that it is impossible to deduce anything (including probabilities) from ignorance. This (along with subjectivity) is the reason why the equal likelihood approach is regarded with suspicion as a basis of probability in physics. (c) In quantum physics, symmetries of entanglement can be used to deduce objective probabilities starting with a known state. Envariance is the relevant symmetry. When a pure entangled state |ψSE⟩ of a system S and another system we call ‘an environment E’ (anticipating connections with decoherence) can be transformed by US acting solely on S, but the effect of US can be undone by acting solely on E with an appropriately chosen UE, so that UEUS|ψSE⟩ = |ψSE⟩, it is envariant under US. For such composite states, one can rigorously establish that the local state of S remains unaffected by US. Thus, for example, the phases of the coefficients in the Schmidt expansion |ψSE⟩ = Σk ak|sk⟩|εk⟩ are envariant, as the effect of a phase shift uS acting on S can be undone by a countertransformation uE acting solely on the environment. This envariance of phases implies their irrelevance for the local states—in effect, it implies decoherence. Moreover, when the absolute values of the Schmidt coefficients are equal, a swap |s1⟩⟨s2| + |s2⟩⟨s1| in S can be undone by a ‘counterswap’ |ε2⟩⟨ε1| + |ε1⟩⟨ε2| in E. So, as can be established more carefully [42], p1 = p2 = 1/2 follows from the objective symmetry of such an entangled state. This proof of equal probabilities is based not on ignorance (as in Laplace’s subjective ‘indifference’) but on knowledge of the ‘wrong property’—of the global observable that rules out (via quantum indeterminacy) any information about complementary local observables. When supplemented by simple counting, envariance leads to Born’s Rule also for unequal Schmidt coefficients [17,41,42].

As we shall see, the eventual loss of coherence between pointer states can also be regarded as a consequence of quantum symmetries of the states of systems entangled with their environment. Thus, the essence of decoherence arises from the symmetries of entangled states. Indeed, some of the consequences of einselection (including the emergence of preferred states, as we have seen in the previous section) can be studied without employing the usual tools of decoherence theory (reduced density matrices and trace) that, for their physical significance, rely on Born’s Rule.

Decoherence that follows from envariance also allows one to justify the additivity of probabilities, whereas the derivation of Born’s Rule by Gleason [46] assumed it (along with the other Kolmogorov’s axioms of the measure-theoretic formulation of the foundations of probability theory, and with the Copenhagen-like setting). Appeal to symmetries leads to additivity also in the classical setting (as was noted already by Laplace [47,48]). Moreover, Gleason’s theorem (with its rather complicated proof based on ‘frame functions’ introduced especially for this purpose) provides no motivation as to why the measure he obtains should have any physical significance—i.e. why should it be regarded as a probability? As illustrated in figure 2 and discussed below, the envariant derivation of Born’s Rule has a transparent physical motivation.

The additivity of probabilities is a highly non-trivial point. In quantum theory, the overarching additivity principle is the quantum principle of superposition. Anyone familiar with the double-slit experiment knows that the probabilities of quantum states (such as the states corresponding to passing through one of the two slits) do not add, which in turn leads to interference patterns.

The presence of entanglement eliminates local phases (thus suppressing quantum superpositions, i.e. doing the job of decoherence). This leads to additivity of the probabilities of events associated with preferred pointer states.

(a). Decoherence, phases and entanglement

Decoherence is the loss of phase coherence between preferred states. It occurs when the system S starts in a superposition of pointer states singled out by its interaction with the environment (represented below by the Hamiltonian HSE). As in equation (3.1), states of the system leave imprints—become ‘copied’—but now S is ‘measured’ by its environment E:

|ψS⟩|ε0⟩ = (α|↑⟩ + β|↓⟩)|ε0⟩ ⇒ α|↑⟩|ε↑⟩ + β|↓⟩|ε↓⟩ = |ΦSE⟩.    (4.1)

Equation (3.2) implied that the untouched states are orthogonal, ⟨↑|↓⟩ = 0. Their superposition |ψS⟩ turns into an entangled |ΦSE⟩. Thus, neither S nor E alone has a pure state. This loss of purity signifies decoherence. One can still assign to the system a mixed state that represents the surviving information about S.

Phase changes can be detected: in a spin-½-like S, (|↑⟩ + |↓⟩)/√2 is orthogonal to (|↑⟩ − |↓⟩)/√2. The phase shift operator uS(φ) = |↑⟩⟨↑| + e^iφ|↓⟩⟨↓| alters the phase that distinguishes them: for instance, when φ=π, it converts (|↑⟩ + |↓⟩)/√2 to (|↑⟩ − |↓⟩)/√2. In experiments uS(φ) would shift the interference pattern.

We assume perfect decoherence, ⟨ε↑|ε↓⟩ = 0: E has a perfect record of the pointer states. What information survives decoherence and what is lost?

Consider someone who knows the initial pre-decoherence state, |ψS⟩, and would like to make predictions about the decohered S. We now show that, when ⟨ε↑|ε↓⟩ = 0, the phases of α and β no longer matter for S—the phase φ has no effect on the local state of S, so measurements on S cannot detect a phase shift, as there is no interference pattern to shift.

The phase shift uS(φ) (acting on an entangled |ΦSE⟩) cannot have any effect on its local state because it can be undone by uE(−φ), a ‘countershift’ acting on a distant E decoupled from the system:

uE(−φ)[uS(φ)|ΦSE⟩] = uE(−φ)[α|↑⟩|ε↑⟩ + e^iφβ|↓⟩|ε↓⟩] = α|↑⟩|ε↑⟩ + β|↓⟩|ε↓⟩ = |ΦSE⟩.    (4.2)

Phases in |ΦSE⟩ can be changed in a faraway E decoupled from but entangled with S. Therefore, they can no longer influence the local state of S. (This follows from quantum theory alone, but is essential for causality—if they could, measuring S would reveal this action at a distance, enabling superluminal communication!)

Decoherence is caused by the loss of phase coherence. Superpositions decohere as |↑⟩ and |↓⟩ are recorded by E. This is not because the phases become ‘randomized’ by interactions with E, as is sometimes said [7]. Rather, they become delocalized: they lose significance for S alone. They are a global property of the composite state—they no longer belong to S, so measurements on S cannot distinguish states that started as superpositions with different phases between α and β. Consequently, information about S is lost—it is displaced into correlations between S and E, and the local phases of S become a global property—phases of the composite entangled state of S and E.
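This delocalization of phases can be checked numerically. The following sketch (the amplitudes, the phase and the random observables are our own illustrative choices, not from the paper) verifies that a phase shift applied to S in an entangled state is undone by a countershift on E alone, so no observable on S can detect it:

```python
import numpy as np

up, down = np.array([1.0, 0.0]), np.array([0.0, 1.0])
alpha, beta = np.sqrt(0.3), np.sqrt(0.7)  # illustrative amplitudes
phi = 1.234                               # an arbitrary phase

# Post-decoherence entangled state alpha|up>|e_up> + beta|down>|e_down>
# with orthogonal environment records (perfect decoherence).
psi = alpha * np.kron(up, up) + beta * np.kron(down, down)

# Phase shift acting on S alone, and the countershift on E alone.
u_S = np.kron(np.diag([1.0, np.exp(1j * phi)]), np.eye(2))
u_E = np.kron(np.eye(2), np.diag([1.0, np.exp(-1j * phi)]))

shifted = u_S @ psi
# The countershift on the distant, decoupled E undoes the phase shift:
assert np.allclose(u_E @ shifted, psi)

# Hence no measurement on S alone can detect phi: expectation values of
# arbitrary observables on S agree for shifted and unshifted states.
rng = np.random.default_rng(0)
for _ in range(5):
    h = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    obs = np.kron(h + h.conj().T, np.eye(2))  # random Hermitian observable on S
    assert np.isclose(np.vdot(psi, obs @ psi).real,
                      np.vdot(shifted, obs @ shifted).real)
```

The expectation values agree exactly because the cross terms that carry the phase are multiplied by the (vanishing) overlap of the environment records.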

We have considered this information loss here without reduced density matrices, the usual decoherence tool. Our view of decoherence appeals to symmetry: the invariance of |ΦSE⟩—entanglement-assisted invariance or envariance—under phase shifts of the pointer state coefficients, equation (4.2). As S entangles with E, its local state becomes invariant under transformations that could have affected it before.

A rigorous proof of the loss of coherence uses the quantum core postulates (o)–(iii) and relies on three quantum facts:

  1. Locality: A unitary must act on a system to change its state. A state of S that is not acted upon does not change even as other systems evolve (so uE does not affect S even when S and E are entangled, as in |ΦSE⟩).

  2. The state of a system provides all that is needed to predict its measurement outcomes.

  3. A composite state determines the states of its subsystems (so the local state of S is restored when the state of the whole of S and E is restored).

These facts help to characterize local states of entangled systems without using reduced density matrices. They follow from quantum theory: locality is a property of interactions. The other two facts define the role and the relation of the quantum states of individual and composite systems in a way that does not invoke density matrices (to which we are not entitled in the absence of Born’s Rule). Thus, a phase shift uS(φ) acting on a pure pre-decoherence state matters: measurement can reveal φ. In accord with facts 1 and 2, uS(φ) changes α|↑⟩ + β|↓⟩ into α|↑⟩ + e^iφβ|↓⟩. However, the same uS(φ) acting on S in an entangled state |ΦSE⟩ does not matter for S alone, as it can be undone by uE(−φ), a countershift acting on a faraway, decoupled E. As the global |ΦSE⟩ is restored, by fact 3 the local state of S is also restored, even though S is not acted upon (so that, by fact 1, it remains unchanged). Hence, the local state of the decohered S that one obtains from |ΦSE⟩ could not have changed to begin with, and so it cannot depend on the phases of α and β.

The only pure states invariant under such phase shifts (and hence unaffected by decoherence) are pointer states. Resilience, as we saw in equations (2.1) and (3.1), lets them preserve correlations. For instance, the entangled state |ΨSA⟩ = α|↑⟩|A↑⟩ + β|↓⟩|A↓⟩ of the measured system S and the apparatus A, equation (3.2), decoheres as A interacts with E:

|ΨSA⟩|ε0⟩ ⇒ α|↑⟩|A↑⟩|ε↑⟩ + β|↓⟩|A↓⟩|ε↓⟩ = |ΦSAE⟩.    (4.3)

The pointer states |A↑⟩ and |A↓⟩ of A survive decoherence by E. They retain perfect correlation with S (or an observer, or other systems) in spite of E, independently of the value of ⟨ε↑|ε↓⟩. Stability under decoherence is—in our quantum Universe—a prerequisite for effective classicality: familiar states of macroscopic objects also have to survive monitoring by E and hence retain correlations.

The decohered pair SA is described by a reduced density matrix,

ρSA = |α|²|↑⟩⟨↑||A↑⟩⟨A↑| + αβ*⟨ε↓|ε↑⟩|↑⟩⟨↓||A↑⟩⟨A↓| + α*β⟨ε↑|ε↓⟩|↓⟩⟨↑||A↓⟩⟨A↑| + |β|²|↓⟩⟨↓||A↓⟩⟨A↓|.    (4.4a)

When ⟨ε↑|ε↓⟩ = 0, the pointer states of A retain correlations with the outcomes:

ρSA = |α|²|↑⟩⟨↑||A↑⟩⟨A↑| + |β|²|↓⟩⟨↓||A↓⟩⟨A↓|.    (4.4b)

Both ↑ and ↓ are present: there is no ‘literal collapse’. We will use ρSA to examine information flows. Thus, we will need the probabilities of the outcomes.
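The structure of equations (4.4) can be illustrated numerically: tracing out a toy two-dimensional environment shows that the off-diagonal (coherence) terms of ρSA scale with the overlap of the environment records. A minimal sketch (the amplitudes and overlaps are our own choices):

```python
import numpy as np

up, down = np.array([1.0, 0.0]), np.array([0.0, 1.0])
alpha, beta = 0.6, 0.8  # real amplitudes, alpha^2 + beta^2 = 1 (our choice)

def rho_SA(overlap):
    """S-A density matrix after tracing out a 2-d environment whose
    record states have a (real) overlap <e_up|e_down> = overlap."""
    e_up = np.array([1.0, 0.0])
    e_down = np.array([overlap, np.sqrt(1.0 - overlap ** 2)])
    # |Phi_SAE> = alpha|up>|A_up>|e_up> + beta|down>|A_down>|e_down>
    psi = (alpha * np.kron(np.kron(up, up), e_up)
           + beta * np.kron(np.kron(down, down), e_down))
    full = np.outer(psi, psi)
    # Partial trace over the trailing environment factor.
    return np.einsum('ikjk->ij', full.reshape(4, 2, 4, 2))

# Perfect decoherence, <e_up|e_down> = 0: only the diagonal, perfectly
# correlated pointer terms (the form of eq. (4.4b)) survive.
assert np.allclose(rho_SA(0.0), np.diag([alpha ** 2, 0.0, 0.0, beta ** 2]))

# Partial decoherence: the coherence terms (as in eq. (4.4a)) are
# proportional to the record overlap.
for ov in (1.0, 0.5, 0.1):
    assert np.isclose(rho_SA(ov)[0, 3], alpha * beta * ov)
```

Varying the overlap interpolates between a coherent superposition (overlap 1) and a fully decohered mixture (overlap 0).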

Trace is a mathematical operation. However, regarding the reduced density matrix ρSA as a statistical mixture of its eigenstates—states ↑ and ↓, and the records A↑ and A↓ of the pointer states—relies on Born’s Rule, which allows one to view tracing as averaging. We did not use it until equations (4.4), to avoid circularity. Now we derive pk=|ψk|2, Born’s Rule, as we shall need it: we need to prove that the probabilities are indeed given by the eigenvalues |α|² and |β|² of ρSA. This is postulate (v), obviously crucial for relating the quantum formalism to experiments. We want to deduce Born’s Rule from the quantum core postulates (o)–(iii).

We note that this brief and somewhat biased discussion of the origin of decoherence is not a substitute for more complete presentations that employ the usual tools of decoherence theory, including, in particular, reduced density matrices [11,14,17]. We have, for good reason in the present context of the derivation of Born’s Rule, avoided them (with the brief illustrative exception immediately above, equations (4.4)).

(b). Probabilities from symmetries of entanglement

In quantum physics, one seeks the probability of a measurement outcome starting from a known state of S and a ready-to-measure state of the apparatus pointer A. The entangled state of the whole is pure, so (at least prior to decoherence by the environment) there is no ignorance in the usual sense.

However, envariance in a guise slightly different from before (when it accounted for decoherence) implies that mutually exclusive outcomes have certifiably equal probabilities: suppose S starts as |↑⟩ + |↓⟩, so interaction with A yields |↑⟩|A↑⟩ + |↓⟩|A↓⟩, an even (equal coefficient) state. (Here and below we skip normalization to save on notation.)

A unitary swap, uS = |↑⟩⟨↓| + |↓⟩⟨↑|, permutes the states in S:

uS(|↑⟩|A↑⟩ + |↓⟩|A↓⟩) = |↓⟩|A↑⟩ + |↑⟩|A↓⟩.    (4.5a)

After the swap, |↑⟩ is as probable as |↓⟩ was (and still is), and vice versa. The probabilities in A are unchanged (as A is untouched), so p↑ and p↓ must have been swapped. To prove equiprobability, we now swap the records in A:

uA(|↓⟩|A↑⟩ + |↑⟩|A↓⟩) = |↑⟩|A↑⟩ + |↓⟩|A↓⟩, where uA = |A↑⟩⟨A↓| + |A↓⟩⟨A↑|.    (4.5b)

The swap in A restores the pre-swap |ΨSA⟩ without touching S, so (by fact 3) the local state of S is also restored (even though, by fact 1, it could not have been affected by the swap of equation (4.5b)). Hence (by fact 2), all predictions about S, including probabilities, must be the same! The probabilities of |↑⟩ and |↓⟩ (as well as of |A↑⟩ and |A↓⟩) are exchanged yet unchanged. Therefore, they must be equal. Thus, in our two-state case, p↑ = p↓ = 1/2. For N envariantly equivalent alternatives, pk=1/N for all k.
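The swap–counterswap argument is easy to check numerically. A minimal sketch (the states and amplitudes are our own illustrative choices):

```python
import numpy as np

up, down = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# Even (equal-coefficient) state |up>|A_up> + |down>|A_down>, normalized.
psi = (np.kron(up, up) + np.kron(down, down)) / np.sqrt(2)

flip = np.array([[0.0, 1.0], [1.0, 0.0]])
swap_S = np.kron(flip, np.eye(2))  # swap |up> <-> |down> in S (as in eq. 4.5a)
swap_A = np.kron(np.eye(2), flip)  # counterswap of the records in A (eq. 4.5b)

# The swap on S changes the global state ...
assert not np.allclose(swap_S @ psi, psi)
# ... but the counterswap, acting only on A, restores it exactly,
# so the local state of S was never affected: p_up must equal p_down.
assert np.allclose(swap_A @ (swap_S @ psi), psi)

# With unequal coefficients the counterswap does NOT restore the state,
# so equiprobability cannot be concluded (and indeed does not hold).
uneven = np.sqrt(1 / 3) * np.kron(up, up) + np.sqrt(2 / 3) * np.kron(down, down)
assert not np.allclose(swap_A @ (swap_S @ uneven), uneven)
```

The last assertion anticipates the uneven case treated below, which requires fine-graining before the swap symmetry can be invoked.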

Getting rid of phases beforehand was crucial: swaps in an isolated pure state will, in general, change the phases, and hence change the state. For instance, (|↑⟩ + |↓⟩)/√2, after a swap |↑⟩⟨↓| − |↓⟩⟨↑|, becomes (|↑⟩ − |↓⟩)/√2, i.e. a state orthogonal to the pre-swap state.

The crux of the proof of equal probabilities was that the swap does not change anything locally. This can be established for entangled states with equal coefficients but—as we have just seen—is simply not true for a pure unentangled state of just one system.

In the real world, the environment will become entangled (in the course of decoherence) with the preferred states of the system of interest (or with the preferred states of the apparatus pointer). We have already seen how postulates (i)–(iii) lead to preferred sets of states. We have also pointed out that—at least in idealized situations—these states coincide with the familiar pointer states that remain stable in spite of decoherence. So, in effect, we are using the familiar framework of decoherence to derive Born’s Rule. Fortunately, our conclusions about decoherence can be reached without employing the usual (Born’s Rule-dependent) tools of decoherence (reduced density matrix and trace).

So far, we have only explained how one can establish the equality of probabilities for the outcomes that correspond to Schmidt states associated with coefficients that differ at most by a phase. This is not yet Born’s Rule. However, it turns out that this is the hard part of the proof: once such equality is established, a simple counting argument (a version of that employed in [33–36]) leads to the relation between probabilities and unequal coefficients [17,41,42].

Thus, for an uneven state α|↑⟩|A↑⟩ + β|↓⟩|A↓⟩ with |α| ≠ |β|, swaps on S and A yield β|↑⟩|A↑⟩ + α|↓⟩|A↓⟩, and not the pre-swap state, so p↑ and p↓ are not equal. However, the uneven case reduces to equiprobability via fine-graining, so envariance yields Born’s Rule, pk=|ψk|2, in general.

To see how, we take α = √(μ/(μ+ν)) and β = √(ν/(μ+ν)), where μ and ν are natural numbers (so that the squares of α and β are commensurate). To fine-grain, we change the basis in the Hilbert space of A, expressing the records in terms of orthonormal states |ak⟩: |A↑⟩ = (|a1⟩ + … + |aμ⟩)/√μ and |A↓⟩ = (|aμ+1⟩ + … + |aμ+ν⟩)/√ν:

α|↑⟩|A↑⟩ + β|↓⟩|A↓⟩ ∝ |↑⟩(|a1⟩ + … + |aμ⟩) + |↓⟩(|aμ+1⟩ + … + |aμ+ν⟩).    (4.6a)

We simplify, and imagine an environment decohering A in the new orthonormal basis. That is, the |ak⟩ correlate with orthonormal environment states |ek⟩ so that

|↑⟩(|a1⟩|e1⟩ + … + |aμ⟩|eμ⟩) + |↓⟩(|aμ+1⟩|eμ+1⟩ + … + |aμ+ν⟩|eμ+ν⟩),    (4.6b)

as if the |ak⟩ were the preferred pointer states decohered by the environment so that ⟨ej|ek⟩ = δjk.

Now swaps of the |ak⟩ can be undone by counterswaps of the corresponding |ek⟩. Counting the fine-grained equiprobable (pk = 1/(μ+ν)) alternatives labelled with ↑ or ↓ leads to Born’s Rule:

p↑ = μ/(μ+ν) = |α|²,  p↓ = ν/(μ+ν) = |β|².    (4.7)

Amplitudes have ‘got squared’ as a result of Pythagoras’ theorem (Euclidean nature of Hilbert spaces). The case of incommensurate |α|2 and |β|2 can be settled by an appeal to the continuity of probabilities as functions of state vectors.
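The counting step can be made concrete with a minimal worked example (the specific numbers are our own choice, following the scheme of equations (4.6)–(4.7)): take μ = 2, ν = 1, so |α|² = 2/3 and |β|² = 1/3.

```latex
% Fine-graining for |alpha|^2 = 2/3, |beta|^2 = 1/3 (mu = 2, nu = 1):
\begin{align*}
|\psi_{SA}\rangle
  &= \sqrt{\tfrac{2}{3}}\,|{\uparrow}\rangle|A_\uparrow\rangle
   + \sqrt{\tfrac{1}{3}}\,|{\downarrow}\rangle|A_\downarrow\rangle \\
  &\propto |{\uparrow}\rangle\bigl(|a_1\rangle + |a_2\rangle\bigr)
   + |{\downarrow}\rangle\,|a_3\rangle,
   \qquad\text{using }
   |A_\uparrow\rangle = \tfrac{|a_1\rangle + |a_2\rangle}{\sqrt{2}},\;
   |A_\downarrow\rangle = |a_3\rangle .
\end{align*}
% After the |a_k> are correlated with orthonormal environment states |e_k>,
% the three branches are envariantly swappable, hence equiprobable:
%   p(|up>|a_1>|e_1>) = p(|up>|a_2>|e_2>) = p(|down>|a_3>|e_3>) = 1/3.
% Counting branches labelled with each outcome recovers Born's Rule:
\[
  p_\uparrow = \tfrac{2}{3} = |\alpha|^2, \qquad
  p_\downarrow = \tfrac{1}{3} = |\beta|^2 .
\]
```

The incommensurate case then follows, as noted above, from the continuity of probabilities in the state vector.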

(c). Discussion

In physics textbooks, Born’s Rule is a postulate. Using entanglement, we have derived it here from the quantum core postulates. Our reasoning was purely quantum: classically, knowing the state of a composite system means knowing the state of each part. There are no entangled classical states, and hence no objective symmetry from which to deduce classical equiprobability—the crux of our derivation. Entanglement—made possible by the tensor structure of composite Hilbert spaces, introduced by the composition postulate (o)—was key. The appeal to symmetry—subjective and suspect in the classical case—becomes rigorous thanks to objective envariance in the quantum case. Born’s Rule, introduced by textbooks as postulate (v), follows.

The relative frequency approach (found in many probability texts) starts with the count of the number of events. It has not led to a successful derivation of Born’s Rule. We used entanglement symmetries to identify equiprobable alternatives. However, by employing envariance, one can also deduce the frequencies of events by considering M repetitions of an experiment, and deduce the departures from these frequencies that are expected when M is finite. Moreover, one can even show the inverse of Born’s Rule. That is, one can demonstrate that the amplitude should be proportional to the square root of the frequency [49].

As the probabilities are now in place, one can think of quantum statistical physics. One could establish its foundations using the probabilities we have just deduced. But there is an even simpler and more radical approach [50,51] that arrives at the microcanonical state without the need to invoke ensembles and probabilities. Its detailed explanation is beyond the scope of this section, but the basic idea is to regard an even state of the system entangled with its environment as the microcanonical state. This is a major conceptual simplification of the foundations of statistical physics: one can get rid of the artifice of invoking infinite collections of similar systems to represent a state of a single system in a manner that allows one to deduce relevant thermodynamic properties.1

5. Quantum Darwinism, classical reality and objective existence

Quantum Darwinism [17,23] recognizes that observers use the environment as a communication channel to acquire information about pointer states indirectly, leaving the system of interest untouched and its state unperturbed. Observers can find out the state of the system without endangering its existence (which would be inevitable in direct measurements). Indeed, the reader of this text is—at this very moment—intercepting a tiny fraction of the photon environment with their eyes to gather all of the information they need.

This is how virtually all of our information is acquired. A direct measurement is not what we do. Rather, we count on redundancy and settle for information that exists in many copies. This is how objective existence—the cornerstone of classical reality—arises in the quantum world.

(a). Mutual information in quantum correlations

To develop the theory of quantum Darwinism, we need to quantify information between fragments of the environment and the system. Mutual information is a convenient tool that we shall use for this purpose.

The mutual information between the system S and a fragment F of the environment (which will play the role of the apparatus A of equations (4.4) in the discussion above) can be computed from the density matrices of the systems of interest using their von Neumann entropies H = −Tr ρ log ρ:

I(S:F) = HS + HF − HSF.    (5.1)

Using the density matrices of S and A (as a ‘stand-in’ for F) from equations (4.4), one obtains the specific value I(S:A) = HS for the mutual information above.
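This value can be checked numerically with the standard definition I(S:A) = HS + HA − HSA; a minimal sketch (the probabilities 0.25/0.75 are our own illustrative choice):

```python
import numpy as np
from numpy.linalg import eigvalsh

def vn_entropy(rho):
    """von Neumann entropy in bits: H(rho) = -Tr[rho log2 rho]."""
    evals = eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

p_up, p_down = 0.25, 0.75  # |alpha|^2, |beta|^2 (our choice)

# Decohered S-A state of eq. (4.4b): a perfectly correlated mixture.
rho_SA = np.diag([p_up, 0.0, 0.0, p_down])
rho_S = np.einsum('ikjk->ij', rho_SA.reshape(2, 2, 2, 2))  # trace out A
rho_A = np.einsum('kikj->ij', rho_SA.reshape(2, 2, 2, 2))  # trace out S

I_SA = vn_entropy(rho_S) + vn_entropy(rho_A) - vn_entropy(rho_SA)
# Perfect classical correlation: the mutual information equals H_S.
assert np.isclose(I_SA, vn_entropy(rho_S))

# Before decoherence, alpha|up,A_up> + beta|down,A_down> is pure, so
# H_SA = 0 and the mutual information is twice as large: I = 2 H_S.
phi = np.array([np.sqrt(p_up), 0.0, 0.0, np.sqrt(p_down)])
rho_pure = np.outer(phi, phi)
I_pure = (vn_entropy(np.einsum('ikjk->ij', rho_pure.reshape(2, 2, 2, 2)))
          + vn_entropy(np.einsum('kikj->ij', rho_pure.reshape(2, 2, 2, 2)))
          - vn_entropy(rho_pure))
assert np.isclose(I_pure, 2 * vn_entropy(rho_S))
```

The pre-decoherence excess above HS is purely quantum; decoherence by E removes it, leaving the classical value HS.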

We have already noted the special role of the pointer observable. It is stable and hence it leaves behind information-theoretic progeny—multiple imprints, copies of the pointer states—in the environment. By contrast, complementary observables are destroyed by the interaction with a single subsystem of E. They can, in principle, still be accessed, but only when all of the environment is measured. Indeed, because we are dealing with a quantum system, things are much worse than that: the environment must be measured in precisely the right (typically global) basis to allow for such a reconstruction. Otherwise, the accumulation of errors over multiple measurements will lead to an incorrect conclusion and re-prepare the state of the system and environment, so that it is no longer a record of the state of S, and the phase information is irretrievably lost.

(b). Objective reality from redundant information

Quantum Darwinism was introduced relatively recently. Previous studies of the records ‘kept’ by the environment focused on their effect on the state of the system, and not on their utility. Decoherence is a case in point, as are some of the studies of the decoherent histories approach [56,57]. The exploration of quantum Darwinism in specific models started at the beginning of this millennium [58–62]. We do not intend to review all of the results obtained to date in detail. The basic conclusion of these studies is, however, that the dynamics responsible for decoherence is also capable of imprinting multiple copies of the pointer basis on the environment. Moreover, while decoherence is always implied by quantum Darwinism, the reverse need not be true: one can easily imagine situations where the environment is completely mixed, and thus cannot be used as a communication channel, but would still suppress quantum coherence in the system.

For an environment consisting of many subsystems, E = E(1)E(2)…E(N), the initial state (α|↑⟩ + β|↓⟩)|ε0(1)⟩|ε0(2)⟩…|ε0(N)⟩ evolves into a ‘branching state’:

|ΦSE⟩ = α|↑⟩|ε↑(1)⟩|ε↑(2)⟩…|ε↑(N)⟩ + β|↓⟩|ε↓(1)⟩|ε↓(2)⟩…|ε↓(N)⟩.    (5.2)

Linearity assures that all branches persist: collapse to one outcome is not in the cards. However, a large E can disseminate information about the system. The state |ΦSE⟩ represents many records inscribed in its fragments—collections of subsystems of E (figure 3). This means that the state of S can be found out by many observers, independently and indirectly—hence, without disturbing S. This is how evidence of objective existence arises in our quantum world.

Figure 3.

Quantum Darwinism recognizes that environments consist of many subsystems and that observers acquire information about the system of interest S by intercepting copies of its pointer states deposited in E as a result of decoherence. (a) Decoherence paradigm: the Universe is divided into a system and the environment. (b,c) Quantum Darwinism: the environment consists of elementary subsystems—subenvironments. The latter can be combined into fragments that each have nearly complete information about the system. Redundancy is the number of such fragments. (Online version in colour.)

An environment fragment F can act as an apparatus with a (possibly incomplete) record of S. When the rest of the environment, E/F, is traced out, S decoheres, and the reduced density matrix describing the joint state of S and F is

ρSF = TrE/F |ΦSE⟩⟨ΦSE| = |α|²|↑⟩⟨↑||F↑⟩⟨F↑| + |β|²|↓⟩⟨↓||F↓⟩⟨F↓|.    (5.3)

When ⟨F↑|F↓⟩ = 0, F contains a perfect record of the preferred states of the system. In principle, each subsystem of E may be enough to reveal its state, but this is unlikely. Typically, one must collect many subsystems of E into F to find out about S.

The redundancy of the data about pointer states in E determines how many times the same information can be independently extracted—it is a measure of objectivity. The key question of quantum Darwinism is then: how many subsystems of E—what fraction of E—does one need to find out about S? The answer is provided by the mutual information I(S:Ff), the information about S available from Ff, a fraction f = m/N of E (where m and N are the numbers of subsystems in the fragment and in the whole of E, respectively).

In the case of perfect correlation, a single subsystem of E would suffice, as I(S:Ff) jumps to HS already at f = 1/N. The data in additional subsystems of E are then redundant. Usually, however, larger fragments of E are needed to find out enough about S. The red line in figure 4 illustrates this: I(S:Ff) still approaches HS, but only gradually. The length of this plateau can be measured in units of fδ, the initial rising portion of I(S:Ff). It is defined with the help of the information deficit δ that observers tolerate:

I(S:Ffδ) ≥ (1 − δ)HS.    (5.4)

Redundancy is the number of such records of S in E:

ℛδ = 1/fδ.    (5.5)

ℛδ sets the upper limit on how many observers can find out the state of S from E independently and indirectly. In models [58–65] (especially photon scattering, analysed by extending the decoherence model of Joos & Zeh [66]), ℛδ is huge [63–65] and depends on δ only weakly (logarithmically).
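The characteristic shape of the partial information plot—rapid rise, classical plateau near HS, and a final jump to 2HS when all of E is captured—can be reproduced in a toy branching-state simulation. The following sketch (the environment size, amplitudes and per-qubit record overlap are our own choices, not a model from the paper) computes I(S:Ff) by brute-force partial traces:

```python
import numpy as np
from functools import reduce
from numpy.linalg import eigvalsh

def vn_entropy(rho):
    """von Neumann entropy in bits: H(rho) = -Tr[rho log2 rho]."""
    evals = eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

def trace_last(rho, d_keep, d_rest):
    """Trace out the trailing d_rest-dimensional factor."""
    return np.einsum('ikjk->ij', rho.reshape(d_keep, d_rest, d_keep, d_rest))

def trace_first(rho, d_rest, d_keep):
    """Trace out the leading d_rest-dimensional factor."""
    return np.einsum('kikj->ij', rho.reshape(d_rest, d_keep, d_rest, d_keep))

N = 8                                      # environment qubits (our choice)
alpha, beta = np.sqrt(0.4), np.sqrt(0.6)   # branch amplitudes (our choice)
c = 0.5                                    # per-qubit overlap of the two records

up, down = np.array([1.0, 0.0]), np.array([0.0, 1.0])
e0, e1 = np.array([1.0, 0.0]), np.array([c, np.sqrt(1 - c ** 2)])

def kron_all(vecs):
    return reduce(np.kron, vecs)

# Branching state: alpha|up>|e0...e0> + beta|down>|e1...e1>.
psi = alpha * kron_all([up] + [e0] * N) + beta * kron_all([down] + [e1] * N)
rho_full = np.outer(psi, psi)

def mutual_info(m):
    """I(S:F_f) for a fragment of the first m environment qubits (f = m/N)."""
    rho_SF = trace_last(rho_full, 2 ** (1 + m), 2 ** (N - m))
    rho_S = trace_last(rho_SF, 2, 2 ** m)
    rho_F = trace_first(rho_SF, 2, 2 ** m)
    return vn_entropy(rho_S) + vn_entropy(rho_F) - vn_entropy(rho_SF)

H_S = vn_entropy(trace_last(rho_full, 2, 2 ** N))
info = [mutual_info(m) for m in range(N + 1)]

assert info[0] < 1e-9                # no fragment, no information
assert info[2] > 0.85 * H_S          # rapid initial rise ...
assert abs(info[4] - H_S) < 0.05     # ... then the classical plateau near H_S
assert np.isclose(info[N], 2 * H_S)  # jump to 2 H_S only with ALL of E
```

The final assertion reflects the antisymmetry discussed below: the purely quantum half of the information becomes accessible only when the whole environment is captured.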

Figure 4.

Information about the system contained in a fraction f of the environment. The red plot (with a plateau) shows a typical I(S:Ff) established by decoherence. The rapid rise means that nearly all classically accessible information is revealed by a small fraction of E. It is followed by a plateau: additional fragments only confirm what is already known. Redundancy ℛδ is the number of such independent fractions. The green plot shows I(S:Ff) for a random state in the composite Hilbert space of S and E. (Online version in colour.)

This is ‘quantum spam’: ℛδ imprints of the pointer states are broadcast through the environment. Many observers can access them independently and indirectly, assuring the objectivity of the pointer states of S. Repeatability is key: states must survive copying to produce many imprints.

(c). Discussion

Our discussion of quantum jumps shows when, in spite of the no-cloning theorem [67,68], repeatable copying is possible. Discrete preferred states set the stage for quantum jumps. Copying yields branches of records inscribed in subsystems of E. An initial superposition yields a superposition of branches, equation (5.2), so there is no literal collapse. However, fragments of E can reveal only one branch (and not their superposition). Such evidence will suggest a ‘quantum jump’ from the superposition to a single outcome, in accord with postulate (iv).

Not all environments are good in this role of a witness. Photons excel: they do not interact with the air or with each other, faithfully passing on information. A small fraction of the photon environment usually reveals all we need to know. Scattering of sunlight quickly builds up redundancy: a 1 μm dielectric sphere in a spatial superposition of extent 1 μm increases ℛδ by approximately 10^8 every microsecond [63,64]. The mutual information plot illustrating this case is shown in figure 5.

Figure 5.

The quantum mutual information I(S:Ff) versus fragment size f at different elapsed times for an object illuminated by point-source black-body radiation. Individual curves are labelled by the time t in units of the decoherence time τD. For t ≲ τD (red dashed lines), the information about the system available in the environment is low. The linearity in f means each piece of the environment contains new, independent information. For t > τD (blue solid lines), the shape of the partial information plot indicates redundancy; the first few pieces of the environment increase the information, but additional pieces only confirm what is already known. (Online version in colour.)

Air is also good at decohering, but its molecules interact, scrambling the acquired data. Objects of interest scatter both air molecules and photons, so both acquire information about position, and favour similar localized pointer states.

Quantum Darwinism shows why it is so hard to undo decoherence [69]. Plots of the mutual information I(S:Ff) for an initially pure S and E are antisymmetric (figure 4) around f = 1/2 and I = HS [58]. Hence, the counterpoint of the initial quick rise at f ≤ fδ is a quick rise at f ≥ 1 − fδ, as the last few subsystems of E are included in the fragment F that by now contains nearly all of E. This is because an initially pure composite of S and E remains pure under unitary evolution, so HSE = 0, and I(S:Ff) must reach 2HS at f = 1. Thus, a measurement on all of S and E could confirm its purity in spite of the decoherence caused by E for all f ≤ 1 − fδ. However, to verify this, one has to intercept and measure all of S and E in a way that reveals the pure state, equation (5.2). Other measurements destroy the phase information. So, undoing decoherence is in principle possible, but the required resources and foresight preclude it.

In quantum Darwinism, a decohering environment acts as an amplifier, inducing a branch structure of |ψ_SE⟩ distinct from typical states in the Hilbert space of SE: I(S:F_f) of a random state is given by the green line in figure 4, with no plateau or redundancy. Antisymmetry means that I(S:F_f) ‘jumps’ at f = 1/2 to nearly 2H_S.
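The contrast with a typical (random) state can also be checked numerically. The sketch below (my illustration, under the assumption that a Gaussian random vector is a good stand-in for a Haar-random pure state) computes I(S:F_f) for one "system" qubit in a random joint state with eight "environment" qubits: there is no plateau, and the mutual information jumps near f = 1/2.

```python
import numpy as np

def entropy(rho):
    """Von Neumann entropy in bits."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log2(w)))

def reduced(psi, keep, n):
    """Reduced density matrix of the qubits listed in `keep` (n qubits total)."""
    t = psi.reshape([2] * n)
    drop = [q for q in range(n) if q not in keep]
    rho = np.tensordot(t, t.conj(), axes=(drop, drop))
    return rho.reshape(2 ** len(keep), -1)

rng = np.random.default_rng(0)
n = 9                                    # "system" qubit 0 plus 8 env qubits
psi = rng.normal(size=2 ** n) + 1j * rng.normal(size=2 ** n)
psi /= np.linalg.norm(psi)               # Haar-like random pure state of SE

H_S = entropy(reduced(psi, [0], n))
Is = []
for m in range(1, n):
    F = list(range(1, 1 + m))
    Is.append(H_S + entropy(reduced(psi, F, n))
              - entropy(reduced(psi, [0] + F, n)))
print([round(I, 2) for I in Is])
# no plateau: I stays near 0 for small fragments, then jumps towards 2 H_S
```

Small fragments of a random state carry essentially no information about S; the record structure that makes fragments informative is precisely what the branching states of quantum Darwinism have and random states lack.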

Environments that decohere S, but scramble information because of interactions between their subsystems (e.g. air), eventually approach such random states. Quantum Darwinism is possible only when the information about S is preserved in fragments of E, so that it can be recovered by observers. There is no need for perfection: partially mixed environments or imperfect measurements correspond to noisy communication channels: their capacity is depleted, but we can still get the message [70,71].

Quantum Darwinism settles the issue of the origin of classical reality by accounting for all of the operational symptoms of objective existence in a quantum Universe: a single quantum state cannot be found out through a direct measurement. However, pointer states usually leave multiple records in the environment. Observers can use these records to find out the (pointer) state of the system of interest. Observers can afford to destroy photons while reading the evidence—the existence of multiple copies implies that other observers can access the information about the system indirectly and independently, and that they will all agree about the outcome. This is, I believe, how objective existence arises in our quantum world.2

6. Discussion: frequently asked questions

The subject of this paper has a long history. As a result, there are different ways of talking, thinking and writing about it. It is almost as if different points of view have developed different languages, so one may find it difficult to understand the ideas: one often has to learn ‘the other language’ used to discuss the same problem. This is further complicated by the fact that all of these languages use essentially the same words, but charged with very different meanings. Concepts like ‘existence’, ‘reality’ or ‘state’ are good examples.

The aim of this section is to acknowledge this problem and to deal with it to the extent possible within the framework of a brief guide. We shall do that in a way inspired by the modern approach to languages (and to travel guides): rather than study vocabulary and grammar, we shall use ‘conversations’ based on a few ‘frequently asked questions’ (FAQs). The hope is that this exercise will provide the reader with some useful hints of what is meant by certain phrases. This is very much in the spirit of the ‘travel guide’, where a collection of frequently used expressions is often included.

FAQ 1: What is the difference between ‘decoherence’ and ‘einselection’?

Decoherence is the process of the loss of phase coherence caused by the interaction between the system and the environment. Einselection is an abbreviation of ‘environment-induced superselection’, which designates the selection of a preferred set of pointer states that are immune to decoherence. Decoherence will often (but not always) result in einselection. For instance, an interaction that commutes with a certain observable of a system will preserve eigenstates of that pointer observable, pointer states that are einselected, and do not decohere. By contrast, superpositions of such pointer states will decohere. This picture can be (and generally will be) complicated by the evolution induced by the Hamiltonian of the system, so that perfect pointer states will not exist, but approximate pointer states will still be favoured—will be much more stable than their superpositions. There are also cases when there is decoherence, but it treats all the states equally badly, so that there is no einselection, and there are no pointer states. A perfect depolarizing channel [32] is an example of such decoherence that does not lead to einselection. Section 3 of this paper emphasizes the connection between predictability and einselection, and leads to a derivation of preferred states that does not rely on Born’s Rule.
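The statement that an interaction commuting with a pointer observable preserves its eigenstates while decohering their superpositions can be made concrete in a toy model. The sketch below is my illustration (not from the paper), assuming a simple monitoring scheme: each environment qubit, initially |0⟩, is rotated by an angle θ_k only when the system qubit is |1⟩. The coupling commutes with σ_z, so |0⟩ and |1⟩ are pointer states; the decoherence factor is the overlap of the conditional environment records, ∏_k cos(θ_k/2).

```python
import numpy as np

def monitored(alpha, beta, thetas):
    """System qubit alpha|0> + beta|1> monitored by environment qubits that
    start in |0>: env qubit k is rotated by theta_k only if the system is |1>.
    The coupling commutes with sigma_z, so |0> and |1> are pointer states.
    Returns the system's reduced density matrix."""
    # overlap of the conditional records: <E_0|E_1> = prod_k cos(theta_k / 2)
    gamma = np.prod(np.cos(np.asarray(thetas) / 2.0))   # decoherence factor
    return np.array([[abs(alpha) ** 2, alpha * np.conj(beta) * gamma],
                     [np.conj(alpha) * beta * gamma, abs(beta) ** 2]])

thetas = [2.0] * 10                                     # ten env qubits, strong coupling

rho_pointer = monitored(1, 0, thetas)                   # pointer state |0>
rho_super = monitored(1 / np.sqrt(2), 1 / np.sqrt(2), thetas)

p_pointer = np.trace(rho_pointer @ rho_pointer).real    # purity of pointer state
p_super = np.trace(rho_super @ rho_super).real          # purity of superposition
print(p_pointer)   # 1.0: the pointer state is immune to the monitoring
print(p_super)     # ~0.5: the superposition has decohered into a mixture
```

The pointer state emerges unscathed, while the off-diagonal terms of the superposition are suppressed by γ = ∏ cos(θ_k/2), which shrinks rapidly with the number of monitoring subsystems.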

FAQ 2: Why does axiom (iv) conflict with the ‘objective existence’ of quantum states?

The criterion for objective existence used here is pragmatic and operational: finding out a state without prior knowledge is a necessary condition for a state to objectively exist [17,58–62]. Classical states are thought to exist in this sense. Quantum states do not: quantum measurement yields an outcome—but, according to axiom (iv), this is one of the eigenstates of the measured observable, and not a pre-existing state of the system. Moreover, according to axiom (iii) (or the collapse part of (iv)), measurement re-prepares the system in one of the eigenstates of the measured observable. A sufficient condition for objective existence is the ability of many observers to independently find out the state of the system without prior knowledge, and to agree about it. Quantum Darwinism makes this possible.

FAQ 3: What is the relation between the preferred states derived using their predictability (axiom (iii)) in §3 and the familiar ‘pointer states’ obtained from einselection?

In the idealized case (e.g. when perfect pointer states exist), the two sets of states are necessarily the same. This is because the key requirement (stability, in spite of the monitoring/copying by the environment or an apparatus) that was used in the original definition of pointer states in [9] is essentially identical to ‘repeatability’—the key ingredient of axiom (iii). It follows that, when interactions commute with certain observables (e.g. because they depend on them), these observables are constants of motion under such an interaction Hamiltonian, and they will be left intact. For example, interactions that depend on position will favour (einselect) localized states and destroy (decohere) non-local superpositions. Using a predictability sieve to implement einselection [13,17,19,44] is a good way to appreciate this.

FAQ 4: Repeatability of measurements, axiom (iii), seems to be a very strong assumption. Can it be relaxed (e.g. to include POVMs)?

Non-demolition measurements are very idealized (and hard to implement). In the interest of brevity, we have imposed a literal reading of axiom (iii). This is very much in the spirit of Dirac’s textbook, but it is also more restrictive than necessary [39], and does not cover situations that arise most often in the context of laboratory measurements. All that is needed in practice is that the record made in the apparatus (e.g. the position of its pointer) must be ‘repeatably accessible’. Frequently, one does not care about repeated measurements of the quantum system (which may even be destroyed in the measurement process). Axiom (iii) captures in fact the whole idea of a record—it has to persist in spite of being read, copied, etc. So one can impose the requirement of repeatability at the macroscopic level of an apparatus pointer with a much better physical justification than Dirac did for the microscopic measured system. The proof of §3 then goes through essentially as before, but the details (and how far one can take the argument) depend on specific settings. This ‘transfer of the responsibility for repeatability’ from the quantum system to a (still quantum, but possibly macroscopic) apparatus allows one to incorporate non-orthogonal measurement outcomes (such as POVMs) very naturally: the apparatus entangles with the system and then acts as an ancilla in the usual projective measurement implementation of POVMs (e.g. [32]).
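The notion of a ‘repeatably accessible’ record can be illustrated with a toy simulation (my sketch, with hypothetical labels A0/A1 for the apparatus pointer): after a CNOT-like premeasurement correlates the pointer with the system, projective readouts of the pointer alone always agree with each other, whatever happens to the system afterwards.

```python
import numpy as np

rng = np.random.default_rng(42)

# System alpha|0> + beta|1>; the apparatus pointer starts in |A0> and a
# CNOT-like premeasurement correlates it with the system:
alpha, beta = 0.6, 0.8
branches = {('0', 'A0'): alpha, ('1', 'A1'): beta}   # alpha|0,A0> + beta|1,A1>

def read_pointer(branches):
    """Projective readout of the pointer alone (Born weights over A0/A1)."""
    p = {o: sum(abs(a) ** 2 for (s, ap), a in branches.items() if ap == o)
         for o in ('A0', 'A1')}
    outcome = rng.choice(list(p), p=list(p.values()))
    post = {k: v for k, v in branches.items() if k[1] == outcome}
    norm = np.sqrt(sum(abs(v) ** 2 for v in post.values()))
    return outcome, {k: v / norm for k, v in post.items()}

o1, branches = read_pointer(branches)
o2, branches = read_pointer(branches)
print(o1, o2)   # always identical: the record is repeatably accessible
```

The first readout selects a branch; every subsequent readout merely re-reads the surviving record, which is the macroscopic-pointer version of repeatability that axiom (iii) requires.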

FAQ 5: Probabilities—why do they enter? One may even say that, in the Everettian setting, ‘everything happens’, so why are they needed and what do they refer to?

Axiom (iii) interpreted in the relative state sense ‘does the job’ of the collapse part of axiom (iv). That is, when an observer makes a measurement of an observable, he will record an outcome. Repetition of that measurement will confirm his previous record. That leads to the symmetry breaking derived in §3 and captures the essence of the ‘collapse’ in the relative state setting [17,39]. So, when an observer is about to measure a state (e.g. prepared previously by another measurement), he knows that there are as many possible outcomes as there are eigenvalues of the measured observable, but that he will end up recording just one of them. Thus, even if ‘everything happens’, a specific observer would remember a specific sequence of past events that happened to him. The question about the probability of an outcome—a future event that is about to happen—is then natural, and it is most naturally posed in this ‘just before the measurement’ setting. The concept of probability does not (need not!) concern alternatives that already exist (as in classical discussions of probability, or some ‘many worlds’ discussions). Rather [42,78], it concerns future potential events, one of which will become a reality upon a measurement.

FAQ 6: Derivation of Born’s Rule here and in [3,17,41,42], and even derivation of the orthogonality of outcome states, use scalar products. But scalar product appears in Born’s Rule. Isn’t that circular?

Scalar product is an essential part of the mathematics of quantum theory. Derivation of Born’s Rule relates probabilities of various outcomes to amplitudes of the corresponding states using symmetries of entanglement. So it provides a connection between the mathematics of quantum theory and experiments—physics. Hilbert space (with the scalar product) is certainly an essential part of the input. And so are entangled states and entangling interactions. They appear whenever information is transferred between systems (e.g. in measurements, but also as a result of decoherence). All derivations proceed in such a way that only two values of the scalar product—0 and 1—are used as input. Both correspond to certainty.

FAQ 7: How can one infer probability from certainty?

Symmetry is the key idea. When there are several (say, n) mutually exclusive events that are a part of a state invariant under their swaps, their probabilities must be equal. When these events exhaust all the possibilities, the probability of any one of them must be 1/n. In contrast to the classical case discussed by Laplace, the tensor nature of states of composite quantum systems allows one to exhibit objective symmetries [3,17,41,42]. Thus, one can dispense with Laplace’s subjective ignorance (his ‘principle of indifference’), and work with objective symmetries of entangled states. The key to the derivation of probabilities is a pair of proofs: (i) that the phases of Schmidt coefficients do not matter (this amounts to decoherence, but is established without the reduced density matrix and partial trace, the usual Born’s Rule-dependent tools of decoherence theory); and (ii) that equal amplitudes imply equal probabilities. Both proofs [3,17,41,42] are based on entanglement-assisted invariance (or envariance). This symmetry allows one to show that certain (Bell state-like) entangled states of the whole imply equal probabilities for local states. This is done using symmetry and certainty as basic ingredients. In particular, one relies on the ability to undo the effect of local transformations (such as a ‘swap’) by acting on another part of the composite system, so that the pre-existing state of the whole is recovered with certainty. Using envariance, one can even show that an amplitude of 0 necessarily implies a probability of 0 (i.e. impossibility) of the corresponding outcome.3 One can also prove additivity of probabilities [42] using a modest assumption—the fact that the probabilities of an event and its complement sum up to 1.
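The swap-and-counterswap step at the heart of envariance is easy to verify directly. A minimal numpy sketch (my illustration) for a Bell-like state: a swap acting on S alone changes the global state, but a counterswap on E restores it with certainty.

```python
import numpy as np

X = np.array([[0, 1], [1, 0]])      # the "swap" |0><1| + |1><0| of the two outcomes
I2 = np.eye(2)

# Bell-like even state of system S (first qubit) and environment E (second)
psi = np.array([1, 0, 0, 1]) / np.sqrt(2)       # (|00> + |11>)/sqrt(2)

swapped = np.kron(X, I2) @ psi                  # swap acting on S alone
restored = np.kron(I2, X) @ swapped             # counterswap on E undoes it

overlap = abs(np.vdot(psi, restored))
print(overlap)   # 1.0: the pre-existing global state is recovered with certainty
```

Since an operation on E alone undoes the swap on S, the swap could not have changed anything physically significant about E, and, by symmetry, the two outcomes on S must be equiprobable.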

FAQ 8: Why are the probabilities of two local states in a Bell-like entangled state equal? Is the invariance under relabelling of the states the key to the proof?

Envariance is needed precisely because relabelling is not enough. For instance, states can have intrinsic properties that they ‘carry’ with them even when they get relabelled. Thus, a superposition of ground and excited states, (|g⟩ + |e⟩)/√2, is invariant under relabelling, but this does not change the fact that the energy of the ground state |g⟩ is less than the energy of the excited state |e⟩. So there may be intrinsic properties of quantum states (such as energy) that ‘trump’ relabelling, and it is a priori possible that probability is like energy in this respect. This is where envariance saves the day. To see this, consider a Schmidt decomposition of an entangled state |ψ_SE⟩ = (|0⟩|ε_0⟩ + |1⟩|ε_1⟩)/√2, where the first ket belongs to S and the second to E. The probabilities of Schmidt partners must be equal, p(0) = p(ε_0) and p(1) = p(ε_1). (This ‘makes sense’, but can be established rigorously, e.g. by showing that the amplitude of |1⟩ vanishes in the state left after a projective measurement that yields |ε_0⟩ on E.) Moreover, after a swap |0⟩⟨1| + |1⟩⟨0| acting on S, in the resulting state (|1⟩|ε_0⟩ + |0⟩|ε_1⟩)/√2, one has p(1) = p(ε_0) and p(0) = p(ε_1). But the probabilities in the environment E (which was not acted upon by the swap) could not have changed. It therefore follows that p(0) = p(1) = p(ε_0) = p(ε_1) = 1/2, where the last equality assumes (the usual) normalization of probabilities with p(certain event) = 1.
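The step ‘the probabilities in E could not have changed’ can be checked numerically. In the sketch below (mine, identifying |ε_0⟩, |ε_1⟩ with the computational basis of E), the reduced state of E is computed before and after the swap on S: it is unchanged, and its diagonal forces p(ε_0) = p(ε_1) = 1/2.

```python
import numpy as np

def rho_E(psi):
    """Reduced density matrix of E for a two-qubit state (S first, E second)."""
    m = psi.reshape(2, 2)                        # amplitude tensor, indices (s, e)
    return np.einsum('se,sf->ef', m, m.conj())   # trace over the system index

X = np.array([[0, 1], [1, 0]])
psi = np.array([1, 0, 0, 1]) / np.sqrt(2)        # (|0>|e0> + |1>|e1>)/sqrt(2)
swapped = np.kron(X, np.eye(2)) @ psi            # swap |0><1| + |1><0| on S only

print(np.allclose(rho_E(psi), rho_E(swapped)))   # True: E untouched by the swap
print(np.diag(rho_E(psi)).real)                  # [0.5 0.5]: equal probabilities
```

Because the swap exchanges which system outcome is paired with which environment record while leaving E itself invariant, the pairing argument in the text pins all four probabilities to 1/2.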

FAQ 9: Probabilities are often justified by counting, as in the relative frequency approach. Is counting involved in the envariant approach?

There is a sense in which the envariant approach is based on counting, but one does not count the actual events (as is done in statistics) or members of an imaginary ensemble (as is done in the relative frequency approach) but, rather, one counts the number of potential invariantly swappable (and, hence, equiprobable) mutually exclusive events. Relative frequency statistics can be recovered (very much in the spirit of Everett) by considering branches in which a certain number of events of interest (e.g. detections of |↑⟩, ‘spin up’, etc.) has occurred. This allows one to quantify probabilities in the resulting fragment of the ‘multiverse’, with all of the branches, including the ‘maverick’ branches that have proved so difficult to handle in the past [24,25,27–31]. They are still there (as they certainly have every right to be!) but appear with probabilities that are very small, as can be established using envariance [42]. These branches need not be ‘real’ to do the counting—as before, it is quite natural to ask about probabilities before finding out (measuring) what actually happened.

FAQ 10: What is the ‘existential interpretation’? How does it relate to the ‘many worlds interpretation’?

The existential interpretation is an attempt to let quantum theory tell us how to interpret it by focusing on how effectively classical states can emerge from within our Universe, that is ‘quantum to the core’. Decoherence was a major step in solving this problem: it demonstrated that in open quantum systems only certain states (selected with the help of the environment that monitors such systems) are stable. They can persist, and therefore—in that very operational and ‘down to earth’ sense—exist. Results of decoherence theory (such as einselection and pointer states) are interpretation-independent. But decoherence was not fundamental enough—it rested on assumptions (e.g. Born’s Rule) that were unnatural for a theory that aims to provide a fundamental view of the origin of the classical realm starting with unitary quantum dynamics. Moreover, it did not go far enough: einselection focused on the stability of states in the presence of the environment, but it did not address the question of what states can survive measurement by the observer, and why. The developments described briefly in this ‘guide’ go in both directions. Axiom (iii), central in §3, focuses on repeatability (which is another symptom of persistence, and hence existence). The events it defines provide a motivation (and a part of the input) for the derivation of Born’s Rule sketched in §4. These two sections shore up ‘foundations’. Quantum Darwinism explains why states einselected by decoherence are detected by observers. Thus, it reaffirms the role of einselection by showing that pointer states are usually reproduced in many copies in the environment, and that observers find out the state of the system indirectly, by intercepting fragments of the environment (which now plays the role of a communication channel). These advances rely on unitary evolutions and Everett’s ‘relative state’ view of the collapse. However, none of these advances depends on adopting the orthodox ‘many worlds’ point of view, where each of the branches is ‘equally real’.

7. Conclusion

The advances discussed in this paper include a derivation of preferred pointer states (key to postulate (iv)) that does not rely on the usual tools of decoherence, the envariant derivation of probabilities (postulate (v)) and quantum Darwinism. Taken together, and in the right order, they show how the classical domain of our experience emerges from the quantum substrate. They complete what I term the existential interpretation based on the operational definition of objective existence, and justify confidence in quantum mechanics as the ultimate theory that needs no modifications to account for the emergence of the classical.

Of the three advances mentioned above, we have summed up the main idea of the first (the quantum origin of quantum jumps), provided an illustration of the second (the envariant origin of Born’s Rule), and briefly explained quantum Darwinism.

Everett’s insight—the realization that relative states settle the problem of collapse—was the key to these developments (and to progress in understanding fundamental aspects of decoherence). But it is important to be careful in specifying what exactly we need from Everett and his followers, and what can be left behind. There is no doubt that the concept of relative states is crucial. Perhaps even more important is the idea that one can apply quantum theory to anything—that there is nothing ab initio classical. But the combination of these two ideas does not yet force one to adopt a ‘many worlds interpretation’ in which all of the branches are equally real.

Quantum states combine ontic and epistemic attributes. They cannot be ‘found out’, so they do not exist in the same robust measurement-independent sense classical states were thought to exist. But once they are known, their existence can be confirmed. This interdependence of existence and information brings to mind two contributions of John Wheeler: his early assessment of the relative states interpretation (which he saw as an extension of Bohr’s ideas) [79], and also his ‘it from bit’ programme [80] (where information was the source of existence).

This interdependence of existence and information was very much in evidence in this paper. Stability, in spite of the (deliberate or accidental) information transfer, led to preferred pointer states, and is the essence of einselection. Entanglement deprives local states of information (which is transferred to correlations) and forces one to describe these local states in probabilistic terms, leading to Born’s Rule. Robust existence emerges (‘it from many bits’, to paraphrase Wheeler) through quantum Darwinism. The selective proliferation of information makes it immune to measurements, and allows einselected states to be found out indirectly—without endangering their existence.

Acknowledgements

I thank C. Jess Riedel and, especially, Michael Zwolak for stimulating discussions.

Footnotes

1

We note that envariance has been successfully tested in several recent experiments [5255].

2

There has been significant progress in the study of the acquisition and dissemination of information by environments [7277]. More detailed discussion of the results obtained in these papers is, unfortunately, beyond the scope of our brief review.

3

This is because, in a Schmidt decomposition that contains n such states with zero coefficients, one can always combine two of them to form a new state, which then appears with the other n−2 states, still with the amplitude of 0. This purely mathematical step should have no implications for the probabilities of the n−2 states that were not involved. Yet, there are now only n−1 states with equal coefficients. So the probability w of any state with zero amplitude has to satisfy nw=(n−1)w, which holds only for w=0 [3].

Data accessibility

This article has no additional data.

Competing interests

I declare I have no competing interests.

Funding

This research was funded by DoE through LDRD grant at Los Alamos, and, in part, by FQXi.

References

  • 1. Zurek WH. 2009. Quantum Darwinism. Nat. Phys. 5, 181–188. (doi:10.1038/nphys1202)
  • 2. Zurek WH. 2014. Quantum Darwinism, classical reality, and the randomness of quantum jumps. Phys. Today 67, 44–50. (doi:10.1063/PT.3.2550)
  • 3. Zurek WH. 2007. Relative states and the environment: einselection, envariance, quantum Darwinism, and the existential interpretation. (https://arxiv.org/abs/0707.2832)
  • 4. Zurek WH. In preparation. Decoherence, quantum Darwinism, and the quantum theory of the classical.
  • 5. Everett H III. 1957. ‘Relative state’ formulation of quantum mechanics. Rev. Mod. Phys. 29, 454–462. (doi:10.1103/RevModPhys.29.454)
  • 6. Everett H III. 1957. The theory of the universal wave function. PhD dissertation, Princeton University. (Reprinted in ref. 25.)
  • 7. Dirac PAM. 1958. Quantum mechanics. Oxford, UK: Clarendon Press.
  • 8. Bohr N. 1928. The quantum postulate and the recent development of atomic theory. Nature 121, 580–590. (doi:10.1038/121580a0)
  • 9. Zurek WH. 1981. Pointer basis of quantum apparatus: into what mixture does the wave packet collapse? Phys. Rev. D 24, 1516. (doi:10.1103/PhysRevD.24.1516)
  • 10. Zurek WH. 1982. Environment-induced superselection rules. Phys. Rev. D 26, 1862. (doi:10.1103/PhysRevD.26.1862)
  • 11. Zurek WH. 1991. Decoherence and the transition from quantum to classical. Phys. Today 44, 36. (doi:10.1063/1.881293)
  • 12. Zeh HD. 1990. Quantum mechanics and algorithmic complexity. In Complexity, entropy, and the physics of information (ed. WH Zurek), pp. 405–422. Redwood City, CA: Addison Wesley.
  • 13. Paz J-P, Zurek WH. 2001. Environment-induced decoherence and the transition from quantum to classical. In Coherent atomic matter waves, Les Houches lectures (eds R Kaiser, C Westbrook, F David), pp. 533–614. Berlin, Germany: Springer. (doi:10.1007/3-540-45338-5_8)
  • 14. Joos E, Zeh HD, Kiefer C, Giulini D, Kupsch J, Stamatescu I-O. 2003. Decoherence and the appearance of a classical world in quantum theory. Berlin, Germany: Springer. (doi:10.1007/978-3-662-05328-7)
  • 15. Zurek WH. 1993. Preferred states, predictability, classicality and the environment-induced decoherence. Prog. Theor. Phys. 89, 281–312. (doi:10.1143/ptp/89.2.281)
  • 16. Zurek WH. 1998. Decoherence, chaos, quantum–classical correspondence, and the algorithmic arrow of time. Phys. Scr. T76, 186–198. (doi:10.1238/Physica.Topical.076a00186)
  • 17. Zurek WH. 2003. Decoherence, einselection, and the quantum origins of the classical. Rev. Mod. Phys. 75, 715–775. (doi:10.1103/RevModPhys.75.715)
  • 18. Schlosshauer M. 2004. Decoherence, the measurement problem, and interpretations of quantum mechanics. Rev. Mod. Phys. 76, 1267–1305. (doi:10.1103/RevModPhys.76.1267)
  • 19. Schlosshauer M. 2007. Decoherence and the quantum-to-classical transition. Berlin, Germany: Springer. (doi:10.1007/978-3-540-35775-9)
  • 20. Zeh HD. 1970. On the interpretation of measurement in quantum theory. Found. Phys. 1, 69–76. (doi:10.1007/BF00708656)
  • 21. Zeh HD. 2006. Roots and fruits of decoherence. In Quantum decoherence (eds B Duplantier, J-M Raimond, V Rivasseau), pp. 151–175. Basel, Switzerland: Birkhäuser.
  • 22. Born M. 1926. Zur Quantenmechanik der Stoßvorgänge. Z. Phys. 37, 863–867. (doi:10.1007/BF01397477)
  • 23. Zurek WH. 2000. Einselection and decoherence from an information theory perspective. Ann. Phys. (Leipzig) 9, 855–864.
  • 24. DeWitt BS. 1971. The many-universes interpretation of quantum mechanics. In Foundations of quantum mechanics (ed. B d’Espagnat), pp. 211–262. New York, NY: Academic Press. (Reprinted in ref. 25.)
  • 25. DeWitt BS, Graham N. 1973. The many-worlds interpretation of quantum mechanics. Princeton, NJ: Princeton University Press.
  • 26. von Neumann J. 1932. Mathematical foundations of quantum theory. Translated from the German original by R. T. Beyer (Princeton University Press, Princeton, NJ, 1955).
  • 27. DeWitt BS. 1970. Quantum mechanics and reality. Phys. Today 23, 30–35. (doi:10.1063/1.3022331)
  • 28. Geroch R. 1984. The Everett interpretation. Noûs 18, 617–633. (doi:10.2307/2214880)
  • 29. Squires EJ. 1990. On an alleged ‘proof’ of the quantum probability law. Phys. Lett. A 145, 67–68. (doi:10.1016/0375-9601(90)90192-Q)
  • 30. Stein H. 1984. The Everett interpretation of quantum mechanics: many worlds or none? Noûs 18, 635–652. (doi:10.2307/2214881)
  • 31. Kent A. 1990. Against many-worlds interpretations. Int. J. Mod. Phys. A 5, 1745–1762. (doi:10.1142/S0217751X90000805)
  • 32. Nielsen MA, Chuang IL. 2000. Quantum computation and quantum information. Cambridge, UK: Cambridge University Press.
  • 33. Zurek WH. 1998. Decoherence, einselection and the existential interpretation (the rough guide). Phil. Trans. R. Soc. Lond. A 356, 1793–1821. (doi:10.1098/rsta.1998.0250)
  • 34. Deutsch D. 1999. Quantum theory of probability and decisions. Proc. R. Soc. Lond. A 455, 3129–3137. (doi:10.1098/rspa.1999.0443)
  • 35. Wallace D. 2003. Everettian rationality: defending Deutsch’s approach to probability in the Everett interpretation. Stud. Hist. Philos. Mod. Phys. 34, 415–439. (doi:10.1016/S1355-2198(03)00036-4)
  • 36. Saunders S. 2004. Derivation of the Born rule from operational assumptions. Proc. R. Soc. Lond. A 460, 1771–1788. (doi:10.1098/rspa.2003.1232)
  • 37. Forrester A. 2007. Decision theory and information propagation in quantum physics. Stud. Hist. Philos. Mod. Phys. 38, 815–831. (doi:10.1016/j.shpsb.2007.02.004)
  • 38. Dawid R, Thebault KPY. 2015. Many worlds: decoherent or incoherent? Synthese 192, 1559–1580. (doi:10.1007/s11229-014-0650-8)
  • 39. Zurek WH. 2007. Quantum origin of quantum jumps: breaking of unitary symmetry induced by information transfer in the transition from quantum to classical. Phys. Rev. A 76, 052110. (doi:10.1103/PhysRevA.76.052110)
  • 40. Zurek WH. 2013. Wave-packet collapse and the core quantum postulates: discreteness of quantum jumps from unitarity, repeatability, and actionable information. Phys. Rev. A 87, 052111. (doi:10.1103/PhysRevA.87.052111)
  • 41. Zurek WH. 2003. Environment-assisted invariance, entanglement, and probabilities in quantum physics. Phys. Rev. Lett. 90, 120404. (doi:10.1103/PhysRevLett.90.120404)
  • 42. Zurek WH. 2005. Probabilities from entanglement, Born’s rule p_k=|ψ_k|^2 from envariance. Phys. Rev. A 71, 052105. (doi:10.1103/PhysRevA.71.052105)
  • 43. Schlosshauer M, Fine A. 2005. On Zurek’s derivation of the Born rule. Found. Phys. 35, 197–213. (doi:10.1007/s10701-004-1941-6)
  • 44. Barnum H. 2003. No-signalling-based version of Zurek’s derivation of quantum probabilities: a note on ‘Environment-assisted invariance, entanglement, and probabilities in quantum physics’. (http://arxiv.org/abs/quant-ph/0312150)
  • 45. Herbut F. 2007. Quantum probability law from ‘environment-assisted invariance’ in terms of pure-state twin unitaries. J. Phys. A 40, 5949–5971. (doi:10.1088/1751-8113/40/22/013)
  • 46. Gleason AM. 1957. Measures on the closed subspaces of a Hilbert space. J. Math. Mech. 6, 885–893. (doi:10.1512/iumj.1957.6.56050)
  • 47. Laplace PS. 1820. A philosophical essay on probabilities. English translation of the French original by F. W. Truscott and F. L. Emory (Dover, New York, 1951).
  • 48. Gnedenko BV. 1968. The theory of probability. New York, NY: Chelsea.
  • 49. Zurek WH. 2011. Entanglement symmetry, amplitudes, and probabilities: inverting Born’s rule. Phys. Rev. Lett. 106, 250402. (doi:10.1103/PhysRevLett.106.250402)
  • 50. Deffner S, Zurek WH. 2016. Foundations of statistical mechanics from symmetries of entanglement. New J. Phys. 18, 063013. (doi:10.1088/1367-2630/18/6/063013)
  • 51. Zurek WH. In press. Eliminating ensembles from equilibrium statistical physics: Maxwell’s demon, Szilard’s engine, and thermodynamics via entanglement. Phys. Rep.
  • 52. Vermeyden L, Ma X, Lavoie J, Bonsma M, Sinha U, Laflamme R, Resch KJ. 2015. Experimental test of envariance. Phys. Rev. A 91, 012120. (doi:10.1103/PhysRevA.91.012120)
  • 53. Harris J, Bouchard F, Santamato E, Zurek WH, Boyd RW, Karimi E. 2016. Quantum probabilities from quantum entanglement: experimentally unpacking the Born rule. New J. Phys. 18, 053013. (doi:10.1088/1367-2630/18/5/053013)
  • 54. Deffner S. 2017. Demonstration of entanglement assisted invariance on IBM’s quantum experience. Heliyon 3, e00444. (doi:10.1016/j.heliyon.2017.e00444)
  • 55. Ferrari D, Amoretti M. 2018. Demonstration of envariance and parity learning on the IBM 16 qubit processor. (http://arxiv.org/abs/1801.02363)
  • 56. Gell-Mann M, Hartle JB. 1998. Strong decoherence. In Proc. 4th Drexel Conf. on Quantum Non-Integrability: The Quantum–Classical Correspondence (eds D-H Feng, B-L Hu). Hong Kong: International Press of Boston. (https://arxiv.org/abs/gr-qc/9509054)
  • 57. Halliwell JJ. 1999. Somewhere in the universe: where is the information stored when histories decohere? Phys. Rev. D 60, 105031. (doi:10.1103/PhysRevD.60.105031)
  • 58. Blume-Kohout R, Zurek WH. 2005. A simple example of ‘quantum Darwinism’: redundant information storage in many-spin environments. Found. Phys. 35, 1857–1876. (doi:10.1007/s10701-005-7352-5)
  • 59. Blume-Kohout R, Zurek WH. 2006. Quantum Darwinism: entanglement, branches, and the emergent classicality of redundantly stored quantum information. Phys. Rev. A 73, 062310. (doi:10.1103/PhysRevA.73.062310)
  • 60. Blume-Kohout R, Zurek WH. 2008. Quantum Darwinism in quantum Brownian motion. Phys. Rev. Lett. 101, 240405. (doi:10.1103/PhysRevLett.101.240405)
  • 61. Ollivier H, Poulin D, Zurek WH. 2004. Objective properties from subjective quantum states: environment as a witness. Phys. Rev. Lett. 93, 220401. (doi:10.1103/PhysRevLett.93.220401)
  • 62. Ollivier H, Poulin D, Zurek WH. 2005. Environment as a witness: selective proliferation of information and emergence of objectivity in a quantum universe. Phys. Rev. A 72, 042113. (doi:10.1103/PhysRevA.72.042113)
  • 63. Riedel CJ, Zurek WH. 2010. Quantum Darwinism in an everyday environment: huge redundancy in scattered photons. Phys. Rev. Lett. 105, 020404. (doi:10.1103/PhysRevLett.105.020404)
  • 64. Riedel CJ, Zurek WH. 2011. Redundant information from thermal illumination: quantum Darwinism in scattered photons. New J. Phys. 13, 073038. (doi:10.1088/1367-2630/13/7/073038)
  • 65. Zwolak M, Riedel CJ, Zurek WH. 2014. Amplification, redundancy, and the quantum Chernoff information. Phys. Rev. Lett. 112, 140406. (doi:10.1103/PhysRevLett.112.140406)
  • 66. Joos E, Zeh HD. 1985. The emergence of classical properties through interaction with the environment. Z. Phys. 59, 223–243. (doi:10.1007/BF01725541)
  • 67. Wootters WK, Zurek WH. 1982. A single quantum cannot be cloned. Nature 299, 802–803. (doi:10.1038/299802a0)
  • 68. Dieks D. 1982. Communication by EPR devices. Phys. Lett. A 92, 271–272. (doi:10.1016/0375-9601(82)90084-6)
  • 69. Zwolak M, Zurek WH. 2013. Complementarity of quantum discord and classically accessible information. Sci. Rep. 3, 1729. (doi:10.1038/srep01729)
  • 70. Zwolak M, Quan H-T, Zurek WH. 2009. Quantum Darwinism in a mixed environment. Phys. Rev. Lett. 103, 110402. (doi:10.1103/PhysRevLett.103.110402)
  • 71. Zwolak M, Quan H-T, Zurek WH. 2010. Redundant imprinting of information in nonideal environments: objective reality via a noisy channel. Phys. Rev. A 81, 062110. (doi:10.1103/PhysRevA.81.062110)
  • 72. Paz JP, Roncaglia AJ. 2009. Redundancy of classical and quantum correlations during decoherence. Phys. Rev. A 80, 042111. (doi:10.1103/PhysRevA.80.042111)
  • 73. Brandao FGSL, Piani M, Horodecki P. 2015. Generic emergence of classical features in quantum Darwinism. Nat. Commun. 6, 7908. (doi:10.1038/ncomms8908)
  • 74. Riedel CJ, Zurek WH, Zwolak M. 2016. The objective past of a quantum universe: redundant records of consistent histories. Phys. Rev. A 93, 032126. (doi:10.1103/PhysRevA.93.032126)
  • 75. Riedel CJ. 2017. Classical branch structure from spatial redundancy in a many-body wavefunction. Phys. Rev. Lett. 118, 120402. (doi:10.1103/PhysRevLett.118.120402)
  • 76.Pleasance G, Garraway BM. 2017. An application of quantum Darwinism to a structured environment. Phys. Rev. A 96, 062105 ( 10.1103/PhysRevA.96.062105) [DOI] [Google Scholar]
  • 77.Knott PA, Tufarelli T, Piani M, Adesso G.2018. Generic emergence of objectivity of observables in infinite dimensions. (http://arxiv.org/abs/1802.05719. ) [DOI] [PubMed]
  • 78.Sebens CT, Carroll SM. 2018. Self-locating uncertainty and the origin of probability in Everettian quantum mechanics. Br. J. Phil. Sci. 69, 25–74. ( 10.1093/bjps/axw004) [DOI] [Google Scholar]
  • 79.Wheeler JA. 1957. Assessment of Everett’s ‘relative state’ formulation of quantum theory. Rev. Mod. Phys. 29, 463–465. ( 10.1103/RevModPhys.29.463) [DOI] [Google Scholar]
  • 80.Wheeler JA. 1990. Information, physics, quantum: the search for links. In Complexity, entropy, and the physics of information (ed. WH Zurek), pp. 3–28. Redwood City, CA: Addison Wesley.

Data Availability Statement

This article has no additional data.


Articles from Philosophical transactions. Series A, Mathematical, physical, and engineering sciences are provided here courtesy of The Royal Society