The Everett Interpretation of Quantum Mechanics:
50 years on

Philosophy Faculty, Oxford University, July 19-21, 2007

Edited by Andy Ross

Quantum mechanics has been with us for over 80 years, and still there is no consensus on what it means. The Everett interpretation has been with us for 50 of those years and is now arguably the simplest, most credible explanation we have of the world. It requires no additional assumptions, imposes no conceptual division between observers and observed, applies to the universe as a whole, and naturally explains how probabilities arise in quantum mechanics.

July saw the 50th anniversary of the publication of Hugh Everett III's paper, "Relative State Formulation of Quantum Mechanics". This was an opportunity for the leading advocates and critics to come together and debate the Everett interpretation.

Sponsored by FQXi and hosted by the Philosophy Faculty of Oxford University, the conference brought together 40 of the world's top academics for three days to ask whether Everett's explanation of quantum mechanics had at last come of age.
 

Abstracts

The Everett Interpretation: 50 years on

Simon Saunders

The Everett interpretation of quantum mechanics as presently construed breaks down into an account of structure and ontology, a theory of evidence, and a theory of probability. I present some background in the context of Everett's original paper.

Decoherence and ontology

David Wallace

Decoherence is central to the Everett interpretation. Within the unitarily evolving quantum state, decoherence creates autonomous, stable, quasi-classical systems which are approximately isomorphic to classical universes. These are the many worlds of the Everett interpretation – worlds in the sense of approximately isolated, approximately classical chunks of a larger reality. To see this requires only the mathematics of decoherence theory and the right understanding of higher-order ontology. I claim that, given decoherence, unitary quantum mechanics is and must be a many-worlds theory.

Can the world be only wave-function?

Tim Maudlin

A common understanding of the many worlds theory holds that it is ontologically monistic, postulating only the existence of the wavefunction and nothing else. But it is hard to see how such an austere ontology can make comprehensible contact with the experimental facts.

Two dogmas about quantum mechanics

Jeffrey Bub and Itamar Pitowsky

We argue for an information-theoretic formulation of quantum mechanics. We consider a ‘no cloning’ principle as the crucial principle demarcating classical from non-classical information theories. We show that ‘no cloning’ entails that any measurement must sometimes irreversibly change the state of the measured system. So no complete dynamical account of measurement is possible if the ‘no cloning’ principle is true. We reject the dogmas that the process of measurement should always be open to a complete dynamical analysis and that the quantum state is analogous to the classical state as a representation of physical reality. We show that cloning is possible in both Bohmian and Everettian versions of quantum mechanics.

Everett and evidence

Wayne Myrvold and Hilary Greaves

The Everett interpretation must do justice to statistical evidence in which observed relative frequencies closely match calculated probabilities in a theory. Since, on the Everett interpretation, all outcomes with nonzero amplitude are actualized on different branches, it is not obvious that sense can be made of ascribing probabilities to outcomes of experiments. It is incumbent on the Everettian either to make sense of such probabilities or to explain how the usual statistical analysis of experimental results continues to count as evidence for quantum mechanics. We give an account of theory confirmation that applies to branching-universe theories but does not presuppose their correctness.

Probability in the Everett picture

David Z. Albert

I try to sharpen a number of worries about the possibility of making sense of quantum-mechanical probabilities in the Everett picture.

Time-symmetric quantum mechanics and the many-worlds interpretation

Lev Vaidman

I introduce a formalism that describes a quantum system at a given time not only by the standard, forward evolving wave function, but also by a backward evolving quantum state. This changes the picture of a branching tree of worlds. Instead of a tree that starts from a single root state and splits at every quantum measurement, future measurements split the worlds retroactively. Ideal quantum measurements yield identical forward and backward evolving quantum states. For macroscopic objects, splitting happens at quantum measurement; quantum objects, however, are also described by backward evolving quantum states, defined by future measurements, which split the worlds retroactively.

Generalizing Everett's quantum mechanics for quantum cosmology

James B. Hartle

Everett took seriously the idea that quantum mechanics could apply to the universe. His ideas have since been extended to create the modern synthesis called decoherent histories or consistent histories. This is a quantum framework adequate for cosmology when gross quantum fluctuations in the geometry of spacetime can be neglected. A further generalization is needed to incorporate quantum gravity. I review a generalized quantum mechanics of cosmological spacetime geometry.

Some remarkable implications of probabilities without time

Andreas Albrecht

I consider the ambiguity in quantum gravity that arises from the choice of clock. This ambiguity leads to an absolute lack of predictability, a complete absence of physical laws. I consider an approach that could lead to a certain amount of predictability in physics.

Explaining probability

Simon Saunders

In the Everett interpretation of quantum mechanics, physical probability is identified with categorical physical properties and relations. I focus on the place of uncertainty in Everettian quantum mechanics (EQM), the semantics of uncertainty, and the explanation of the epistemology of probability.

Pilot-wave theory: Everett in denial?

Antony Valentini

I reply to claims that the pilot-wave theory of de Broglie and Bohm is really a many-worlds theory with a superfluous configuration appended to one of the worlds. I show that from the perspective of pilot-wave theory, many worlds are an illusion.
 

Press reactions

Edited by Andy Ross

Parallel universes make quantum sense

New Scientist, September 21, 2007

The days when physicists could ignore the concept of parallel universes may have come to an end. David Deutsch at the University of Oxford and colleagues have shown that key equations of quantum mechanics arise from the mathematics of parallel universes. "This work will go down as one of the most important developments in the history of science," says Andy Albrecht, a physicist at the University of California at Davis.
 

Parallel universes really do exist, say theorists

By Roger Highfield
Daily Telegraph, September 21, 2007

Quantum mechanics describes the strange things that happen in the subatomic world. By one interpretation, nothing at the subatomic scale can really be said to exist until it is observed. Until then, particles occupy nebulous "superposition" states, in which they can have simultaneous "up" and "down" spins, or appear to be in different places at the same time.

According to quantum mechanics, unobserved particles are described by "wave functions" representing a set of multiple "probable" states. When an observer makes a measurement, the particle then settles down into one of these multiple options. In the traditional brand of quantum mechanics, the wave function "collapses" to give a single real outcome.

In 1957, Hugh Everett III presented a more audacious interpretation. In his view, the universe is constantly and infinitely splitting, so that no collapse takes place. Every time there is an event at the quantum level, the universe "splits" into different universes. Every possible outcome of an experimental measurement occurs, each one in a parallel universe. In Everett's interpretation, our universe is embedded in an infinitely larger and more complex structure called the multiverse.

David Deutsch of Oxford University showed mathematically that the bush-like branching structure created by the universe splitting into parallel versions of itself can explain the probabilistic nature of quantum outcomes. This work was attacked, but it has now been rigorously confirmed by David Wallace and Simon Saunders, also at Oxford.

Deutsch added that the work addresses the idea of probability itself. "The problems of probability, which were until recently considered the principal objection to the otherwise extremely elegant theory of Everett (which removes every element of mysticism and double-talk that have crept into quantum theory over the decades) have now turned into its principal selling point."
 

Parallel universes exist — study

Breitbart, September 23, 2007

Parallel universes really do exist, according to a mathematical discovery by Oxford scientists. The parallel universe theory, first proposed in 1957 by the US physicist Hugh Everett, helps explain mysteries of quantum mechanics that have baffled scientists for decades, it is claimed. In Everett's "many worlds" universe, every time a new physical possibility is explored, the universe splits. Given a number of possible alternative outcomes, each one is played out in its own universe.

The new research from Oxford shows that it offers a mathematical answer to quantum conundrums that cannot be dismissed lightly, and suggests that Dr Everett, who was a PhD student at Princeton University when he came up with the theory, was on the right track. The Oxford team, led by Dr David Deutsch, showed mathematically that the bush-like branching structure created by the universe splitting into parallel versions of itself can explain the probabilistic nature of quantum outcomes.
 

Technical Details

The Everett Interpretation: 50 years on

Simon Saunders
June 19, 2007
PDF: 12 pages, 107 KB

Edited by Andy Ross (with apologies to Simon)

The problem of measurement

The problem of measurement is the problem of reconciling two kinds of dynamical evolution postulated in quantum mechanics. The first kind is deterministic and incorporates space-time symmetries: it is the unitary dynamics. The second is indeterministic, apparently unrelated to any space-time symmetry, and without any dynamical structure: it is the quantum jump, or collapse of the wave-function onto one of the large number of wave-functions previously superposed, which occurs when a measurement is performed.

The projection postulate is that on collapse, at least in the case of an experiment where the quantity measured can be measured again on the same system, a new quantum-mechanical state is introduced. The dynamical variable that the experiment is designed to measure is assigned a value in this way.

Physicists have historically tried to see these measurement postulates as reflecting some sort of philosophical limitation on physical theorizing, or on the expression of physical laws. If so, the measurement postulates need not signify anything wrong with quantum mechanics.

We suppose that the requirements on a satisfactory solution of the problem of measurement can be put as four creeds:

(1) The problem of measurement should be solved by clear and simple reasoning that can at least schematically be stated in non-relativistic quantum mechanics and can at least schematically be applied to the universe as a whole.

(2) The solution should be applicable to relativistic quantum theory as well and specifically to the standard model.

(3) There should be no special status in the interpretation for the observer, experiment, sub-system, or environment, unless questions of evidence or beliefs are explicitly invoked. Otherwise, such entities should be modeled as physical systems or subsystems or physical processes, just like any other.

(4) It is in principle legitimate to view the wave-function as physically real and as applicable to the universe as a whole.

Everettian quantum mechanics (EQM) as it was originally formulated met (2), (3) and (4), and went some way to meeting (1). But the quantum theory that emerges, purged of the measurement postulates, is fantastical. It avoids the measurement problem only insofar as it describes all physically possible outcomes of a measurement as physically real.

Everett's relative states

Everett showed that at the macroscopic level the development of a single component of the wave-function into a superposition will in a certain sense be invisible. For suppose we have a unitary dynamical evolution taking the total system from an initial state Ψ0 to a final state Ψt. Suppose that the spin system in the initial state |↑> couples to the apparatus so as to yield the outcome spin-up with certainty, and likewise when the initial state is |↓> the outcome is spin-down with certainty. Then the dynamics is:

Ψ0 = | ready > × |↑> → Ψt = | spin-up > × |↑>

Ψ0 = | ready > × |↓> → Ψt = | spin-down > × |↓>

In either case, no measurement postulate is needed. The outcome can be predicted with certainty merely from the unitary dynamics.

If | ready >, | spin-up > and | spin-down > denote the wave function not just of the apparatus but of the environment as well, then these states describe ordinary macroscopic states of affairs.

Now consider the result if the spin system is initially prepared in a superposition of those two states, say c |↑> + d |↓>. This is supposed to yield trouble. But if we consider the final state as dictated by the same unitary evolution —

Ψ0 = | ready > × (c |↑> + d |↓>) → Ψt = c | spin-up > × |↑> + d | spin-down > × |↓>    (S)

— then either of the states | spin-up > and | spin-down > likewise describes an ordinary state of affairs, in each of which a definite outcome is recorded, just as before. In a series of repetitions of the experiment, with the recording instrument storing the outcomes one by one, the superposition is again a superposition of states each of which describes an ordinary state of affairs, a sequence of outcomes, a definite record of statistics. The superposition itself cannot be encoded in a record in any branch in this way. It is in this sense invisible.
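To make the linearity at work here concrete, the following is a minimal numerical sketch (my own illustration, in Python with numpy; the pointer-state encoding, the helper name premeasure, and the amplitudes c = 0.6, d = 0.8 are choices of mine, not from the text). The same coupling that takes | ready > × |↑> to | spin-up > × |↑> takes the superposed input to the superposition (S), and the Born weights |c|² and |d|² can be read off from its components.

import numpy as np

# Pointer states | ready >, | spin-up >, | spin-down > as three orthonormal vectors;
# spin states |↑>, |↓> as two. (A toy encoding assumed for illustration.)
ready, up_ptr, down_ptr = np.eye(3)
up, down = np.eye(2)

def premeasure(spin_state):
    # Linear extension of | ready > × |↑> -> | spin-up > × |↑> and
    # | ready > × |↓> -> | spin-down > × |↓> to an arbitrary spin state.
    c, d = spin_state
    return c * np.kron(up_ptr, up) + d * np.kron(down_ptr, down)

c, d = 0.6, 0.8                                     # arbitrary amplitudes, |c|² + |d|² = 1
psi_0 = np.kron(ready, np.array([c, d]))            # initial state | ready > × (c|↑> + d|↓>)
psi_t = premeasure(np.array([c, d]))                # the final superposition (S)

w_up = abs(np.vdot(np.kron(up_ptr, up), psi_t)) ** 2        # |c|², i.e. 0.36
w_down = abs(np.vdot(np.kron(down_ptr, down), psi_t)) ** 2   # |d|², i.e. 0.64

No measurement postulate enters: the definite-outcome states appear as the two components of the final superposition, carrying the usual weights.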

Everett said that relative to the state | spin-up > there is a relative state |↑> of the spin system. This provided a way of presenting the basic ideas, without talking explicitly of many worlds. Everett called it the relative-state formulation of quantum mechanics.

Everett had little more to say than this; his contribution was in a way rather minimal. He pointed out that branching would be invisible so long as everything was branching together.

But there is a certain difficulty. What singles out the states describing the macroscopic, and why are those states the right ones with respect to which to define relative states? What is the natural or preferred basis with respect to which the universe is in a superposition? This is the preferred basis problem.

The interpretation faced another problem. It was intended to make sense of the unitary, covariant, and deterministic dynamics. How is this to be reconciled with the probabilistic interpretation of the theory? As conventionally formulated, probabilities only enter quantum mechanics with the measurement postulates. Thus it is only the measurement postulates that tell you a superposition like (S) means that one of the states | spin-up > × |↑> or | spin-down > × |↓> results, with probabilities |c|² and |d|² respectively. If the superposition actually remains, in what sense does either state occur with some probability?

Decoherence theory

The basis to be used in defining the branching structure is only effective; it should not matter to the macroscopic description if it is tweaked this way or that.

Branching is a real dynamical structure in the universal state: it is decoherence. A basis adapted to this dynamical structure is the one that makes these patterns clear, but it is defined only for all practical purposes (FAPP).
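How decoherence picks out such a basis can be illustrated with a toy model (a sketch of my own, not from the talk; the helper name reduced_state and the coupling angle theta are arbitrary choices). A system entangles branch-by-branch with n environment qubits, and the interference terms between the two branches are suppressed by the overlap of the two environment states, which falls off exponentially in n.

import numpy as np

def reduced_state(c, d, n_env, theta=0.3):
    # Toy universal state: Psi = c |↑>|E_up> + d |↓>|E_down>, where |E_up> and
    # |E_down> are products of n_env environment qubits in states e0 and e1.
    e0 = np.array([1.0, 0.0])
    e1 = np.array([np.cos(theta), np.sin(theta)])
    overlap = np.vdot(e1, e0) ** n_env              # <E_down|E_up>, shrinks with n_env
    # Reduced density matrix of the system, rho = Tr_env |Psi><Psi|, in the {↑, ↓} basis.
    return np.array([[abs(c) ** 2, c * np.conj(d) * overlap],
                     [np.conj(c) * d * np.conj(overlap), abs(d) ** 2]])

for n in (0, 10, 100):
    rho = reduced_state(0.6, 0.8, n)
    print(n, abs(rho[0, 1]))     # interference term: about 0.48, then 0.30, then 0.005

In the decohered (pointer) basis the reduced state is very nearly diagonal, which is why records in one branch are insensitive to the others; the branching structure is defined only up to such FAPP-small interference terms.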

The philosophy is an obvious one, if classical worlds are higher-order ontology, structures in the universal state. They arise through a coarse-graining of an underlying physics that does not have to be known exactly.

Classicality defined only for all practical purposes was taken to be symptomatic of a failure of realism, but from an Everettian point of view that is simply a mistake. The fundamental theory itself must be defined precisely, but the classical is an empirical consequence of the theory. And in extracting the empirical consequences of a physical theory, everyone is agreed that approximations can and should play a fundamental role.

The underlying philosophy was that superpositions of states describing different macroscopic properties were somehow forbidden.

Another method for defining decoherence was the consistent histories theory. The mathematical tool is the histories formalism itself (a history space), and the criterion of consistency (or decoherence). The quasiclassical history space yields the structure of the universal state that we have so far been concerned with: the system of branching and approximately classical worlds.
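The consistency criterion itself can be stated and checked in a toy case. The sketch below is my own illustration: the decoherence functional D(α, β) = Tr[C_α ρ C_β†], with chain operators C_α built from projections, is the standard textbook form and is not spelled out in the text above. A two-time history space for a single qubit is consistent when nothing happens between the two projections, but fails the criterion when a Hadamard rotation intervenes.

import numpy as np

# Projectors onto the computational basis of a single qubit.
P = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]

rho = np.array([[0.5, 0.5],
                [0.5, 0.5]])                       # initial state |+><+|

H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2.0)         # Hadamard: a nontrivial evolution

def chain_op(alpha, U):
    # C_alpha = P_{a2} U P_{a1}: project at t1, evolve by U, project at t2.
    a1, a2 = alpha
    return P[a2] @ U @ P[a1]

def D(alpha, beta, U):
    # Decoherence functional D(alpha, beta) = Tr[ C_alpha rho C_beta^dagger ].
    return np.trace(chain_op(alpha, U) @ rho @ chain_op(beta, U).conj().T)

histories = [(a1, a2) for a1 in (0, 1) for a2 in (0, 1)]
for U, name in [(np.eye(2), "no evolution"), (H, "Hadamard evolution")]:
    off = max(abs(D(a, b, U)) for a in histories for b in histories if a != b)
    print(name, "largest off-diagonal term:", round(off, 3))    # 0.0, then 0.25

Only when the off-diagonal terms (approximately) vanish do the diagonal terms D(α, α) behave as probabilities for the histories; this is the sense in which the quasiclassical history space is selected.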

Why not suppose only one of these histories is real? If there is only one world, the universal state has only the meaning of a probability measure on the history space, when what exists is the single history. Why not try to describe it precisely? For a start the consistency condition had better be precisely satisfied. One is a long way from the perspective of classicality as an effective theory.

It is different if the ultimate reality is the universal state. In that case a history space concerns only an effective level of description of the structure of the state, better or worse suited to extracting useful phenomenological equations. The structure itself is emergent, imprecise at its boundaries and in its minutiae, like galaxies and planetary systems. Worlds in the Everett interpretation are really like worlds: planetary systems that are tightly bound together, but only weakly coupled to other worlds, and systems without precise borders or edges.

Probability

Since branching is only effective, so too is quantum probability. Probabilistic events, according to the Everett interpretation, occur when branching occurs, that is, when an element of the decoherence basis unitarily evolves into a superposition of such elements.

An objection to the Everett interpretation was that if branching really occurs then there is a natural alternative to the Born rule as a measure over branches: the measure for which all branches are equiprobable. But if branching only occurs on decoherence, then there is no such measure that applies to branches at the level at which they themselves are defined. There are fat branches and thin ones, as given by the Born rule; there is no number of branches which are fat, no number which are thin (see the sketch following the list below). This bears on three crucial questions concerning probability. They are:

(i) What of branches with records of anomalous statistics?

(ii) Is there any place for epistemic uncertainty in the face of branching?

(iii) How, if at all, is the Born rule to be justified?
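The point that there is no number of fat branches and no number of thin ones can be put in a toy form (a sketch of my own, continuing the spin example with weights 0.36 and 0.64; it is not from the talk). Refining a coarse branching into finer sub-branches changes the branch count arbitrarily, while the total Born weight carried by each macroscopic outcome is unchanged, so branch-counting equiprobability picks out no well-defined measure at the level at which branches are defined.

# Branches modeled as (record, weight) pairs; refinement splits each branch into
# sub-branches whose weights sum to the original weight.
coarse = [("spin-up", 0.36), ("spin-down", 0.64)]

def refine(branches, n):
    # Split every branch into n equal-weight sub-branches, standing in for an
    # equally legitimate finer-grained decoherent description.
    return [(record, w / n) for (record, w) in branches for _ in range(n)]

fine = refine(coarse, 5)

count_up_coarse = sum(1 for record, _ in coarse if record == "spin-up")    # 1
count_up_fine = sum(1 for record, _ in fine if record == "spin-up")        # 5
weight_up_coarse = sum(w for record, w in coarse if record == "spin-up")   # 0.36
weight_up_fine = sum(w for record, w in fine if record == "spin-up")       # 0.36 (up to rounding)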

Deutsch derived the Born rule from certain symmetry arguments and appeal to certain axioms of decision theory.

The general idea is this. Rational agents express likelihood relations among quantum experiments M, N, ..., whose outcomes are sets of events E, F, G, ... that yield dividends whose utility is selected by the agent at will. Let E|M ≥ F|N mean that, in the agent's expectation, it is at least as likely that E will happen given M as that F will happen given N. For an experiment M, let EM be the set of all possible outcomes, and let Ø be the empty set. Then an ordering of likelihoods is represented by a credence function Pr if:

Pr(Ø|M) = 0 and Pr(EM|M) = 1

If E and F are disjoint then Pr(E ∪ F|M) = Pr(E|M) + Pr(F|M)

Pr(E|M) ≥ Pr(F|N) iff E|M ≥ F|N
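As a sanity check, the first two of these conditions hold automatically if Pr(E|M) is taken to be the total Born weight of the outcomes in E; the third is the substantive requirement that this Pr also reproduces the agent's ordering. A minimal sketch of the check (my own toy example; the outcome labels, weights, and helper names are arbitrary):

from itertools import combinations

weights = {"a": 0.25, "b": 0.25, "c": 0.5}        # Born weights of one experiment M
E_M = frozenset(weights)                          # the set of all outcomes of M

def Pr(E):
    # Candidate credence: the total weight of the event E.
    return sum(weights[x] for x in E)

def events():
    # All subsets of E_M.
    xs = list(E_M)
    return [frozenset(c) for r in range(len(xs) + 1) for c in combinations(xs, r)]

assert Pr(frozenset()) == 0 and abs(Pr(E_M) - 1) < 1e-12        # Pr(Ø|M) = 0, Pr(EM|M) = 1
for E in events():
    for F in events():
        if not (E & F):                                          # disjoint events
            assert abs(Pr(E | F) - (Pr(E) + Pr(F))) < 1e-12      # additivity

Whether this Pr also represents a given agent's likelihood ordering is what the following principles and the representation theorem address.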

We suppose agents are rational insofar as they subscribe to the principles:

Transitivity. If E|M ≥ F|N and F|N ≥ G|O, then E|M ≥ G|O. Transitivity requires that likelihoods are comparable.

Separation. There exists some E and M such that E|M is not null. Separation requires that some event is possible.

Dominance. If E is a subset of F, then F|M ≥ E|M for any M, with F|M ~ E|M iff F – E is null, where an event E is null given M if E|M ~ Ø|M. Dominance requires that the likelihood of a set of events is greater than that of any proper subset, or at least the same in the case that the omitted events are impossible.

A further principle introduces the usual Born probability or weight WM(F) for outcome F on performance of experiment M:

Equivalence. F|M ~ E|N if and only if WM(F) = WN(E). Equivalence is the principle that outcomes of equal weight have equal credence.

We are interested in situations where there are enough experiments available so that decision theory can bite. We define a set of quantum experiments to be rich provided that, for any positive real numbers w1, ..., wn such that w1 + ... + wn = 1, the set includes a quantum experiment with n outcomes having weights w1, ..., wn.
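Richness is physically undemanding: any weight vector w1, ..., wn summing to 1 can be realized by measuring, in an orthonormal basis, a state whose amplitudes are √w1, ..., √wn. A minimal sketch (my own, with an arbitrary helper name and example weights):

import numpy as np

def rich_experiment(ws):
    # A state with amplitudes sqrt(w_i) in the measured orthonormal basis {|i>}
    # has Born weights exactly w_1, ..., w_n.
    ws = np.asarray(ws, dtype=float)
    assert np.isclose(ws.sum(), 1.0) and (ws >= 0).all()
    psi = np.sqrt(ws)
    basis = np.eye(len(ws))
    return [float(abs(np.vdot(basis[i], psi)) ** 2) for i in range(len(ws))]

print(rich_experiment([0.1, 0.2, 0.3, 0.4]))      # approximately [0.1, 0.2, 0.3, 0.4]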

We can now state the representation theorem, due to Deutsch and Wallace. If the likelihood orderings of a rational agent over a rich set of quantum experiments satisfy Equivalence, then they are uniquely representable by a credence function Pr with Pr(E|M) = WM(E).

If Equivalence can be viewed as a principle of rationality, the Everett interpretation is in good shape. The notion of objective probability has long troubled empiricists. Credence, or subjective probability, is in contrast perfectly clear. But just why credence should track chance can hardly be explained until we know what chance is; their relation may have the irreducible status of a brute posit. In EQM it is enough if equal chances have equal credences.

There is another aspect to the assessment of EQM as a probabilistic theory. It may be that one who believes EQM is true will match her credences to quantum mechanical weights, but how is one to update a prior probability measure (credence) over two or more competing theories in the face of the observed relative frequencies? Must one already believe that EQM is true in order to deduce from the observed statistics that quantum mechanics is better confirmed than some rival?

The Everett interpretation undermines so many common beliefs that it threatens the very basis on which evidential claims for quantum mechanics are evaluated. This question returns us to (ii) in our list above: whether there is any place for uncertainty in EQM. If the answer to (ii) is no, it might be argued that rational agents can have no notion of a likelihood relation either: if nothing is uncertain, how can any event be more likely than another?

First, uncertainty is not needed for the representation theorem, which can just use relative weight. Nor is it needed for a (Bayesian) confirmation theory. Second, what is at stake is what our ordinary words actually mean, in a way that is dictated by use. If we go over to EQM, the new theory may or may not be consistent with what we were previously inclined to say. That plunges us back into philosophy.

Granted that the Everett interpretation is a literalist construal of the unitary dynamical evolution, it would be astonishing if a different realist interpretation of the theory were possible. If these claims are true, our best physical theory is telling us that we live in a branching universe, and that the measurement problem is solved.

 

AR  (2012) All this is still deeply interesting work.