**By Chris Clarke**

**Overview**

Quantum Theory (often called Quantum Mechanics or Quantum Physics – the terms are used differently by different authors) extends physics to cover the behaviour of microscopic objects. Physics as it was before Quantum Theory is called Classical Physics. In some versions Quantum Theory includes Classical Physics as a special case. From the start the theory was subject to controversy and developed into a wealth of different forms, mostly agreeing at the level of practical calculation but disagreeing wildly as to interpretation. The question “what is quantum theory?” is therefore a difficult one.

Both Classical and Quantum Physics describe how the observable properties of a system change with time. The “system” (which here means “thing”) can be anything from an atom to the universe; its properties are quantities like position, momentum, energy, the internal arrangements of its parts and so on.

In Classical Physics there is, for any given system, a set of properties (namely the positions and velocities of all its parts) which completely determines its time-development and hence its properties at any later time. In Quantum Physics there is no such complete set. Instead, at any given time there are many different possible sets of properties, any one of which can be observed; but it is not possible to observe all the properties simultaneously. For instance, position and velocity cannot be observed simultaneously: the first gives a particle-picture, the second a wave-picture. The existence of different possible sets of properties is called complementarity.

The properties at a later time cannot (except in special circumstances) be determined by observing properties at an earlier time; only their probabilities are fixed by the earlier observation. This indeterminism is the basis of the continual openness of the universe to new possibilities. When combined with complementarity it may provide the notion of free creativity in the universe (see *Quantum logic* below).

The term *observed* means different things in different versions: e.g. “manifested,” “recorded by a macroscopic instrument,” “brought to (human?) consciousness” and so on. The last possibility links quantum theory with theories of mind. At any given time there is a well defined specification of the probability of observing any given property. This collection of probabilities is fixed by (or in some versions is identical with) the quantum state, but this state is not itself observable. Interpretations differ as to whether the state is real or a mathematical abstraction, with profound consequences for the whole notion of reality in physics.

The earliest interpretations, dating from workers in Copenhagen, used a two-tier world: a small system obeying non-Classical Physics and an observing laboratory obeying Classical Physics. The many pre-1965 theories tend to call themselves “The Copenhagen Interpretation.” Later interpretations tried to achieve a more unified view. This historical development introduced a succession of alternative structures: the collapse of the state, many worlds, environmental diffusion and so on. Although the early version lives on as a practical rule for the working physicist, for those researching into the foundations of the subject these early versions have been superseded by an interpretation called the *histories interpretation*, which makes far fewer metaphysical assumptions.

Systems with infinitely many degrees of freedom (in particular, fields such as the electromagnetic field) are described by quantum field theory, whose states can all be constructed out of a special state of the field in question, called the vacuum for that field. Thus the vacuum is not some special new entity that brings into being electrons, protons and so on. Rather the electron vacuum is a special state of the electron quantum field, the proton vacuum a special state of the proton quantum field, and so on. One can combine these together to produce a single vacuum, which is a special state of the combined particle fields. The vacuum has zero energy (except in Dirac’s theory, which enjoyed brief popularity).

**Some basic concepts**

Bearing in mind that many of these concepts belong to earlier versions of the theory that are no longer regarded as essential, some of the central ideas are as follows.

Quantum mechanics is usually regarded as a more general type of mechanics than traditional classical physics. The different system of mechanics is required (at least) when a typical action in a system (the product of a typical energy and a typical time) becomes comparable to a fundamental constant of nature called Planck’s constant, which has a very small value when measured in the usual laboratory-scale units. Traditional (Copenhagen) Quantum mechanics thinks of processes as having three phases: *Preparation*, *Evolution* and *Observation*.
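
As a rough numerical illustration (the figures below are order-of-magnitude estimates supplied here, not taken from the text), one can compare typical actions with Planck’s constant:

```python
# Order-of-magnitude comparison of typical actions with Planck's constant.
# All figures are rough illustrative estimates.

h = 6.626e-34  # Planck's constant in joule-seconds

# An electron in an atom: an energy of a few electron-volts over a
# characteristic time of about a femtosecond.
atomic_action = (2 * 1.602e-19) * 1e-15   # roughly 3e-34 J s

# A laboratory pendulum: a tenth of a joule over a one-second swing.
pendulum_action = 0.1 * 1.0               # 0.1 J s

print(atomic_action / h)    # of order 1: comparable to h, quantum mechanics needed
print(pendulum_action / h)  # of order 1e32: classical physics suffices
```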

In *Preparation* a microscopic system (such as an electron) is prepared (for instance, by being emitted from a heated wire and then projected into a vacuum by appropriate electric fields). *Evolution* then takes place while the system is left undisturbed – for instance, the electron may pass into an apparatus where a numerical result will be measured. Evolution is a deterministic process, governed by a precise equation called Schrödinger’s equation. Finally, *Observation* is the actual measurement of the numerical result, which is *indeterministic*. If the system is not destroyed by the measurement (destruction of a photon, for instance, arising when it is absorbed in a photographic emulsion), then after the measurement the system can go to a further apparatus, with the first measurement serving as a preparation for the second.
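
The three phases can be sketched numerically for the simplest possible system, a two-state system (a qubit); the particular state and evolution chosen below are illustrative assumptions, not taken from any specific experiment:

```python
import numpy as np

rng = np.random.default_rng(0)

# Preparation: the system starts in a definite state vector.
state = np.array([1.0, 0.0], dtype=complex)

# Evolution: deterministic, given by a unitary matrix (the finite-time
# solution of Schrodinger's equation for some chosen Hamiltonian).
theta = np.pi / 3
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
state = U @ state

# Observation: indeterministic.  The Born rule fixes only the
# probabilities of the two possible outcomes.
probs = np.abs(state) ** 2          # here [0.25, 0.75]
outcome = rng.choice([0, 1], p=probs)

print(probs)    # the probabilities are determined by the state...
print(outcome)  # ...but each individual outcome is random
```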

By taking a wider view of what is meant by a system in quantum mechanics, one can regard every measurement (or observation) as a preparation of some new system, so that the distinction between preparation and observation collapses, and we have a single event that we can call *manifestation*. The basic concept of a process is then a sequence of manifestations, with evolution taking place between each one. Such a sequence is called a *history*. The deterministic evolution and the indeterministic observation are combined into a single principle that prescribes the probabilities for all the possible histories that might occur.

**Quantum logic**

At an even more basic level, quantum theory is just a very general way of talking about processes. At this level, before one introduces the particular laws of particular processes, the theory is called Quantum logic. A “logic” in this sense is a mathematical structure consisting, at least, of a collection of entities called propositions and a number of operations which combine propositions so as to produce new ones, including the operations “and”, “or” and “not”. Propositions correspond to quantum measurements that have a yes/no answer, such as “is the electron inside this box?” The rules of the operations on propositions are the same as for conventional logic (known as Boolean Logic) except that the distributive law “{A and (B or C)} is equivalent to {(A and B) or (A and C)}” is replaced by a rather technical weaker condition called orthomodularity. This makes it possible for a quantum logic to contain a number of different Boolean Logics which are incompatible with each other. These can be interpreted as different “frames of reference”, such as the wave representation or the particle representation of quantum states. Quantum logic makes sense of “Both/and” thinking: an electron is both a wave and a particle – but not within the same frame of reference.
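
The failure of the distributive law can be made concrete by representing propositions as projections onto subspaces, with “and” as intersection of subspaces and “or” as their span. The construction below is a standard illustration (supplied here, not from the text): A is one polarization direction, B and C the two diagonal directions.

```python
import numpy as np

I = np.eye(2)

def span_proj(M, tol=1e-10):
    """Orthogonal projector onto the column space of M."""
    U, s, _ = np.linalg.svd(M)
    U = U[:, s > tol]
    return U @ U.T

def join(P, Q):          # quantum "or": span of the two subspaces
    return span_proj(np.hstack([P, Q]))

def meet(P, Q):          # quantum "and": intersection of the two subspaces
    return I - join(I - P, I - Q)

def line(angle):
    """Projector onto the line at the given angle."""
    v = np.array([[np.cos(angle)], [np.sin(angle)]])
    return v @ v.T

A = line(0.0)            # horizontal polarization
B = line(np.pi / 4)      # the two diagonal polarizations
C = line(-np.pi / 4)

lhs = meet(A, join(B, C))            # A and (B or C)
rhs = join(meet(A, B), meet(A, C))   # (A and B) or (A and C)

print(np.round(lhs, 3))  # equals A, since B or C spans the whole space
print(np.round(rhs, 3))  # the zero matrix: A meets B and C only at the origin
```

The two sides differ, so distributivity fails for these three propositions even though each pairwise combination is perfectly well defined.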

The way we actually think, particularly in our more creative moments, is better described by quantum logic than by classical logic, because we can creatively move from one frame of reference to another, devising new ways of thinking about things. Quantum logic represents a creative approach to a creative universe, while classical logic represents a rigid approach to a deterministic universe.

**Connectivity and the Aspect experiment**

Perhaps the most important aspect of quantum theory is that quantum states are usually not states of single particles, but states of systems of many particles – states that cannot be reduced to statements about the individual particles. When this happens the particles are said to be entangled. The practical consequence is that the unpredictable responses of the various particles are linked to each other, even if the particles are widely separated in space, giving a fundamental connectivity to the universe.

An experiment performed by Alain Aspect verified a particular case of this, discussed in the early days of the theory by Einstein, Podolsky and Rosen. It demonstrates a kind of distant connectivity between widely separated photons (particles of light). The two photons are created by passing a single photon into a specially synthesized crystal that splits it into two daughter photons, each with half the energy of the first. The two daughter photons are then allowed to travel apart (more recently the separation has been taken to 40 km) and each is simultaneously passed through a prism which measures its polarization in a particular direction. The two photons respond at random to the measurements, and statistics are collected which show that there are correlations between the responses of the two photons. The key to the analysis is an argument, due to John Bell, which demonstrates that the particular nature of the correlations, and the way they vary with the angle of the prisms measuring the polarization, implies that it is impossible for the correlations to arise from any individual properties of the two photons, whatever these properties might be: the photons have to react in a coordinated, connected way. This crucial experiment indicates that the universe is connected at a much deeper level than previously thought.

Figure 1

In Figure 1, from one of Aspect’s papers, *S* is the source of the two photons, which move in opposite directions until they are a distance *L* apart, when they enter the measurement part of the experiment. Here two high-speed light “switches”, *C1* and *C2*, direct each photon to one of two possible polarization-measuring devices (here represented as a polaroid filter, rather than a prism, followed by a photomultiplier tube *PM* which detects the photons). The “switches” are activated after the photons set off, so that there is no possibility of the photons being influenced as they leave by the nature of the filters that they will encounter. The filters are pre-set at four different angles, and statistics of the number of “coincidences” – events where two of the photomultipliers fire simultaneously – are collected for a variety of combinations of angles.

The physicist John Bell introduced an argument that demonstrated that the pattern of coincidences that was observed could not possibly be produced if each photon responded independently: they had to be connected in some way. Bell’s argument is quite mathematical, but there is a simpler version due to Mermin which can be understood with a little effort. In this version the directions of the prisms/filters (determining the direction of the polarization that a photon has if it passes straight through the prism) are restricted to just three possibilities, symmetrically arranged. (The effect of the prism is unchanged by a 180-degree rotation, so the three distinct orientations of the prisms are defined by three positions of a line drawn through the prism, as shown in Figure 2 below.)

Figure 2: *Positions of either prism*  Figure 3: *Relative angles from A to B*

If we denote the two prisms by A and B, then the combined effect of the prisms depends only on the relative angle from the line representing A to that representing B, which is either 0, 60 or 120 degrees (see Figure 3).

The probability of what B does depends on what A does and on the relative angle between the prisms. The statistics that are observed are as follows:

**Probability of B transmitting when A is transmitting**

| Relative angle | Yes | No |
| --- | --- | --- |
| 0 | 1 | 0 |
| 60 | 1/4 | 3/4 |
| 120 | 1/4 | 3/4 |

**Probability of B transmitting when A is not transmitting**

| Relative angle | Yes | No |
| --- | --- | --- |
| 0 | 0 | 1 |
| 60 | 3/4 | 1/4 |
| 120 | 3/4 | 1/4 |
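
The entries in these tables follow from the standard quantum rule for this kind of entangled photon pair – stated here as an assumption rather than derived – that the probability of B’s result matching A’s is the squared cosine of the relative angle:

```python
import math

# Quantum rule (assumed, not derived here): the probability that the two
# polarization measurements agree is cos^2 of the relative angle.
p_agree = {a: math.cos(math.radians(a)) ** 2 for a in (0, 60, 120)}

for angle, p in p_agree.items():
    print(angle, round(p, 4))   # 0 -> 1.0, 60 -> 0.25, 120 -> 0.25
```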

The fact that the two tables are different shows that there is a correlation between whether the photons go through A and whether they go through B. But this is not necessarily evidence of a *connection* between the two: it could be that they have a coordinated “plan” of response laid down at the time they are emitted from the common source, with the response depending on the possible angles of the prisms encountered by the photons. The plan can be randomly chosen in advance, but there can be no variation from it once the photons have set off, otherwise one would not get guaranteed agreement in the case where the prisms are in line.

The key step in the argument now follows. There is a result called Bell’s theorem showing that no such plan, in which each photon responds *separately* once it has left, can work. I am going to prove the special case of this result as it applies to the set-up here. Because of this result we must assume that the photons are responding *in a coordinated way* to the conditions of the prisms.

The proof is straightforward but does require a little care.

Consider one example of a possible plan. In any plan A and B have to respond in the same way to each angle of a prism, so as to get agreement when the relative angle is zero (alignment). A possible plan of response might be:

| Angle (absolute) | 0 | 60 | 120 |
| --- | --- | --- | --- |
| Response | Yes | No | No |

Can we show that every plan has some feature that is not reflected in what is observed? Such a feature can be found by looking at the pattern of agreements (Ag) and disagreements (Dis) under this plan:

| Angle of A \ Angle of B | 0 | 60 | 120 |
| --- | --- | --- | --- |
| 0 | Ag | Dis | Dis |
| 60 | Dis | Ag | Ag |
| 120 | Dis | Ag | Ag |

The ratio of agreements to disagreements under this plan is 5:4.

*It turns out that every plan gives more agreements than disagreements*: a plan either gives the same response at all three angles, in which case A and B agree every time, or it gives one response at one angle and the opposite response at the other two, which yields the 5:4 ratio above. How does this compare with observation? The previous tables show that the proportions are actually:

| Relative angle | Agree | Disagree |
| --- | --- | --- |
| 0 | 1 | 0 |
| 60 | 1/4 | 3/4 |
| 120 | 1/4 | 3/4 |

The average of the proportions in the “Agree” column is 1/2, as is that in the “Disagree” column. So if we carry out a series of trials where the angles are set at random each time, then on average we will get equal proportions of agreements and disagreements – whereas any pre-arranged plan gives more agreements than disagreements. This contradicts the possibility that the responses are happening according to a plan.

This proves (the special case of) Bell’s theorem.
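
Since there are only three angles and two possible responses, there are just 2³ = 8 plans, so the claim that every plan gives more agreements than disagreements can be checked exhaustively (a sketch in Python):

```python
from itertools import product

angles = (0, 60, 120)

# Enumerate all 8 plans: a plan fixes a yes/no response at each angle,
# the same for both photons.
best = 1.0
for plan in product([True, False], repeat=3):
    response = dict(zip(angles, plan))
    # Fraction of the 9 equally likely angle pairs on which A and B agree.
    agree = sum(response[a] == response[b] for a in angles for b in angles) / 9
    best = min(best, agree)
    print(plan, agree)

print(best)  # 5/9: every plan agrees at least 5/9 of the time

# The observed agreement rate: the "Agree" column averaged over the
# three equally likely relative angles.
observed = (1 + 1/4 + 1/4) / 3
print(observed)  # 0.5, strictly below 5/9, so no plan can reproduce it
```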

We can think of each photon as responding to a *context *that is non-local: the context includes the other photon and the other prism. Randomness is thus context-bound.

What we see is that, in the area that is most mechanistic – particle physics – we get an affirmation of what we suspect from our everyday experience anyway: that the universe is fundamentally connected across space, that context is non-local; and that randomness is bound up with context in the way that we recognise from the concept of freedom.