This series consists of talks in the area of Foundations of Quantum Theory. Seminar and group meetings will alternate.
Weak values were introduced by Aharonov, Albert, and Vaidman 25 years ago, but it is only in the last 10 years that they have begun to enter into mainstream physics. I will introduce weak values as done by AAV, but then give them a modern definition in terms of generalized measurements. I will discuss their properties and their uses in experiment. Finally I will talk about what they have to contribute to quantum foundations.
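For orientation, the AAV weak value of an observable $\hat{A}$, for a system preselected in $|\psi\rangle$ and postselected in $|\phi\rangle$, is
$$ A_w = \frac{\langle\phi|\hat{A}|\psi\rangle}{\langle\phi|\psi\rangle}, $$
which is in general complex and can lie far outside the spectrum of $\hat{A}$ when the pre- and postselected states are nearly orthogonal.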
The process of canonical quantization is reexamined with the goal of
ensuring there is only one reality, where $\hbar>0$, in which classical
and quantum theories coexist. Two results are a clarification of the effect of
canonical coordinate transformations and of the role of Cartesian coordinates.
Other results provide validation
The
"psi-epistemic" view is that the quantum state does not represent a
state of the world, but a state of knowledge about the world. It is
motivated, in part, by the observation of qualitative similarities between
characteristic properties of non-orthogonal quantum wavefunctions and those of
overlapping classical probability distributions. It might be suggested
that this gives a natural explanation for these properties, which seem puzzling
for the alternative "psi-ontic" view. I will examine two such
The Wigner-Araki-Yanase (WAY) theorem delineates
circumstances under which a class of quantum measurements is ruled out.
Specifically, it states that any observable (given as a self-adjoint operator)
not commuting with an additive conserved quantity of the combined quantum system
and measuring apparatus admits no repeatable measurements. I'll review the
content of this theorem and present some new work which generalises and
strengthens the existing results.
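As a sketch of the standard statement: writing $\hat{A}$ for the system observable and $\hat{L} = \hat{L}_S \otimes I + I \otimes \hat{L}_A$ for an additive conserved quantity of system plus apparatus, the theorem says
$$ [\hat{A}, \hat{L}_S] \neq 0 \;\Longrightarrow\; \text{no exact, repeatable measurement of } \hat{A}. $$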
We
develop a theory for describing composite objects in physics. These can be
static objects, such as tables, or things that happen in spacetime (such as a
region of spacetime with fields on it regarded as being composed of smaller
such regions joined together). We propose certain fundamental axioms which, it
seems, should be satisfied in any theory of composition. A key axiom is the
order independence axiom which says we can describe the composition of a
A quantum steering ellipsoid may be used to faithfully represent
a density matrix ρ describing two qubits A and B. The ellipsoid is the
geometric set of states that Bob can steer Alice's qubit to when he implements
all possible measurements on his qubit. We show how the correlations between
qubits A and B manifest themselves in this paradigm, giving simple conditions
for when the state is entangled, or has discord. We will also present novel features
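A minimal numerical sketch of the construction described above: for each measurement direction on Bob's qubit, compute Alice's conditional (steered) state and its Bloch vector; sampling directions traces out the ellipsoid. The Werner-state example and parameter `p` are illustrative choices, not taken from the talk.

```python
import numpy as np

# Pauli matrices
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def steered_bloch_vector(rho, n):
    """Alice's Bloch vector after Bob projects his qubit onto the
    +1 eigenstate of n.sigma (ordering: rho = qubit A kron qubit B)."""
    proj = (I2 + n[0] * X + n[1] * Y + n[2] * Z) / 2
    # Partial trace over Bob's qubit of (I x proj) rho
    M = (np.kron(I2, proj) @ rho).reshape(2, 2, 2, 2)
    rho_a = M.trace(axis1=1, axis2=3)
    rho_a = rho_a / np.trace(rho_a)  # normalise the conditional state
    return np.real([np.trace(rho_a @ P) for P in (X, Y, Z)])

# Illustration: a Werner state p|Psi-><Psi-| + (1-p) I/4
p = 0.6
psi_minus = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
rho = p * np.outer(psi_minus, psi_minus.conj()) + (1 - p) * np.eye(4) / 4

# Sampling Bob's measurement directions traces out Alice's steering
# ellipsoid; for a Werner state it is a sphere of radius p.
rng = np.random.default_rng(0)
for _ in range(5):
    n = rng.normal(size=3)
    n /= np.linalg.norm(n)
    print(np.linalg.norm(steered_bloch_vector(rho, n)))  # ≈ 0.6 each time
```

For an entangled state the ellipsoid is genuinely three-dimensional; for a product state it degenerates to a single point, which is one way the correlations mentioned above become visible geometrically.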
Crudely formulated, the idea of neorealism, in the way that
Chris Isham and Andreas Doering use it, means that each theory of
physics, in its mathematical formulation, should share certain structural
properties of classical physics. These properties are chosen to allow some degree of
realism in the interpretation (for example, physical variables always have values).
Apart from restricting the form of physical theories, neorealism also
increases freedom in the shape of physical theories in another
The general boundary formulation (GBF) is an atemporal, but spacetime local formulation of quantum theory. Usually it is presented in terms of the amplitude formalism, which, in the presence of a background time, recovers the pure state formalism of the standard formulation of quantum theory. After reviewing the essentials of the amplitude formalism I will introduce a new "positive formalism", which recovers instead a mixed state formalism.