This series consists of talks in the area of Foundations of Quantum Theory. Seminar and group meetings will alternate.
Consider discrete physics with a minimal time step taken to be
tau. A time series of positions q,q',q'', ... has two classical
observables: position (q) and velocity (q'-q)/tau. They do not commute,
for observing position does not force the clock to tick, but observing
velocity does force the clock to tick. Thus if VQ denotes "first observe
position, then observe velocity" and QV denotes "first observe velocity,
then observe position", we have
VQ: (q'-q)q/tau
QV: q'(q'-q)/tau
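The two orderings above can be sketched numerically. This is a minimal illustration with arbitrary sample values (the function names VQ/QV follow the abstract; the particular values of tau, q, and q' are our own placeholders):

```python
# Sketch of the two measurement orderings on a discrete time series.
# Observing position does not tick the clock; observing velocity does.

def VQ(q, q_next, tau):
    """Observe position first (clock does not tick), then velocity (clock ticks)."""
    position = q
    velocity = (q_next - q) / tau
    return velocity * position            # (q'-q)q/tau

def QV(q, q_next, tau):
    """Observe velocity first (clock ticks), then position, which is now q'."""
    velocity = (q_next - q) / tau
    position = q_next
    return position * velocity            # q'(q'-q)/tau

tau, q, q_next = 0.5, 1.0, 2.0
print(VQ(q, q_next, tau))                       # 2.0
print(QV(q, q_next, tau))                       # 4.0
print(QV(q, q_next, tau) - VQ(q, q_next, tau))  # (q'-q)**2/tau = 2.0
```

The difference QV - VQ = (q'-q)^2/tau is nonzero whenever the position changes, which is the sense in which the two observables fail to commute.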
The start of the talk will be an outline of how the ordinary notions of quantum theory translate into the category of C*-algebras, where there are several possible choices of morphisms. The second half will relate this to a category of convex sets used as state spaces. Alfsen and Shultz have characterized the convex sets arising as state spaces of C*-algebras, and this result can be applied to obtain a categorical equivalence between C*-algebras and their state spaces, generalizing the equivalence between the Schroedinger and Heisenberg pictures.
There is now a remarkable mathematical theory of causation. But applying this theory to a Bell scenario implies the Bell inequalities, which are violated in experiment. We alleviate this tension by translating the basic definitions of the theory into the framework of generalised probabilistic theories. We find that a surprising number of results carry over: the d-separation criterion for conditional independence (the no-signalling principle on steroids), and even certain quantitative limits on correlations.
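The d-separation criterion mentioned above can be checked mechanically. Below is a minimal sketch (our own illustration, not the talk's formalism) using the ancestral-moralization formulation of d-separation, applied to a standard causal-DAG reading of the Bell scenario: settings X and Y, outcomes A and B, and a hidden common cause L — all names our own choices:

```python
# d-separation via the moralization criterion: restrict to the ancestral
# subgraph of xs ∪ ys ∪ zs, moralize (marry co-parents, drop directions),
# delete zs, and test whether any x remains connected to any y.
from itertools import combinations

def ancestors(dag, nodes):
    """Return the given nodes together with all their ancestors in the DAG."""
    result = set(nodes)
    changed = True
    while changed:
        changed = False
        for child, parents in dag.items():
            if child in result:
                for p in parents:
                    if p not in result:
                        result.add(p)
                        changed = True
    return result

def d_separated(dag, xs, ys, zs):
    """True iff xs is d-separated from ys given zs in the DAG
    (dag maps each node to its list of parents)."""
    keep = ancestors(dag, set(xs) | set(ys) | set(zs))
    adj = {v: set() for v in keep}        # undirected moral graph
    for child, parents in dag.items():
        if child not in keep:
            continue
        ps = [p for p in parents if p in keep]
        for p in ps:                      # parent-child edges
            adj[child].add(p); adj[p].add(child)
        for p, q in combinations(ps, 2):  # marry co-parents
            adj[p].add(q); adj[q].add(p)
    blocked = set(zs)                     # delete conditioning nodes
    frontier = [x for x in xs if x not in blocked]
    seen = set(frontier)
    while frontier:                       # search for a surviving x-y path
        v = frontier.pop()
        if v in ys:
            return False
        for w in adj[v]:
            if w not in blocked and w not in seen:
                seen.add(w)
                frontier.append(w)
    return True

# Bell scenario: settings X, Y; outcomes A, B; hidden common cause L.
bell = {"X": [], "Y": [], "L": [], "A": ["X", "L"], "B": ["Y", "L"]}

print(d_separated(bell, {"A"}, {"Y"}, {"X"}))       # no-signalling: True
print(d_separated(bell, {"A"}, {"B"}, {"X", "Y"}))  # correlated via L: False
```

The first query is the no-signalling condition read off from the graph: Alice's outcome is independent of Bob's setting given her own. The second shows that the outcomes remain connected through the common cause, which is exactly where the Bell inequalities enter.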
We introduce a new way of quantifying the degrees of incompatibility of two observables in a probabilistic physical theory and, based on this, a global measure of the degree of incompatibility inherent in such theories. This opens up a flexible way of comparing probabilistic theories with respect to the nonclassical feature of incompatibility. We show that quantum theory contains observables that are as incompatible as any probabilistic physical theory can have.
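In quantum theory, the incompatibility the abstract refers to shows up most simply as non-commuting observables. A minimal check (our own illustration, not the talk's quantitative measure) for two Pauli observables:

```python
import numpy as np

# Two maximally incompatible qubit observables (a standard example,
# not the measure defined in the talk).
sx = np.array([[0, 1], [1, 0]])    # Pauli X
sz = np.array([[1, 0], [0, -1]])   # Pauli Z

commutator = sx @ sz - sz @ sx
print(commutator)                  # nonzero matrix -> incompatible
print(np.allclose(commutator, 0))  # False
```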
Since the 1909 work of Carathéodory, an axiomatic approach to thermodynamics has gained ground which highlights the role of the binary relation of adiabatic accessibility between equilibrium states. A feature of Carathéodory's system is that the version therein of the second law contains an ambiguity about the nature of irreversible adiabatic processes, making it weaker than the traditional Kelvin-Planck statement of the law.
There has recently been much interest in finding simple principles that explain the particular sets of experimental probabilities that are possible with quantum mechanics in Bell-type experiments. In the quantum gravity community, similar questions had been raised about whether a certain generalisation of quantum mechanics allowed more than quantum mechanics in this regard. We now bring these two strands of work together to see what can be learned on both sides.
Central to quantum theory, the wavefunction is a complex distribution associated with a quantum system. Despite its fundamental role, it is typically introduced as an abstract element of the theory with no explicit definition. Rather, physicists come to a working understanding of it through its use in calculating measurement outcome probabilities via the Born Rule. Tomographic methods can reconstruct the wavefunction from measured probabilities.
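The Born Rule calculation mentioned above is a one-liner. A minimal sketch (our own toy example, not from the talk): outcome probabilities for a qubit measured in the computational basis.

```python
import numpy as np

# Example state vector (arbitrary choice for illustration).
psi = np.array([1.0, 1.0j]) / np.sqrt(2)

# Born Rule: p(k) = |<k|psi>|^2 for each computational-basis outcome k.
probs = np.abs(psi) ** 2
print(probs)   # [0.5 0.5]
```

Tomography runs this map in reverse: from many measured probability distributions in different bases, the state vector (up to global phase) is reconstructed.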
The status of the quantum state is perhaps the most controversial issue in the foundations of quantum theory. Is it an epistemic state (representing knowledge, information, or belief) or an ontic state (a direct reflection of reality)? In the ontological models framework, quantum states correspond to probability measures over more fundamental states of reality. The quantum state is then ontic if every pair of pure states corresponds to a pair of measures that do not overlap, and is otherwise epistemic.
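The ontic/epistemic distinction above reduces to whether two probability measures overlap. A toy sketch with finite ontic state spaces (the specific distributions are our own illustration):

```python
# Overlap of two distributions over a finite set of ontic states:
# sum_k min(p_k, q_k). Zero overlap for every pair of pure states
# means the quantum state is ontic; any nonzero overlap means epistemic.

def overlap(p, q):
    return sum(min(a, b) for a, b in zip(p, q))

p_ontic = [0.5, 0.5, 0.0, 0.0]   # disjoint supports -> ontic-style pair
q_ontic = [0.0, 0.0, 0.5, 0.5]
p_epist = [0.5, 0.5, 0.0, 0.0]   # shared support -> epistemic-style pair
q_epist = [0.0, 0.5, 0.5, 0.0]

print(overlap(p_ontic, q_ontic))   # 0.0
print(overlap(p_epist, q_epist))   # 0.5
```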
If a wave function does not describe microscopic reality then what does? Reformulating quantum mechanics in path-integral terms leads to a notion of "precluded event" and thence to the proposal that quantal reality differs from classical reality in the same way as a set of worldlines differs from a single worldline. One can then ask, for example, which sets of electron trajectories correspond to a Hydrogen atom in its ground state and how they differ from those of an excited state.
The purpose of this talk is twofold: first, following Spekkens, to motivate noncontextuality as a natural principle one might expect to hold in nature; and second, to introduce operational noncontextuality inequalities motivated by a contextuality scenario first considered by Ernst Specker. These inequalities do not rely on the assumption of outcome-determinism that is implicit in the usual Kochen-Specker (KS) inequalities. We argue that they are the appropriate generalization of KS inequalities, serving as a test for the possibility of noncontextual explanations of experimental data.