Since 2002, Perimeter Institute has been recording seminars, conference talks, and public outreach events using video cameras installed in our lecture theatres. Perimeter now has 7 formal presentation spaces for its many scientific conferences, seminars, workshops, and educational outreach activities, all with advanced audio-visual technical capabilities. Recordings of events in these areas are all available on demand from this Video Library and on the Perimeter Institute Recorded Seminar Archive (PIRSA). PIRSA is a permanent, free, searchable, and citable archive of recorded seminars from relevant bodies in physics. This resource has been partially modelled after Cornell University's arXiv.org.
The normalized-state spaces of finite-dimensional Jordan algebras constitute a relatively narrow class of convex sets that includes the finite-dimensional quantum mechanical and classical state spaces. Several beautiful mathematical characterizations of Jordan state spaces exist, notably Koecher's characterization as the bases of homogeneous self-dual cones, and Alfsen and Shultz's characterization based on the notion of spectral convex sets plus additional axioms.
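For readers less familiar with the structure invoked here, the following is a standard reminder (not drawn from the talk itself) of the Jordan product and of the special case that recovers the quantum state space:

```latex
\[
  a \circ b \;=\; \tfrac{1}{2}\,(ab + ba),
  \qquad
  (a \circ b) \circ a^{2} \;=\; a \circ \bigl(b \circ a^{2}\bigr).
\]
% A (real) Jordan algebra is a commutative algebra whose product satisfies
% the Jordan identity above. Example: the self-adjoint complex n x n
% matrices under a o b = (ab+ba)/2; their normalized states
%   \{\rho : \rho = \rho^{\dagger},\ \rho \ge 0,\ \operatorname{tr}\rho = 1\}
% are the density matrices, i.e. the finite-dimensional quantum state space
% referred to in the abstract.
```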
The Quantum Bayesianism of Caves, Fuchs and Schack presents a distinctive starting point from which to attack the problem of axiomatising - or reconstructing - quantum theory. However, many have worried that this starting point is itself already too radical. In this talk I will briefly introduce the position (it will be familiar to most, no doubt) and describe what I take to be its philosophical standpoint. More importantly, I shall seek to defend it from some bad objections, before going on to level some more substantive challenges.
Quantum Mechanics (QM) is a beautiful, simple mathematical structure---Hilbert spaces and operator algebras---with unprecedented predictive power across the whole physical domain. However, more than a century after its birth, we still don't have a "principle" from which to derive the mathematical framework. The situation is similar to that of the Lorentz transformations before the advent of the relativity principle.
Recent advances in quantum computation and quantum information theory have led to revived interest in, and cross-fertilisation with, foundational issues of quantum theory. In particular, it has become apparent that quantum theory may be interpreted as but a variant of the classical theory of probability and information. While the two theories may at first sight appear widely different, they actually share a substantial core of common properties; and their divergence can be reduced to a single attribute: their respective degrees of agent-dependency.
The starting point of the reconstruction process is a very simple quantum logical structure on which probability measures (states) and conditional probabilities are defined. This is a generalization of Kolmogorov's measure-theoretic approach to probability theory. In the general framework, the conditional probabilities need neither exist nor, where they do exist, be uniquely determined. Postulating their existence and uniqueness becomes the major step in the reconstruction process.
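As a point of comparison (not part of the general framework described above, which deliberately does not assume it), the standard Hilbert space model does supply a unique conditional probability, given by the Lüders rule for projections \(E, F\) and a density operator \(\rho\):

```latex
\[
  \mathbb{P}(F \mid E)
  \;=\;
  \frac{\operatorname{tr}\!\bigl(E\,\rho\,E\,F\bigr)}
       {\operatorname{tr}\!\bigl(\rho\,E\bigr)},
  \qquad \operatorname{tr}(\rho E) > 0 .
\]
% When E and F commute this reduces to the classical Kolmogorov ratio
% P(E and F)/P(E), which is the sense in which the framework generalizes
% measure-theoretic probability.
```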
In our approach, rather than aiming to recover the 'Hilbert space model' which underpins the orthodox quantum mechanical formalism, we start from a general 'pre-operational' framework, and investigate how much additional structure we need in order to describe a range of quantum phenomena. This also enables us to investigate which mathematical models, including more abstract categorical ones, can accommodate quantum theory.
It will be shown that the conventional (i.e. real or complex Hilbert space) model of quantum mechanics can be deduced from the indistinguishability of the simplest types of statistical mixtures. The result does not suffer from the low-dimension exclusions of the quantum logic approach.