This series consists of talks in the area of Foundations of Quantum Theory. Seminar and group meetings will alternate.
The purpose of this talk is twofold: first, following Spekkens, to motivate noncontextuality as a natural principle one might expect to hold in nature; and second, to introduce operational noncontextuality inequalities motivated by a contextuality scenario first considered by Ernst Specker. These inequalities do not rely on the assumption of outcome determinism that is implicit in the usual Kochen-Specker (KS) inequalities. We argue that they are the appropriate generalization of KS inequalities, serving as a test for the possibility of noncontextual explanations of experimental data.
It is not unnatural to expect that difficulties lying at the foundations of quantum mechanics can only be resolved by literally going back and rethinking the quantum theory from first principles (namely, the principles of logic). In this talk, I will present a first-order quantum logic which generalizes both the propositional quantum logic originated by Birkhoff and von Neumann and the standard classical predicate logic used in the development of virtually all of modern mathematics.
On the face of it, quantum physics is nothing like classical physics. Despite its oddity, work in the foundations of quantum theory has provided some palatable ways of understanding this strange quantum realm. Most of our best theories take that story to include the existence of a very non-classical entity: the wave function. Here I offer an alternative which combines elements of Bohmian mechanics and the many-worlds interpretation to form a theory in which there is no wave function.
We present a method for determining the maximum violation of any linear Bell inequality permitted by quantum mechanics. Essentially this amounts to a constrained optimization problem for an observable's eigenvalues, but the problem can be reformulated so as to be analytically tractable. This opens the door to an arbitrarily precise characterization of quantum correlations, including allowing for non-random marginal expectation values. Such a characterization is critical when contrasting quantum mechanics with superficially similar general probabilistic theories.
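As an elementary numerical illustration (not the general method described in the abstract), one can recover the maximal quantum violation of the simplest linear Bell inequality, CHSH, as the largest eigenvalue of the Bell operator for measurement settings known to be optimal:

```python
import numpy as np

# Pauli matrices
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Measurement settings known to achieve the maximal CHSH violation
A0, A1 = Z, X
B0 = (Z + X) / np.sqrt(2)
B1 = (Z - X) / np.sqrt(2)

# CHSH Bell operator A0(B0 + B1) + A1(B0 - B1) on the two-qubit space
bell = (np.kron(A0, B0) + np.kron(A0, B1)
        + np.kron(A1, B0) - np.kron(A1, B1))

# The maximal quantum value is the largest eigenvalue of the Bell operator
max_violation = np.linalg.eigvalsh(bell).max()
print(max_violation)  # ≈ 2*sqrt(2) ≈ 2.828, the Tsirelson bound
```

This reproduces the Tsirelson bound 2√2, compared with the classical bound of 2.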
The standard formulation of quantum mechanics is operationally asymmetric with respect to time reversal: in the language of compositions of tests, tests in the past can influence the outcomes of tests in the future, but not the other way around. The question of whether this represents a fundamental asymmetry or is merely an artifact of the formulation is not a new one, but even though various arguments in favor of an inherent symmetry have been made, no complete time-symmetric formulation expressed in rigorous
Although it can be argued to have become consequential only in the study of quantum cosmology, the question ``Why do we observe a classical world?'' has been one of the biggest preoccupations of quantum foundations. In the consistent histories formalism, the question is shifted to an analysis of the telltale sign of quantum mechanics: superposition of states. In this formalism, histories of the system which ``decohere'', i.e. fall out of superposition or have negligible
In systems described by Ising-like Hamiltonians, such as spin lattices, the Bell inequality can be strongly violated. Surprisingly, these systems are both local and non-superdeterministic. They are local because 1) they include only local, nearest-neighbor interactions, 2) they accordingly satisfy the Clauser-Horne factorability condition, and 3) they can violate the Bell inequality also in dynamic Bell experiments. Starting from this result we construct an elementary
Coalgebras are a flexible tool commonly used in computer science to model abstract devices and systems. Coalgebraic models also come with a natural notion of logics for the systems being modelled. In this talk we will introduce coalgebras and aim to illustrate their usefulness for modelling physical systems. Extending earlier work of Abramsky, we will show how a weakening of the usual morphisms for coalgebras provides the flexibility to model quantum systems in an easily motivated manner.
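As a generic illustration of the coalgebraic viewpoint (not the quantum construction of the talk), a deterministic device with observable output is a coalgebra for the functor F(X) = A × X, i.e. a map assigning to each state an observation and a successor state; unfolding it yields the device's observable behaviour:

```python
from typing import Callable, List, Tuple, TypeVar

S = TypeVar("S")  # state space of the device
A = TypeVar("A")  # observations

# A coalgebra for the functor F(X) = A x X is just a map S -> A x S:
# each state yields an observation together with a successor state.
Coalgebra = Callable[[S], Tuple[A, S]]

def observe(c: Coalgebra, s: S, n: int) -> List[A]:
    """Unfold the coalgebra n steps from state s, collecting the
    observable behaviour (the first n outputs of the device)."""
    outputs = []
    for _ in range(n):
        a, s = c(s)
        outputs.append(a)
    return outputs

# Example device: a mod-3 counter whose observation is its current value
counter: Coalgebra = lambda k: (k, (k + 1) % 3)
print(observe(counter, 0, 5))  # [0, 1, 2, 0, 1]
```

Two states are behaviourally equivalent precisely when `observe` agrees on them for every n, which is the notion of equivalence the coalgebraic logics mentioned above characterize.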
We describe a notion of state for a quantum system which is given in terms of a collection of empirically realizable probability distributions and is formally analogous to the familiar concept of state from classical statistical mechanics. We first demonstrate the mathematical equivalence of this new notion to the standard quantum notion of density matrix. We identify the simple logical consistency condition (a generalization of the familiar no-signalling condition) which a collection of distributions must obey in order to reconstruct the unique quantum state from which they arise.
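For a single qubit, a sketch in this spirit is the familiar tomographic reconstruction: the state is determined by the outcome distributions of the three Pauli measurements, and the consistency condition reduces to the Bloch vector lying inside the Bloch ball. This toy version assumes Pauli measurements and is not the paper's general construction:

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def qubit_state_from_distributions(p_x, p_y, p_z):
    """Reconstruct a qubit density matrix from the probability of the
    +1 outcome in each of the three Pauli measurements."""
    ex, ey, ez = (2 * p - 1 for p in (p_x, p_y, p_z))
    # Consistency condition: the Bloch vector must fit in the Bloch ball
    assert ex**2 + ey**2 + ez**2 <= 1 + 1e-9, \
        "no quantum state produces these distributions"
    return (I2 + ex * X + ey * Y + ez * Z) / 2

# Example: the distributions produced by the |+> state
rho = qubit_state_from_distributions(p_x=1.0, p_y=0.5, p_z=0.5)
print(np.round(rho.real, 3))  # [[0.5, 0.5], [0.5, 0.5]]
```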
Quantum observables
are commonly described by self-adjoint operators on a Hilbert space H. I will
show that one can equivalently describe observables by real-valued functions on
the set P(H) of projections, which we call q-observable functions. If one regards
a quantum observable as a random variable, the corresponding q-observable
function can be understood as a quantum quantile function, generalising the
classical notion. I will briefly sketch how q-observable functions relate to
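Assuming the definition o_A(P) = inf{λ : P ≤ E_λ}, where (E_λ) is the spectral family of A, a finite-dimensional sketch of evaluating such a q-observable function looks like this (the function and example observable here are illustrative, not taken from the talk):

```python
import numpy as np

def q_observable(A, P, tol=1e-9):
    """Evaluate o_A(P): the smallest eigenvalue lam of A such that
    P <= E_lam, where E_lam projects onto the eigenspaces of A with
    eigenvalue <= lam."""
    evals, evecs = np.linalg.eigh(A)
    for lam in np.unique(np.round(evals, 9)):
        cols = evecs[:, evals <= lam + tol]  # basis of ran(E_lam)
        E = cols @ cols.conj().T             # spectral projection E_lam
        if np.allclose(E @ P, P, atol=tol):  # P <= E_lam iff E_lam P = P
            return lam
    raise ValueError("no spectral projection dominates P")

Z = np.diag([1.0, -1.0])    # a qubit observable with outcomes +1, -1
P0 = np.diag([1.0, 0.0])    # projection onto the +1 eigenvector
print(q_observable(Z, P0))  # 1.0
```

Restricting o_A to the projections E_μ of a state's spectral family recovers an ordinary quantile (inverse-CDF) computation, which is the sense in which these functions generalise the classical notion.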