This series consists of talks in the area of Foundations of Quantum Theory. Seminar and group meetings will alternate.
In the Scholium of Newton's Principia that contains the discussions of absolute space, time, and the bucket experiment, Newton also posed a problem that Julian Barbour has dubbed the "Scholium problem". Newton writes there: "But how we are to obtain the true motions from their causes, effects, and apparent differences, and the converse, shall be explained more at large in the following treatise. For to this end it was that I composed it." Newton clearly considered this problem very important: he claims that the Principia itself was composed for this purpose.
In this talk, I will demonstrate that correlations inconsistent with any locally causal description can be a generic feature of measurements on entangled quantum states. Specifically, spatially separated parties who perform local measurements on a maximally entangled state using randomly chosen measurement bases can, with significant probability, generate nonclassical correlations that violate a Bell inequality. For n parties sharing a Greenberger-Horne-Zeilinger state, this probability of violation rapidly tends to unity as the number of parties increases.
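As a rough illustration of the two-party case, here is a Monte Carlo sketch (my own, not taken from the talk: it assumes measurement directions drawn uniformly from the Bloch sphere and tests the CHSH inequality on a singlet, for which the correlator is E(a, b) = -a.b):

```python
import numpy as np

rng = np.random.default_rng(0)

def rand_unit(n):
    """n directions drawn uniformly from the Bloch sphere."""
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

N = 20000
a, a2, b, b2 = (rand_unit(N) for _ in range(4))  # two random settings per party
E = lambda x, y: -np.sum(x * y, axis=1)          # singlet correlator E(a, b) = -a.b
E11, E12, E21, E22 = E(a, b), E(a, b2), E(a2, b), E(a2, b2)
# CHSH value: best over the four choices of which correlator carries the minus sign
S = np.abs(np.stack([
    E11 + E12 + E21 - E22,
    E11 + E12 - E21 + E22,
    E11 - E12 + E21 + E22,
    -E11 + E12 + E21 + E22,
])).max(axis=0)
p_violation = (S > 2).mean()  # local causality requires S <= 2
print(f"fraction of random settings violating CHSH: {p_violation:.3f}")
```

The estimated fraction comes out as a sizeable constant for two qubits; the talk's claim concerns the analogous probability for n-party GHZ states, which approaches one as n grows.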
How can we describe a device that takes two unknown operational boxes and, conditionally on some input variable, connects them in different orders? To answer this question, I will introduce maps from transformations to transformations within operational probabilistic theories with purification, and show their characterisation in terms of operational circuits. I will then proceed to explore the hierarchy of maps on maps. A particular family of maps in the hierarchy are the
In this talk I will report on a recent work [arXiv:0908.1583], which investigates general probabilistic theories where every mixed state has a purification, unique up to reversible channels on the purifying system. The purification principle is equivalent to the existence of a reversible realization for every physical process, that is, to the fact that every physical process can be regarded as arising from the reversible interaction of the input system with an environment that is eventually discarded.
CMB measurements reveal a very smooth early universe. We propose a mechanism to make this smoothness natural by weakening the strength of gravity at early times, and therefore altering which initial conditions have low entropy.
For a quantum system with a d-dimensional Hilbert space, a symmetric informationally complete measurement (SIC) can be thought of as a set of d^2 pure states, any two of which have the same overlap. Constructions of SICs for composite systems usually do not make use of the composite structure but treat the system as a whole. Indeed, for some cases one can prove that a SIC cannot have the symmetry that one naturally associates with the composite structure.
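To make the definition concrete, here is a small check (my own illustration, not from the talk) of the d = 2 case, where the d^2 = 4 SIC states have Bloch vectors forming a regular tetrahedron and pairwise overlap 1/(d+1):

```python
import numpy as np

d = 2
w = np.exp(2j * np.pi / 3)  # cube root of unity
# The standard qubit SIC: four states whose Bloch vectors form a regular tetrahedron
states = [np.array([1.0, 0.0], dtype=complex)] + [
    np.array([1.0, np.sqrt(2) * w**k]) / np.sqrt(3) for k in range(3)
]
# Equal pairwise overlap: |<psi_i|psi_j>|^2 = 1/(d+1) for all i != j
overlaps = [abs(np.vdot(s, t)) ** 2 for i, s in enumerate(states)
            for t in states[i + 1:]]
assert np.allclose(overlaps, 1 / (d + 1))
# The d^2 rank-1 projectors, scaled by 1/d, form an informationally complete POVM
total = sum(np.outer(s, s.conj()) for s in states) / d
assert np.allclose(total, np.eye(d))
```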
In his brilliant article "Against 'Measurement'", John Bell famously argued that the word has had such a damaging effect on the discussion that it should now be banned altogether in quantum mechanics. But in the beginning was the word, and the word is still with us. Indeed, David Mermin responded, in "In Praise of Measurement", that within the field of quantum computer science the concept of measurement is precisely defined, unproblematic, and forms the foundation of the entire subject, a verdict reaffirmed by the development of measurement-based quantum computation.
A closer look at some proposed Gedanken-experiments on BECs promises to shed light on several aspects of reduction and emergence in physics. These include the relations between classical descriptions and different quantum treatments of macroscopic systems, and the emergence of new properties and even new objects as a result of spontaneous symmetry breaking.
Bruno de Finetti is one of the founding fathers of the subjectivist school of probability, where probabilities are interpreted as rational degrees of belief. His work on the relation between the theorems of the probability calculus and rationality is among the cornerstones of modern subjective probability theory. De Finetti maintained that rationality requires that an agent's degrees of belief be coherent.
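Coherence is standardly cashed out via the Dutch book argument: degrees of belief that violate the probability calculus expose the agent to a guaranteed loss. A toy numerical illustration (the numbers are my own, chosen only to violate additivity):

```python
# Incoherent degrees of belief: b(A) + b(not-A) > 1
b_A, b_not_A = 0.6, 0.6

# A $1 ticket on event E pays 1 if E occurs; the agent deems price b(E) fair,
# so a bookie can sell the agent one ticket on A and one on not-A.
cost = b_A + b_not_A          # agent pays 1.2 in total
for A_occurs in (True, False):
    payoff = 1.0              # exactly one of the two tickets pays out
    net = payoff - cost       # negative in every possible world: a Dutch book
    assert net < 0
print(f"guaranteed loss: {cost - 1.0:.2f}")
```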
Instrumentalism about the quantum state is the view that this mathematical object does not serve to represent a component of (not directly observable) reality, but is rather a device solely for making predictions about the results of experiments. One honest way to be such an instrumentalist is (a) to take an ensemble view (i.e., frequentism about quantum probabilities), whereby the state represents predictions for measurement results on ensembles of systems, but not individual systems, and (b) to fix some specific level for the quantum/classical cut.