This series consists of talks in the area of Foundations of Quantum Theory. Seminar and group meetings will alternate.
The distillation of quantum resources such as entanglement and coherence is one of the most fundamental protocols in quantum information and is of outstanding operational significance. However, it is often characterized in the idealized asymptotic limit, where an unbounded number of independent and identically distributed copies of a quantum system are available.
The methodology employed in reconstructing quantum theory involves defining a general mathematical framework that frames a landscape of possible theories and then positing principles that uniquely pick out quantum theory. In contrast, many traditional interpretations of quantum theory consider only quantum theory, not a larger space of possible theories. I will defend the modal methodology used in reconstruction by tracing the historical roots of Einstein’s distinction between principle and constructive theories.
Hidden-variables theories account for quantum mechanics in terms of a particular 'equilibrium' distribution of underlying parameters corresponding to the Born rule. A natural question to ask is whether the theory is stable under small perturbations away from equilibrium. We compare and contrast two examples: de Broglie's 1927 pilot-wave theory and Bohm's 1952 reformulation thereof. It is well established that in de Broglie's dynamics initial deviations from equilibrium will relax.
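For concreteness, the notions of equilibrium and relaxation invoked above can be stated in standard pilot-wave notation (a sketch of the usual definitions, not the speaker's own formulas):

```latex
% Quantum equilibrium: the distribution of configurations matches the Born rule
\rho(x,t) = |\psi(x,t)|^{2}

% de Broglie's guidance equation for a particle of mass m,
% writing the wave function in polar form \psi = R\, e^{iS/\hbar}
\frac{dx}{dt} = \frac{\nabla S(x,t)}{m}

% Both \rho and |\psi|^{2} obey the same continuity equation along the flow,
% so the ratio f = \rho / |\psi|^{2} is conserved along trajectories;
% initial deviations f \neq 1 can relax toward f = 1 at a coarse-grained level.
```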
In this talk I show how to systematically classify all possible alternatives to the measurement postulates of quantum theory. All alternative measurement postulates are in correspondence with a representation of the unitary group. I will discuss composite systems in these alternative theories and show that they violate two operational properties: purification and local tomography. This shows that one can derive the measurement postulates of quantum theory from either of these properties. I will discuss the relevance of this result to the field of general probabilistic theories.
In this talk I will discuss recently identified classes of quantum correlations that go beyond nonlocal classical hidden-variable models equipped with communication. First, in the bipartite scenario, I will focus on so-called instrumental causal networks, a central tool in causal inference. There, I will show that it is possible to “fake” classical causal influences with quantum common causes, in a formal sense quantified by the average causal effect (ACE).
We will discuss recent trapping and cooling experiments with optically levitated nanoparticles [1]. We will report on the simultaneous cooling of all translational motional degrees of freedom of a single trapped silica particle to 1 mK, at a vacuum of 10^{-5} mbar, using a parabolic mirror. We will further report on the squeezing of a thermal motional state of the trapped particle by rapid switching of the trap frequency [2].
We discuss the role of contextuality within quantum fluctuation theorems, in the light of a recent no-go result by Perarnau et al. We show that any fluctuation theorem reproducing the two-point measurement scheme for classical states either admits a notion of work quasi-probability or fails to describe protocols exhibiting contextuality.
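As background, the two-point measurement (TPM) scheme referred to above is standardly defined as follows (the usual textbook form, stated here for orientation rather than as the paper's own notation):

```latex
% Two-point measurement scheme: projectively measure energy before and after
% a unitary drive U; work is the difference of the two measured energies.
P(w) \;=\; \sum_{m,n} p(m)\, p(n \mid m)\,
           \delta\!\bigl(w - (E_n^{\mathrm{f}} - E_m^{\mathrm{i}})\bigr)

% First-measurement probability, with initial energy projectors \Pi_m^{\mathrm{i}}:
p(m) \;=\; \operatorname{Tr}\!\bigl[\Pi_m^{\mathrm{i}}\, \rho\bigr]

% Conditional probability of the second outcome, with final projectors \Pi_n^{\mathrm{f}}:
p(n \mid m) \;=\;
  \frac{\operatorname{Tr}\!\bigl[\Pi_n^{\mathrm{f}}\, U\, \Pi_m^{\mathrm{i}} \rho\, \Pi_m^{\mathrm{i}}\, U^{\dagger}\bigr]}{p(m)}
```

For states diagonal in the initial energy basis ("classical" states), the first measurement does not disturb the state, which is the regime any quantum fluctuation theorem is expected to reproduce.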
Conventional quantum processes are described by quantum circuits, which represent the evolution of system states from input to output. In this seminar we consider transformations of an input circuit to an output circuit, which then represent transformations of quantum evolutions. At this level, all processes complying with the admissibility conditions have, in principle, a physical realization scheme.
Quantum gravity has many conceptual problems. Amongst the most well-known is the "Problem of Time": gravitational observables are global in time, while we would really like to obtain probabilities for processes taking us from an observable at one time to another, later one. In this talk, relationalism will be the preferred strategy for tackling these questions.
Quantum mechanics can be seen as a set of instructions for calculating probabilities by associating mathematical objects with physical procedures, such as the preparation, manipulation, and measurement of a system. Quantum theory then yields probabilities that are neutral with respect to their use, e.g., in a Bayesian or a frequentist way. We investigate a different approach to quantum theory and physical theories in general, in which we aim for subjective predictions in the Bayesian sense. This gives a structure different from the operational framework of general probabilistic theories.