This series consists of talks in the area of Foundations of Quantum Theory. Seminar and group meetings will alternate.
The Church-Turing thesis is one of the pillars of computer science; it postulates that every classical system has computational power equivalent to that of the so-called Turing machine. While this thesis is crucial for our understanding of computing devices, its implications in other scientific fields have hardly been explored. What if we consider the Church-Turing thesis as a law of nature?
I will present a new approach to information-theoretic foundations of quantum theory that does not rely on probability theory, spectral theory, or Hilbert spaces. The direct nonlinear generalisations of quantum kinematics and dynamics are constructed using quantum information geometric structures over algebraic states of W*-algebras (quantum relative entropies and Poisson structure). In particular, unitary evolutions are generalised to nonlinear Hamiltonian flows, while Lüders' rules are generalised to constrained relative entropy maximisations.
Interferometers capture a basic mystery of quantum mechanics: a single particle can exhibit wave behavior, yet that wave behavior disappears when one tries to determine the particle's path inside the interferometer. This idea has been formulated quantitatively as an inequality, e.g., by Englert and by Jaeger, Shimony, and Vaidman, which upper bounds the sum of the squares of the interference visibility and the path distinguishability. Such wave-particle duality relations (WPDRs) are often thought to be conceptually inequivalent to Heisenberg's uncertainty principle, although this has been debated.
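For orientation, the best-known duality relation of this kind, in the standard notation where $\mathcal{V}$ denotes the fringe visibility and $\mathcal{D}$ the path distinguishability, takes the form

$$\mathcal{D}^2 + \mathcal{V}^2 \le 1,$$

so that perfect which-path information ($\mathcal{D} = 1$) forces the interference fringes to vanish ($\mathcal{V} = 0$), and full visibility ($\mathcal{V} = 1$) forces the paths to be completely indistinguishable ($\mathcal{D} = 0$).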
In quantum theory, people have thought for some time about the problem of how to estimate the decoherence of a quantum channel from classical data gained in measurements. Applications of these developments include security criteria for quantum key distribution and tests of decoherence models. In this talk, I will present some ideas for how to interpret the same classical data to make statements about decoherence in cases where nature is not necessarily described by quantum theory. This is work in progress in collaboration with many people.
The role of measurement-induced disturbance in weak measurements is of central importance for the interpretation of the weak value. Uncontrolled disturbance can interfere with the postselection process and make the weak value dependent on the details of the measurement process. A priori, it is not clear what the correct notion of disturbance should be in the context of weak measurements. Here we develop the concept of a generalized weak measurement for classical and quantum mechanics. The two cases appear remarkably similar, but we point out some important differences.
A persistent mystery of quantum theory is whether it admits an interpretation that is realist, self-consistent, model-independent, and unextravagant in the sense of featuring neither multiple worlds nor pilot waves. In this talk, I will present a new interpretation of quantum theory -- called the minimal modal interpretation (MMI) -- that aims to meet these conditions while also hewing closely to the basic structure of the theory in its widely accepted form.
Weak measurement is increasingly acknowledged as an important theoretical and experimental tool. Weak values, the results of weak measurements, are often used to understand seemingly paradoxical quantum behavior. Until now, however, it was not known how to perform a weak non-local measurement of a general operator. Such a procedure is necessary if we are to take the associated 'weak values' seriously as a physical quantity. We propose a novel scheme for performing non-local weak measurement which is based on the principle of quantum erasure.
I will outline a new topological foundation for computation, and show how it gives rise to a unified treatment of classical encryption and quantum teleportation, and a strong classical model for many quantum phenomena. This work connects to some other interesting topics, including quantum field theory, classical combinatorics, thermodynamics, Morse theory and higher category theory, which I will introduce in an elementary way.
There has been renewed interest in the effect that pre- and postselection have on the foundations of quantum theory. Often, but not solely, in conjunction with weak measurement, pre- and postselection scenarios are said to simultaneously create and resolve paradoxes. These paradoxes are said to be profound quandaries which bring us closer to resolving the mysteries of the quantum. Here I will show that the same effects are present in classical physics when postselection and disturbance are allowed.
A fundamental question in trying to understand the world -- be it classical or quantum -- is why things happen. We seek a causal account of events, and merely noting correlations between them does not provide a satisfactory answer. Classical statistics provides a better alternative: the framework of causal models proved itself a powerful tool for studying causal relations in a range of disciplines. We aim to adapt this formalism to allow for quantum variables and in the process discover a new perspective on how causality is different in the quantum world.