This series consists of talks in the area of Foundations of Quantum Theory. Seminar and group meetings will alternate.
Distinguishing between classical and non-classical models of nature requires a good notion of classicality. I will argue that noncontextuality is a good candidate for this notion. Until now, certain theoretical and experimental roadblocks have stood in the way of a test of noncontextuality that is free of unattainable experimental idealizations. I will present solutions to these roadblocks, as well as the results of an experimental test.
One way to avoid the measurement problem in quantum mechanics is to provide a clear ontology. Such an ontology is provided, for instance, by Bohmian mechanics. In the non-relativistic regime, Bohmian mechanics is a theory about particles whose motion is governed by a velocity field. The latter is generated by a wave function.
Using quantum control in foundational experiments opens up new theoretical and experimental possibilities. We show how, for example, quantum controlling devices reverse a temporal ordering in detection. We consider probing wave–particle duality in quantum-controlled and entanglement-assisted delayed-choice experiments. We then discuss other situations where quantum control may be useful, and finally demonstrate how the techniques we developed apply to studying the consistency of classically reasonable requirements.
It is well known - to those who know it - that noise and randomness can enhance signal resolution. I'll present an easy-to-follow example from digital audio that illustrates how adding noise ("dither") prior to measurement enhances the accuracy with which we are able to distinguish the features of a sound or image. I will then explore how the environmental interactions prior to measurement, ordinarily characterized as environment-induced decoherence, may play a similar role.
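The dithering effect described above can be sketched in a few lines. The example below is a minimal illustration, not taken from the talk: a coarse quantizer (one "step" of resolution) always rounds a small constant signal to zero, but adding uniform dither of one step's width before quantization makes the average of many readings converge to the true value. The function names and parameters are invented for illustration.

```python
import random

def quantize(x, step=1.0):
    """Round x to the nearest multiple of step (a crude ADC)."""
    return step * round(x / step)

def mean_estimate(value, dither_amp=0.0, n=10000, step=1.0, seed=0):
    """Average many quantized readings of the same value, optionally
    adding uniform dither in [-dither_amp, +dither_amp] beforehand."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        total += quantize(value + rng.uniform(-dither_amp, dither_amp), step)
    return total / n

true_value = 0.3  # smaller than the quantizer step of 1.0

# Without dither, every reading rounds to 0.0 and averaging cannot help.
plain = mean_estimate(true_value)

# With uniform dither spanning one quantizer step (+/- 0.5), the
# probability of rounding up equals the sub-step signal level, so the
# average of many readings converges toward the true value.
dithered = mean_estimate(true_value, dither_amp=0.5)
```

The point is that the noise randomizes the rounding decision in proportion to the signal, turning a deterministic quantization error into an unbiased statistical one that averages away.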
We present a formal logic modeling some aspects of the behavior of the quantum measurement process, and study properties of the models of this logic, from which we deduce characteristics that any such model must satisfy. In the case of a Hilbert space of dimension at least 3, we then show that no model can lead to the prediction with certainty of more than one atomic outcome. Moreover, if the Hilbert space is finite-dimensional, we can precisely describe the structure of the predictions of any model of our logic.
This talk touches on three questions regarding the ontological status of quantum states, using the ontological models framework: it is assumed that a physical system has some underlying ontic state and that quantum states correspond to probability distributions over these ontic states.
The last decade has seen a wave of characterizations of quantum theory using the formalism of generalized probability theory.
After a brief motivation of this question, the presentation is divided into two parts. We first introduce the principle of quantum information causality, which bounds the amount of quantum information that a transmitted quantum system can communicate as a function of its Hilbert space dimension, independently of any quantum physical resources previously shared by the communicating parties.
The Church-Turing thesis is one of the pillars of computer science; it postulates that every classical system has computational power equivalent to that of the so-called Turing machine. While this thesis is crucial for our understanding of computing devices, its implications in other scientific fields have hardly been explored. What if we consider the Church-Turing thesis as a law of nature?
Pure states and pure transformations play a crucial role in most of the recent reconstructions of quantum theory. In the framework of general probabilistic theories, purity is defined in terms of probabilistic mixtures and bears an intuitive interpretation of "maximal knowledge" of the state of the system, or of the evolution undergone by it. On the other hand, many quantum features do not need the probabilistic structure of the theory.