This series consists of talks in the area of Foundations of Quantum Theory. Seminar and group meetings will alternate.
A new ensemble interpretation of quantum mechanics is proposed, according to which the ensemble associated with a quantum state really exists: it is the ensemble of all the systems in the same quantum state in the universe. Individual systems within the ensemble have microscopic states, described by beables. The probabilities of quantum theory turn out to be just ordinary relative-frequency probabilities in these ensembles.
We begin with a fundamental approach to quantum mechanics based on the unitary representations of the group of diffeomorphisms of physical space (and correspondingly, self-adjoint representations of a local current algebra). From these, various classes of quantum configuration spaces arise naturally.
Ideal measurements are described in quantum mechanics textbooks by two postulates: the collapse of the wave packet and Born's rule for the probabilities of outcomes. The quantum evolution of a system then has two components: a unitary (Hamiltonian) evolution in between measurements and a non-unitary one when a measurement is performed. This situation was considered unsatisfactory by many people, including Einstein, Bohr, de Broglie, von Neumann and Wigner, but the problem has remained unsolved to date.
I consider systems that consist of a few hot and a few cold two-level systems and define heat engines as unitaries that extract energy. These unitaries perform logical operations whose complexity depends on both the desired efficiency and the temperature quotient. I show cases where the optimal heat engine solves a hard computational task (e.g. an NP-hard problem). Heat engines can also drive refrigerators and use the temperature difference between two systems to cool a third one. I argue that these triples of systems define a classification of thermodynamic resources.
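The simplest example of a "unitary that extracts energy" in this setting is a SWAP between one hot and one cold two-level system. The sketch below is an illustration under assumed parameters, not material from the talk: it builds the Gibbs populations of two qubits (with hypothetical gaps and temperatures, Boltzmann constant set to 1) and checks numerically that the swap extracts energy when E_hot/T_hot < E_cold/T_cold.

```python
import math

def gibbs_excited(gap, temp):
    # Excited-state population of a two-level system with energy gap `gap`,
    # thermalized at temperature `temp` (units with k_B = 1).
    w = math.exp(-gap / temp)
    return w / (1.0 + w)

# Hypothetical parameters: hot qubit with a large gap, cold qubit with a small one.
E_hot, T_hot = 2.0, 10.0
E_cold, T_cold = 1.0, 1.0

p_hot = gibbs_excited(E_hot, T_hot)
p_cold = gibbs_excited(E_cold, T_cold)

# A SWAP unitary exchanges the two excited-state populations; the drop in
# total internal energy is the work extracted by the engine.
energy_before = E_hot * p_hot + E_cold * p_cold
energy_after = E_hot * p_cold + E_cold * p_hot
work = energy_before - energy_after
```

With these parameters the condition E_hot/T_hot < E_cold/T_cold holds, so `work` comes out positive; the more complex engines discussed in the talk involve correspondingly more complex logical operations on larger collections of hot and cold qubits.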
Usually, quantum theory (QT) is introduced by giving a list of abstract mathematical postulates, including the Hilbert space formalism and the Born rule. Even though the result is mathematically sound and in perfect agreement with experiment, there remains the question of why this formalism is a natural choice, and how QT could possibly be modified in a consistent way. My talk is on recent work with Lluis Masanes, where we show that five simple operational axioms actually determine the formalism of QT uniquely. This is based to a large extent on Lucien Hardy's seminal work.
We present a new formulation of quantum mechanics for closed systems like the universe using an extension of familiar probability theory that incorporates negative probabilities. Probabilities must be positive for alternative histories that are the basis of settleable bets. However, quantum mechanics also describes alternative histories that are not the basis for settleable bets, as in the two-slit experiment. These alternatives can be assigned extended probabilities that are sometimes negative. We will compare this with the decoherent (consistent) histories formulation of quantum theory.
The nature of antimatter is examined in the context of algebraic quantum field theory. It is shown that the notion of antimatter is more general than that of antiparticles. Properly speaking, then, antimatter is not matter made up of antiparticles --- rather, antiparticles are particles made up of antimatter. We go on to discuss whether the notion of antimatter is itself completely general in quantum field theory. Does the matter-antimatter distinction apply to all field theoretic systems?
Recently rediscovered results in the theory of partial differential equations show that, for free fields, the properties of the field in an arbitrarily small volume of space, traced through eternity, determine completely the field everywhere at all times. Over finite times, the field is determined in the entire region spanned by the intersection of the future null cone of the earliest event and the past null cone of the latest event. Thus this paradigm of classical field
Symmetric monoidal categories provide a convenient and enlightening framework within which to compare and contrast physical theories on a common mathematical footing. In this talk we consider two theories: stabiliser qubit quantum mechanics and the toy bit theory proposed by Rob Spekkens. Expressed in the categorical framework the two theories look very similar mathematically, reflecting their common physical features.
Quantum mechanics does not allow us to measure all possible combinations of observables on one system. Even in the simplest case of two observables, we know that measuring one of them changes the system in such a way that the other measurement will not give us the desired precise information about the state of the system.
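This trade-off is quantified by the Robertson uncertainty relation ΔA·ΔB ≥ |⟨[A,B]⟩|/2. As a minimal self-contained illustration (chosen here for concreteness, not taken from the abstract), the sketch below checks the relation for the incompatible spin observables σ_x and σ_y on the state |0⟩, where it happens to hold with equality.

```python
import math

# 2x2 complex matrices as nested lists; enough for a single spin-1/2.
sx = [[0, 1], [1, 0]]            # Pauli sigma_x
sy = [[0, -1j], [1j, 0]]         # Pauli sigma_y
psi = [1 + 0j, 0 + 0j]           # the state |0>

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def expval(M):
    # <psi|M|psi>
    v = [sum(M[i][j] * psi[j] for j in range(2)) for i in range(2)]
    return sum(psi[i].conjugate() * v[i] for i in range(2))

def stddev(M):
    # Uncertainty of observable M in the state psi.
    m = expval(M).real
    return math.sqrt(expval(matmul(M, M)).real - m * m)

# Commutator [sx, sy] = sx*sy - sy*sx (equals 2i * sigma_z).
comm = [[matmul(sx, sy)[i][j] - matmul(sy, sx)[i][j] for j in range(2)]
        for i in range(2)]

lhs = stddev(sx) * stddev(sy)    # product of the two uncertainties
rhs = abs(expval(comm)) / 2      # Robertson lower bound
```

Since [σ_x, σ_y] = 2iσ_z and ⟨σ_z⟩ = 1 in |0⟩, the bound is 1, and both uncertainties equal 1: measuring σ_x on |0⟩ yields ±1 at random, so no precise information about σ_y survives.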