Since 2002, Perimeter Institute has been recording seminars, conference talks, and public outreach events using video cameras installed in our lecture theatres. Perimeter now has 7 formal presentation spaces for its many scientific conferences, seminars, workshops, and educational outreach activities, all with advanced audio-visual technical capabilities. Recordings of events in these areas are all available on demand from this Video Library and on the Perimeter Institute Recorded Seminar Archive (PIRSA). PIRSA is a permanent, free, searchable, and citable archive of recorded seminars from relevant bodies in physics. This resource has been partially modelled after Cornell University's arXiv.org.
David Deutsch re-formulated the Church-Turing thesis as a physical principle, asserting that "every finitely realizable physical system can be perfectly simulated by a universal model computing machine operating by finite means". Such a principle can be regarded as a new theoretical paradigm, whereby the whole of physics emerges from a quantum computation. But for a theory to be a good one, it must explain a large class of phenomena on the basis of a few general principles.
A central question in our understanding of the physical world is how our knowledge of the whole relates to our knowledge of the individual parts. One aspect of this question is the following: to what extent does ignorance about a whole preclude knowledge of at least one of its parts? Relying purely on classical intuition, one would certainly be inclined to conjecture that a strong ignorance of the whole cannot come without significant ignorance of at least one of its parts. Indeed, we show that this reasoning holds in any non-contextual hidden variable model (NC-HV).
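As a minimal illustration of that classical intuition (my own framing, not necessarily the uncertainty measure used in the talk), Shannon entropy and subadditivity already capture it:

$$ H(XY) \le H(X) + H(Y) \le 2\max\{H(X), H(Y)\}, \qquad \text{so} \qquad \max\{H(X), H(Y)\} \ge \tfrac{1}{2} H(XY), $$

i.e. if the whole XY is very uncertain, at least one of the parts must carry at least half of that uncertainty.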
We will explore generalisations of the Shannon and von Neumann entropy to other probabilistic theories, and their connection to the principle of information causality. We will also investigate the link between information causality and non-local games, leading to a new quantum bound on computing the inner product non-locally.
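For reference (standard definitions, not specific to this talk), the two entropies being generalised are

$$ H(X) = -\sum_{x} p(x)\,\log p(x), \qquad S(\rho) = -\operatorname{Tr}(\rho \log \rho), $$

where p is a probability distribution and \rho a density operator; the question is which of their properties survive in more general probabilistic theories.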
In 1964, John Bell proved that independent measurements on entangled quantum states lead to correlations that cannot be reproduced using local hidden variables. The core of his proof is that such distributions violate logical constraints known as Bell inequalities. This remarkable result establishes the non-locality of quantum physics. Bell's approach, however, is purely qualitative, which naturally leads to the question of quantifying this non-locality. We will consider two quantities introduced specifically for this purpose.
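A standard example (given here for context; not necessarily one of the two quantities discussed in the talk) is the CHSH expression, where E(a,b) denotes the correlation of outcomes for measurement settings a and b:

$$ S = E(a,b) + E(a,b') + E(a',b) - E(a',b'), \qquad |S| \le 2 \ \text{(local hidden variables)}, \qquad |S| \le 2\sqrt{2} \ \text{(quantum, Tsirelson's bound)}. $$

The amount by which quantum correlations can exceed the local bound is one natural way to quantify non-locality.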
We present "guess your neighbor's input" (GYNI), a multipartite nonlocal task in which each player must guess the input received by his neighbor. We show that quantum correlations do not perform better than classical ones at this task, for any prior distribution of the inputs. There exist, however, input distributions for which general no-signalling correlations can outperform classical and quantum correlations. Some of the Bell inequalities associated to our construction correspond to facets of the local polytope.
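As a toy illustration of the classical side of such a task (a sketch under my own simplifying assumptions: three players, uniform independent input bits rather than the particular priors considered in the talk, and hypothetical function names), one can brute-force all deterministic local strategies:

from itertools import product

# Best classical winning probability for a 3-player "guess your neighbor's
# input" game with uniform, independent input bits. Player i sees bit x_i and
# must guess x_{i+1} (indices cyclic); the game is won only if all three
# guesses are correct. Shared randomness cannot beat the best deterministic
# strategy, so scanning deterministic strategies gives the classical value.

STRATEGIES = list(product([0, 1], repeat=2))  # a strategy f encoded as (f(0), f(1))

def win_probability(f1, f2, f3):
    wins = 0
    for x1, x2, x3 in product([0, 1], repeat=3):
        if f1[x1] == x2 and f2[x2] == x3 and f3[x3] == x1:
            wins += 1
    return wins / 8  # 8 equally likely input triples

best = max(win_probability(f1, f2, f3)
           for f1 in STRATEGIES for f2 in STRATEGIES for f3 in STRATEGIES)
print(best)  # 0.25: e.g. each player simply forwarding their own bit is optimal

For this uniform prior the classical value is 1/4, and the abstract's claim that quantum correlations offer no advantage applies in particular here; the interesting gap opens up only for no-signalling correlations under suitable input distributions.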
Higher loop amplitudes and non-minimal formalism
According to quantum theory, it is impossible to prepare the state of a system such that the outcome of any projective measurement on the system can be predicted with certainty. This limitation of predictive power, known as the uncertainty principle, is one of the main distinguishing properties of quantum theory when compared to classical theories. In this talk, I will discuss the implications of this principle for foundational questions.
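One standard quantitative form of this limitation (given here for context; the talk may use a different formulation) is the entropic uncertainty relation of Maassen and Uffink: for two observables with eigenbases \{|q\rangle\} and \{|r\rangle\},

$$ H(Q) + H(R) \ge \log_2 \frac{1}{c}, \qquad c = \max_{q,r} |\langle q | r \rangle|^2, $$

so for complementary qubit measurements (c = 1/2) the two outcome distributions cannot together carry less than one bit of uncertainty, whatever the prepared state.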
Entanglement provides a coherent view of the physical origin of randomness and the growth and decay of correlations, even in macroscopic systems exhibiting few traditional quantum hallmarks. It helps explain why the future is more uncertain than the past, and how correlations can become macroscopic and classical by being redundantly replicated throughout a system's environment. The most private information, exemplified by a quantum eraser experiment, exists only transiently: after the experiment is over no record remains anywhere in the universe of what "happened".
The quantum mechanical state vector is a complicated object. In particular, the amount of data that must be given in order to specify the state vector (even approximately) increases exponentially with the number of quantum systems. Does this mean that the universe is, in some sense, exponentially complicated? I argue that the answer is yes, if the state vector is a one-to-one description of some part of physical reality. This is the case according to both the Everett and Bohm interpretations.
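To make the scaling concrete (standard counting, not specific to this talk): a pure state of n qubits is specified by 2^n complex amplitudes,

$$ |\psi\rangle = \sum_{x \in \{0,1\}^n} \alpha_x |x\rangle, \qquad \sum_x |\alpha_x|^2 = 1, $$

so already for n = 300 one needs about 2^{300} \approx 2 \times 10^{90} amplitudes, more numbers than the roughly 10^{80} atoms in the observable universe.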