Since 2002 Perimeter Institute has been recording seminars, conference talks, and public outreach events using video cameras installed in our lecture theatres. Perimeter now has 7 formal presentation spaces for its many scientific conferences, seminars, workshops and educational outreach activities, all with advanced audio-visual technical capabilities. Recordings of events in these areas are all available on demand from this Video Library and from the Perimeter Institute Recorded Seminar Archive (PIRSA). PIRSA is a permanent, free, searchable, and citable archive of recorded physics seminars, partially modelled after Cornell University's arXiv.org.
Entanglement provides a coherent view of the physical origin of randomness and the growth and decay of correlations, even in macroscopic systems exhibiting few traditional quantum hallmarks. It helps explain why the future is more uncertain than the past, and how correlations can become macroscopic and classical by being redundantly replicated throughout a system's environment. The most private information, exemplified by a quantum eraser experiment, exists only transiently: after the experiment is over no record remains anywhere in the universe of what "happened".
The quantum mechanical state vector is a complicated object. In particular, the amount of data that must be given in order to specify the state vector (even approximately) increases exponentially with the number of quantum systems. Does this mean that the universe is, in some sense, exponentially complicated? I argue that the answer is yes, if the state vector is a one-to-one description of some part of physical reality. This is the case according to both the Everett and Bohm interpretations.
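The exponential growth described above can be made concrete with a minimal sketch: an n-qubit pure state requires 2^n complex amplitudes, so the data needed to record the state vector (even approximately) quickly outstrips any classical storage. The 16-bytes-per-amplitude figure below assumes double-precision complex numbers and is illustrative only.

```python
# Sketch of the exponential scaling of state-vector data with system size.
# An n-qubit state vector has 2**n complex amplitudes; at 16 bytes per
# amplitude (double-precision complex), the memory needed grows likewise.

def amplitudes(n_qubits: int) -> int:
    """Number of complex amplitudes in an n-qubit state vector."""
    return 2 ** n_qubits

def bytes_needed(n_qubits: int) -> int:
    """Illustrative storage cost, assuming 16 bytes per amplitude."""
    return amplitudes(n_qubits) * 16

for n in (1, 10, 50):
    print(f"{n:2d} qubits: {amplitudes(n)} amplitudes, {bytes_needed(n)} bytes")
```

For 50 qubits this already exceeds a petabyte, which is the sense in which the state vector, taken as a one-to-one description of reality, makes the universe exponentially complicated.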
Non-contextuality is presented as both an abstraction and a generalisation of locality. Rather than in correlations, the underlying physical model leaves its signature in collections of expectation values, which are constrained by inequalities much like Bell's or Tsirelson's inequalities. These non-contextual inequalities reveal a deep connection to classic topics in graph theory, such as independence numbers, Lovász numbers and other graph parameters.
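The graph parameters mentioned above can be illustrated on the 5-cycle, the exclusivity graph associated with the KCBS non-contextuality inequality: its independence number (2) gives the non-contextual bound, while its Lovász number (sqrt(5) ≈ 2.236) gives the quantum bound. A brute-force sketch of the independence number, fine for tiny graphs, might look like:

```python
from itertools import combinations

def independence_number(vertices, edges):
    """Largest subset of vertices containing no edge (brute force,
    suitable only for very small graphs)."""
    edge_set = {frozenset(e) for e in edges}
    for size in range(len(vertices), 0, -1):
        for subset in combinations(vertices, size):
            if all(frozenset(pair) not in edge_set
                   for pair in combinations(subset, 2)):
                return size
    return 0

# The 5-cycle C5: exclusivity graph of the KCBS inequality.
c5_edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
print(independence_number(range(5), c5_edges))  # → 2
```

Computing the Lovász number itself requires semidefinite programming and is beyond this sketch, but the gap between 2 and sqrt(5) is exactly the quantum violation of the corresponding inequality.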
Tripartite quantum systems exhibit interesting features that are absent in bipartite ones. Several instances are classics by now: the GHZ argument, the W state, the UPB bound entangled states, Svetlichny inequalities... In this talk, I shall discuss some ongoing research projects that we are pursuing in my group (in collaboration, or in friendly competition, with other groups) and that involve tripartite entanglement or non-locality: * Activation of non-locality in networks. * Device-independent assessment of the entangling power of a measurement.
In my talk I raise the question of the fundamental limits to the size of thermal machines - refrigerators, heat pumps and work-producing engines - and I will present the smallest possible ones. I will also discuss the issue of a possible complementarity between size and efficiency and show that even the smallest machines can be maximally efficient. Finally, I will present a new point of view on what work is and what thermal machines actually do.
The talk will focus primarily on recent work with Alexander Wilce in which we show that any locally tomographic composite of a qubit with any finite-dimensional homogeneous self-dual (equivalently Jordan-algebraic) system must be a standard finite-dimensional quantum (i.e. $C^*$-algebraic) system. I may touch on work in progress with collaborators on composites of arbitrary homogeneous self-dual systems.
I will discuss what we know about creating randomness within physics. Although quantum theory prescribes completely random outcomes to particular processes, could it be that within a yet-to-be-discovered post-quantum theory these outcomes are predictable? We have recently shown that this is not possible, using a very natural assumption. In the present talk, I will discuss some recent progress towards relaxing this assumption, providing arguably the strongest evidence yet for truly random processes in our world.
In recent years, a number of observations have highlighted anomalies that might be explained by invoking dark matter annihilation. The excess of high-energy positrons in cosmic rays reported by the PAMELA experiment is only one of the most prominent examples of such anomalies. Models where dark matter annihilates offer an attractive possibility to explain these anomalies.
Much of the recent progress in understanding quantum theory has been achieved within an operational approach. Within this context quantum mechanics is viewed as a theory for making probabilistic predictions for measurement outcomes following specified preparations. However, thus far some of the essential elements of the theory (space, time and causal structure) elude such an operational formulation and are assumed to be fixed.