This series covers all areas of research at Perimeter Institute, as well as topics outside of PI's scope.
There are two notions that play a central role in the mathematical theory of computation. One is that of a computable problem, i.e., a problem that can, in principle, be solved by an (idealized) computer. It is known that there exist problems that 'have answers', but for which those answers are not computable. The other is that of the difficulty of a computation, i.e., the number of (idealized) steps actually required to carry out that computation.
At a very basic level, physics is about what we can say about propositions like 'A has a value in S' (or 'A is in S' for short), where A is some physical quantity like the energy, position, or momentum of a physical system, and S is some subset of the real line. In classical physics, given a state of the system, every proposition of the form 'A is in S' is either true or false, and thus classical physics is realist in the sense that there is a 'way things are'. In contrast to that, quantum theory only delivers a probability of 'A is in S' being true.
It is common to assert that the discovery of quantum theory overthrew our classical conception of nature. But what, precisely, was overthrown? Providing a rigorous answer to this question is of practical concern, as it helps to identify quantum technologies that outperform their classical counterparts, and of significance for modern physics, where progress may be slowed by poor physical intuitions and where the ability to apply quantum theory in a new realm or to move beyond quantum theory necessitates a deep understanding of the principles upon which it is based.
The history of human knowledge is often highlighted by our efforts to explore beyond our apparent horizon. In this talk, I will describe how this challenge has now evolved into our quest to understand the physics at/beyond the cosmological horizon, some twenty orders of magnitude above Columbus' original goal.
A convergence of climate, resource, technological, and economic stresses gravely threatens the future of humankind. Scientists have a special role in humankind's response, because only rigorous science can help us understand the complexities and potential consequences of these stresses. Diminishing the threat they pose will require profound social, institutional, and technological changes -- changes that will be opposed by powerful status-quo special interests.
The Achilles' heel of quantum information processors is the fragility of quantum states and processes. Without a method to control the imperfection and imprecision of quantum devices, the probability that a quantum computation succeeds will decrease exponentially in the number of gates it requires. In the last ten years, building on the discovery of quantum error correction, accuracy threshold theorems were proved showing that errors can be controlled using a reasonable amount of resources as long as the error rate is smaller than a certain threshold.
The basic problem of much of condensed matter and high energy physics, as well as quantum chemistry, is to find the ground state properties of some Hamiltonian. Many algorithms have been invented to deal with this problem, each with different strengths and limitations. Ideas such as entanglement entropy from quantum information theory and quantum computing enable us to understand the difficulty of various problems.
As physicists, we have become accustomed to the idea that a theory's content is always most transparent when written in coordinate-free language. But sometimes the choice of a good coordinate system is very useful for settling deep conceptual issues. Think of how Eddington-Finkelstein coordinates settled the longstanding question of whether the event horizon of a Schwarzschild black hole corresponds to a real spacetime singularity or not.
After almost a century of observations, the ultra-high energy sky has finally displayed an anisotropic distribution. A significant correlation between the arrival directions of ultra-high energy cosmic rays measured by the Pierre Auger Observatory and the distribution of nearby active galactic nuclei signals the dawn of particle astronomy. These historic results have important implications for both astrophysics and particle physics.
The Cosmic Microwave Background (CMB) consists of a bath of photons
emitted when the universe was 380,000 years old. Carrying the imprint
of primordial fluctuations that seeded the formation of structure in
the universe, the CMB is one of the most valuable known tools for
studying the early universe. In our modern, post-WMAP era, the utility
of studying temperature anisotropies in the CMB is clear and much of
that work has been done. I will describe two exciting new directions in
©2012 Institut Périmètre de Physique Théorique