This series covers all areas of research at Perimeter Institute, as well as those outside of PI's scope.
Alan Turing was one of the great 20th-century
mathematicians, and a pioneer of computer science. However, he may best be
remembered as one of the leading code breakers of Bletchley Park during World
War II. It was Turing's brilliant insights and mathematical mind that helped to
break Enigma, the apparently unbreakable code used by the German military. We
present a history of both Alan Turing and the Enigma, leading up to this
fascinating battle of man against machine - including a full demonstration of
Superfluidity and superconductivity are two remarkable
phenomena in which, at low temperatures, materials abruptly gain the ability to
flow without friction. Microscopic quantum theories of these phases of matter
were constructed in blockbuster papers of Lev Landau (1940) and John Bardeen,
Leon Cooper, and J. Robert Schrieffer
(1957). The actual explanation of
the flow, however, is rooted in an
Einstein paper of 1924 that introduces a condensate, a quantum configuration describing a finite
Gamma-ray bursts are
the most luminous and energetic explosions known in the universe. They
appear in two varieties: long- and short-duration. Long GRBs
result from the core collapse of massive stars, but until recently the origin
of short GRBs was shrouded in mystery. In this talk I will present
several lines of evidence that point to the merger of compact object binaries
(NS-NS and/or NS-BH) as the progenitor systems of short GRBs. Within this
Despite being one of the most abundant constituents of the Universe, and despite more than half a century of study, some of the most fundamental properties of the neutrino have only recently been uncovered, and others remain unresolved. I will discuss important developments in the phenomenon of neutrino oscillations, a transmutation process that allows neutrinos to change among three types as they propagate.
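The oscillation phenomenon described above is often summarized, in the simplified two-flavour case, by the standard vacuum oscillation probability (a textbook formula, given here for orientation rather than taken from the talk):

```latex
P(\nu_\alpha \to \nu_\beta) = \sin^2 2\theta \,\sin^2\!\left(\frac{\Delta m^2 L}{4E}\right)
\approx \sin^2 2\theta \,\sin^2\!\left(1.27\,\frac{\Delta m^2[\mathrm{eV}^2]\; L[\mathrm{km}]}{E[\mathrm{GeV}]}\right)
```

where \(\theta\) is the mixing angle, \(\Delta m^2\) the squared-mass splitting, \(L\) the baseline, and \(E\) the neutrino energy; the full three-flavour case uses the PMNS mixing matrix.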
"A well constructed theory is in some respects undoubtedly an artistic production." - Sir Ernest Rutherford
"Design is the synthesis of form and content." - Paul Rand
On the surface, the scientific method (primarily analytic) and design methodologies (primarily synthetic) seem to be quite different processes, but there is considerable overlap, and communicating science involves a blend of both. Scientists tend to use a scientific approach when
In the past few years, the optical cooling and manipulation of macroscopic objects, such as micro-mirrors and cantilevers, has developed into an active field of research. In mechanical systems, the oscillator is attached to its suspension, a thermal contact that limits how well its motion can be isolated. On the other hand, when these small objects are levitated using the radiation-pressure force of lasers, the excellent thermal isolation, even at room temperature, helps produce very sensitive force detectors, and eventually quantum transducers for quantum computation purposes.
When a large number of quantum mechanical particles are put together and allowed to interact, various condensed matter phases emerge with macroscopic quantum properties. While conventional quantum phases like superfluids or quantum magnets can be understood as a simple collection of
Shor's algorithm can be a meaningful test for experimental quantum processing systems, when suitably realized. I present results from a recent implementation of quantum factoring using trapped ion qubits, demonstrating feed-forward control, use of quantum memory during computation, and cascaded three-qubit gates. Such capabilities are necessary ingredients for a future large-scale,
fault-tolerant quantum computing system.
The direct detection of gravitational waves promises to open a new spectrum, one largely inaccessible to electromagnetic astronomical observations. Detecting gravitational waves from binary black holes and neutron stars, and estimating their parameters, requires a sufficiently accurate prediction of the expected waveform signal.
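As a minimal illustration of what "predicting the waveform" involves, the leading-order (Newtonian quadrupole) approximation gives the chirp of the gravitational-wave frequency, df/dt = (96/5) π^(8/3) (G M_c/c^3)^(5/3) f^(11/3), governed by the chirp mass M_c. The sketch below integrates this with a crude Euler step for an equal-mass binary neutron star; the masses, band, and step size are illustrative, not a production waveform model:

```python
import math

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8       # speed of light, m/s
MSUN = 1.989e30   # solar mass, kg

def chirp_mass(m1, m2):
    """Chirp mass M_c = (m1*m2)^(3/5) / (m1+m2)^(1/5)."""
    return (m1 * m2) ** 0.6 / (m1 + m2) ** 0.2

def fdot(f, mc):
    """Leading-order (quadrupole-formula) GW frequency evolution df/dt."""
    return (96.0 / 5.0) * math.pi ** (8.0 / 3.0) \
        * (G * mc / c ** 3) ** (5.0 / 3.0) * f ** (11.0 / 3.0)

# Evolve a 1.4 + 1.4 Msun binary from 30 Hz (roughly where ground-based
# detectors become sensitive) up to 300 Hz with simple Euler steps.
mc = chirp_mass(1.4 * MSUN, 1.4 * MSUN)
f, dt, track = 30.0, 0.01, []
while f < 300.0:
    track.append(f)
    f += fdot(f, mc) * dt
```

The frequency accelerates as the binary tightens, which is the characteristic "chirp"; accurate parameter estimation requires much higher-order post-Newtonian and numerical-relativity corrections to this leading behaviour.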
While quantum measurement remains the central philosophical conundrum of quantum mechanics, it has recently grown into a respectable (read: experimental!) discipline as well. New perspectives on measurement have grown out of new technological possibilities, but also out of
attempts to design systems for quantum information processing, which promise to be exponentially more powerful than any possible classical computer. I will try to give a flavour of some of these perspectives, focussing largely on a particular paradigm known as "weak measurement."
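The "weak measurement" paradigm mentioned above is usually characterized by the weak value of an observable \(\hat{A}\), the standard Aharonov-Albert-Vaidman expression for a system preselected in \(|\psi\rangle\) and postselected in \(|\phi\rangle\):

```latex
A_w = \frac{\langle \phi | \hat{A} | \psi \rangle}{\langle \phi | \psi \rangle}
```

Notably, \(A_w\) can lie far outside the eigenvalue spectrum of \(\hat{A}\) when the postselection overlap \(\langle \phi | \psi \rangle\) is small, which is what makes weak measurements both conceptually provocative and experimentally useful for amplification.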