Since 2002, Perimeter Institute has been recording seminars, conference talks, and public outreach events using video cameras installed in our lecture theatres. Perimeter now has 7 formal presentation spaces for its many scientific conferences, seminars, workshops, and educational outreach activities, all with advanced audio-visual capabilities. Recordings of events in these areas are all available on demand from this Video Library and from the Perimeter Institute Recorded Seminar Archive (PIRSA). PIRSA is a permanent, free, searchable, and citable archive of recorded seminars from relevant bodies in physics, modelled in part after Cornell University's arXiv.org.
A fundamental question in trying to understand the world -- be it classical or quantum -- is why things happen. We seek a causal account of events, and merely noting correlations between them does not provide a satisfactory answer. Classical statistics offers a better alternative: the framework of causal models has proved itself a powerful tool for studying causal relations across a range of disciplines. We aim to adapt this formalism to allow for quantum variables and, in the process, discover a new perspective on how causality differs in the quantum world.
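As a toy illustration of the classical formalism this talk builds on, the sketch below (with made-up probability tables; all numbers are hypothetical) contrasts the observational distribution P(Y|X) with the interventional distribution P(Y|do(X)) in a three-variable causal model Z → X, Z → Y, X → Y, using the standard back-door adjustment. The gap between the two quantities is exactly the sense in which correlation does not answer "why":

```python
# Toy classical causal model over binary variables (hypothetical numbers):
#   Z (confounder) -> X,  Z -> Y,  X -> Y
P_Z = {0: 0.5, 1: 0.5}                      # P(Z=z)
P_X1_given_Z = {0: 0.1, 1: 0.9}             # P(X=1 | Z=z)
P_Y1_given_XZ = {(0, 0): 0.1, (0, 1): 0.4,  # P(Y=1 | X=x, Z=z)
                 (1, 0): 0.6, (1, 1): 0.9}

def p_x_given_z(x, z):
    return P_X1_given_Z[z] if x == 1 else 1.0 - P_X1_given_Z[z]

def p_y1_observed(x):
    """P(Y=1 | X=x): conditioning on X also tells us about Z."""
    num = sum(P_Y1_given_XZ[(x, z)] * p_x_given_z(x, z) * P_Z[z] for z in (0, 1))
    den = sum(p_x_given_z(x, z) * P_Z[z] for z in (0, 1))
    return num / den

def p_y1_do(x):
    """P(Y=1 | do(X=x)): the Z -> X arrow is cut (back-door adjustment)."""
    return sum(P_Y1_given_XZ[(x, z)] * P_Z[z] for z in (0, 1))

# The two differ, so the raw X-Y correlation overstates the causal effect:
# p_y1_observed(1) = 0.87, while p_y1_do(1) = 0.75.
```

The quantum generalization pursued in the talk asks what replaces such conditional probability tables and interventions when the nodes are quantum systems.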
Spatially coupled LDPC codes were introduced by Felström and Zigangirov in 1999. They can be viewed in the following way: take several instances of a certain LDPC code family, arrange them in a row, and then mix the edges of the codes randomly among neighboring layers. Finally, fix the bits of the first and last layers to zero. It was soon found that iterative decoding performs much better for these codes than for the original LDPC code.
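The coupling step described above can be sketched in a few lines. The following is an illustrative toy, not the Felström-Zigangirov ensemble itself: each edge of a base Tanner graph is copied into every one of L layers and randomly smeared over one of w neighboring check layers. Real constructions constrain the edge spreading to keep the coupled graph regular, and the boundary bits would additionally be fixed to zero as described:

```python
import numpy as np

rng = np.random.default_rng(0)

def couple(H, L, w):
    """Spatially couple a base parity-check matrix H over L layers (toy sketch).

    Each edge (check i, bit j) of the base Tanner graph is copied into every
    layer, and its check endpoint is shifted to one of the w neighboring
    check layers at random.  Check layers run from 0 to L+w-2, so the
    boundary layers see fewer edges -- the structural origin of the improved
    iterative-decoding behavior.
    """
    m, n = H.shape
    Hc = np.zeros(((L + w - 1) * m, L * n), dtype=int)
    for layer in range(L):
        for i, j in zip(*np.nonzero(H)):
            shift = rng.integers(w)  # which neighboring check layer gets this edge
            Hc[(layer + shift) * m + i, layer * n + j] = 1
    return Hc

# Example: couple a tiny 2x4 base code over L=5 layers with coupling width w=2,
# giving a (5+2-1)*2 x 5*4 = 12 x 20 coupled parity-check matrix.
H = np.array([[1, 1, 1, 0],
              [0, 1, 1, 1]])
Hc = couple(H, L=5, w=2)
```

Since edges are only moved between check layers, every bit keeps its degree from the base code, while checks near the boundary are underloaded; iterative decoding exploits this by propagating reliable decisions inward from the terminated ends.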
If one's goal is large-scale quantum computation, ultimately one wishes to minimize the amount of time, number of qubits, and qubit connectivity required to outperform a classical system, all while assuming some physically reasonable gate error rate. We present two examples of such an overhead study, focusing on the surface code with and without long-range interactions.
All quantum LDPC codes known to date suffer from poor distance scaling, limited by the square root of the code length. This is in sharp contrast with the classical case, where good LDPC codes are known that combine a constant encoding rate with linear distance. In this talk I will describe the first family of good quantum "almost LDPC" codes. The new codes have a constant encoding rate, linear distance, and stabilizers acting on at most the square root of n qubits, where n is the code length.
I will describe a new class of topological quantum error correcting codes with surprising features. The construction is based on color codes: it preserves their unusual transversality properties but removes important drawbacks. In 3D, the new codes allow an effectively transversal implementation of a universal set of gates by gauge fixing, while error-detecting measurements involve only 4 or 6 qubits. Furthermore, they do not require multiple rounds of error detection to achieve fault-tolerance.
We study a class of 4d N=1 SCFTs obtained from partial compactifications of the 6d N=(2,0) theory on a Riemann surface with punctures. We identify theories corresponding to curves with general types of punctures through nilpotent Higgsing and Seiberg dualities. The `quiver tails' of N=1 class S theories turn out to differ significantly from their N=2 counterparts and have interesting properties. Various dual descriptions of such a theory can be found by using colored pair-of-pants decompositions. In particular, we find an N=1 analog of Argyres-Seiberg duality for SQCD with various gauge groups.
After the 7 and 8 TeV LHC runs, we have no conclusive evidence of physics beyond the Standard Model. This leads us to suspect that even if new physics is discovered during run II, the number of signal events may be limited, making it crucial to optimize measurements for the case of low statistics. I will argue that phase-space correlations between subsequent on-shell decays in a cascade contain additional information compared to commonly used kinematic variables, and that this can be used to significantly improve the precision and accuracy of mass measurements.
This talk is divided into two parts. In the first part, I discuss a scheme of fault-tolerant quantum computation for a web-like physical architecture of a quantum computer. Small logical units of a few qubits (realized in ion traps, for example) are linked via a photonic interconnect which provides probabilistic heralded Bell pairs [1]. Two time scales compete in this system, namely the characteristic decoherence time T_D and the typical time T_E it takes to provide a Bell pair.