This series consists of weekly discussion sessions on the foundations of quantum theory and quantum information theory. Each session starts with an informal exposition of an interesting topic, research result, or important question in the field. Everyone is strongly encouraged to participate with questions and comments.
Suppose we are given two probability distributions on some N-element set. How many samples do we need to test whether the two distributions are close to or far from each other in the L_1 norm? This problem, known as Statistical Difference, has been extensively studied in recent years in the field of property testing. I will describe quantum algorithms for the Statistical Difference problem that provide a polynomial speed-up in query complexity over the known classical lower bounds.
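To fix intuitions before the talk, here is a minimal classical sketch of the problem being tested (not the quantum algorithm from the abstract): estimate the L_1 distance between two unknown distributions from samples by comparing their empirical frequencies. The distribution sizes and weights below are illustrative choices, not taken from the talk.

```python
import random
from collections import Counter

def empirical_l1_distance(sample_p, sample_q, domain_size):
    """Plug-in estimate of the L_1 distance between two distributions,
    computed from the empirical frequencies of two sample sets."""
    freq_p = Counter(sample_p)
    freq_q = Counter(sample_q)
    n_p, n_q = len(sample_p), len(sample_q)
    return sum(abs(freq_p[x] / n_p - freq_q[x] / n_q)
               for x in range(domain_size))

random.seed(0)
# p is uniform on {0,1,2,3}; q has weights (1/2, 1/4, 1/8, 1/8),
# so the true L_1 distance is 1/4 + 0 + 1/8 + 1/8 = 0.5.
p = [random.randrange(4) for _ in range(10_000)]
q = [random.choices(range(4), weights=[4, 2, 1, 1])[0] for _ in range(10_000)]
d = empirical_l1_distance(p, q, 4)
```

The naive plug-in estimator needs roughly N samples to be accurate on an N-element domain; the point of the property-testing literature, and of the quantum speed-ups discussed in the talk, is to distinguish "close" from "far" with far fewer queries.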
Recent results have shown that quantum computers can approximate the value of a tensor network efficiently. These results have initiated a search for tensor networks which contract to computationally interesting quantities. Topological Lattice Field Theories (TLFTs) are one source of such networks; when defined appropriately, networks arising from TLFTs contract to give topological invariants. In this elementary talk, we will define and classify TLFTs which lead to invariants of surfaces, and sketch out the corresponding quantum algorithm.
We introduce a two-body quantum Hamiltonian model of spin-1/2 particles on a 2D spatial lattice with exact topological degeneracy in all coupling regimes. There exists a gapped phase in which the low-energy sector reproduces an effective color code model. High-energy excitations fall into three families of anyonic fermions that turn out to be strongly interacting. The model exhibits a Z_2 x Z_2 gauge group symmetry and string-net integrals of motion, which are related to the existence of topological charges that are invisible to moving high-energy fermions.
In recent years, the analysis of Boolean functions has emerged as an important theme in theoretical computer science. In this talk I will discuss an extension of the concept of a Boolean function to quantum computation. It turns out that many important classical results in the theory of Boolean functions have natural quantum analogues. These include property testing of Boolean functions; the Goldreich-Levin algorithm for approximately learning Boolean functions; and a theorem of Friedgut, Kalai and Naor on the Fourier spectra of Boolean functions.
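As background for the classical side of the talk, here is a small sketch of the Fourier spectrum of a Boolean function, the object that the Goldreich-Levin algorithm learns and that the Friedgut-Kalai-Naor theorem constrains. The brute-force computation below is illustrative only; the point of Goldreich-Levin is precisely to avoid this exponential enumeration.

```python
from itertools import product

def fourier_coefficient(f, S, n):
    """Fourier coefficient hat{f}(S) of f : {0,1}^n -> {-1,+1},
    defined as E_x[ f(x) * (-1)^(sum of x_i for i in S) ],
    computed by brute force over all 2^n inputs."""
    total = 0
    for x in product((0, 1), repeat=n):
        chi = (-1) ** sum(x[i] for i in S)  # character chi_S(x)
        total += f(x) * chi
    return total / 2 ** n

# The parity function concentrates all its Fourier weight on the
# top set S = {0, 1, 2}; every other coefficient vanishes.
parity = lambda x: (-1) ** sum(x)
top = fourier_coefficient(parity, (0, 1, 2), 3)
single = fourier_coefficient(parity, (0,), 3)
```

Parseval's identity guarantees that these squared coefficients sum to 1 for any +/-1-valued function, which is what makes "where the Fourier weight sits" a meaningful question in both the classical and quantum settings.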
With the aim of proposing feasible quantum-optical realizations of quantum information protocols and minimizing the resource costs of such implementations, we will discuss various so-called hybrid approaches. These include, for instance, schemes based on both discrete and continuous quantum variables.
Quantum computation by single-qubit measurements was proposed by Raussendorf and Briegel [PRL 86, 5188] as a potential scheme for implementing quantum computers. It also offers an unusual means of describing unitary transformations. To better understand which measurement-based procedures perform unitary operations, we may consider the following problem: under what circumstances can a measurement-based procedure for a unitary U be found, given a similar procedure for U that relies on post-selection?
By storing quantum information in the degenerate ground state of a Hamiltonian, it is hoped that the information can be made quite robust against noise processes. We will examine this situation, with particular emphasis on the toric code in 2D, and show how adversarial effects, either perturbations to the Hamiltonian or interactions with an environment, destroy the stored information extremely quickly.
We analyze how quantum complexity places bounds on the simulation of quantum systems. While methods such as Density Functional Theory (DFT) and the Density Matrix Renormalization Group (DMRG) work very well in practice, essentially nothing is known about their formal requirements. In this talk, we consider these methods from a quantum complexity perspective: first, we discuss DFT, which encapsulates the difficulty of solving the Schroedinger equation in a universal functional, and show that this functional cannot be efficiently computed unless several complexity classes collapse.
We study the possibility of a self-correcting quantum memory based on stabilizer codes with geometrically local stabilizer generators. We prove that the distance of such stabilizer codes in D dimensions is bounded by O(L^{D-1}), where L is the linear size of the D-dimensional lattice. In addition, we prove that in D=1 and D=2 the energy barrier separating different logical states is upper-bounded by a constant independent of L. This shows that in such systems there is no natural energy dissipation mechanism which prevents errors from accumulating.
The rise of quantum information science has been paralleled by the development of a vigorous research program aimed at obtaining an informational characterization, or reconstruction, of the quantum formalism within a broad framework for stochastic theories that encompasses not only quantum and classical theory but also a wide variety of other theories that can serve as foils to them.