This series consists of weekly discussion sessions on the foundations of quantum theory and quantum information theory. The sessions start with an informal exposition of an interesting topic, research result, or important question in the field. Everyone is strongly encouraged to participate with questions and comments.
We study the possibility of a self-correcting quantum memory based on stabilizer codes with geometrically local stabilizer generators. We prove that the distance of such stabilizer codes in D dimensions is bounded by O(L^{D-1}) where L is the linear size of the D-dimensional lattice. In addition, we prove that in D=1 and D=2, the energy barrier separating different logical states is upper-bounded by a constant independent of L. This shows that in such systems there is no natural energy dissipation mechanism which prevents errors from accumulating.
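For concreteness, the bound and a standard example saturating it (the toric code, which is not discussed in the abstract itself) can be stated as follows:

```latex
% Distance bound for local stabilizer codes on a D-dimensional lattice
% of linear size L:
d \le c\, L^{D-1} \quad \text{for some constant } c.
% Example: the 2D toric code on an L \times L torus has parameters
[[\,n = 2L^2,\; k = 2,\; d = L\,]],
% so its distance d = L = L^{D-1} (with D = 2) saturates the bound
% up to a constant.
```

The D=2 statement about a constant energy barrier is consistent with this example: a logical error in the toric code can be built up by moving a single pair of anyons across the lattice at constant energy cost.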
The rise of quantum information science has been paralleled by the development of a vigorous research program aimed at obtaining an informational characterization or reconstruction of the quantum formalism, in a broad framework for stochastic theories that encompasses quantum and classical theory, but also a wide variety of other theories that can serve as foils to them.
I would like to provide a short, possibly elementary, introduction to the problem of computing string amplitudes at higher genus for superstrings. Essentially, I will recall the mathematical problem in defining the path integral measure (which has a well-defined algebraic-geometry realization for bosonic strings) and the solution proposed by D'Hoker and Phong for the genus-2 case. Their main results are the chirally split form of the measure and its explicit expression in genus two.
In this talk I will give an introduction to the simulation of quantum many-body systems using the so-called tensor networks. After a brief historical review, I will introduce the basics of tensor network representations of quantum states, and will explain some recent developments. In particular, in the last part of my talk I will focus on recent results obtained in the simulation of 2-dimensional quantum lattice systems of infinite size.
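As a minimal sketch of the basic idea (not taken from the talk itself), the simplest tensor network, a matrix product state (MPS), can be obtained from any state vector by repeated singular value decomposition; with no truncation of singular values the decomposition is exact:

```python
import numpy as np

# Illustrative sketch: decompose a random 4-qubit state into an exact
# (untruncated) matrix product state by repeated SVD, then contract the
# tensors back together and check that the original state is recovered.
rng = np.random.default_rng(0)
n = 4
psi = rng.normal(size=2**n) + 1j * rng.normal(size=2**n)
psi /= np.linalg.norm(psi)

tensors = []
rest = psi.reshape(1, -1)               # shape (bond, remaining)
for site in range(n - 1):
    bond = rest.shape[0]
    m = rest.reshape(bond * 2, -1)      # split off one physical index
    u, s, vh = np.linalg.svd(m, full_matrices=False)
    tensors.append(u.reshape(bond, 2, -1))   # MPS tensor for this site
    rest = np.diag(s) @ vh                   # carry the remainder along
tensors.append(rest.reshape(rest.shape[0], 2, 1))

# Contract the MPS back into a full state vector.
state = tensors[0]
for A in tensors[1:]:
    state = np.tensordot(state, A, axes=([-1], [0]))
state = state.reshape(-1)
print(np.allclose(state, psi))
```

In practical simulations the small singular values are truncated, which bounds the bond dimension and makes the representation efficient for states obeying an area law.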
A quantum channel models a physical process in which noise is added to a quantum system via interaction with its environment. Protecting quantum systems from such noise can be viewed as an extension of the classical communication problem introduced by Shannon sixty years ago. A fundamental quantity of interest is the quantum capacity of a given channel, which measures the amount of quantum information which can be protected, in the limit of many transmissions over the channel.
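The abstract does not display the capacity formula; for reference, the standard (Lloyd–Shor–Devetak) expression characterizes the quantum capacity as a regularized optimization of the coherent information:

```latex
% Coherent information of an input state \rho through a channel \mathcal{N}
% with complementary channel \mathcal{N}^{c}:
I_{c}(\rho, \mathcal{N}) \;=\; S\!\bigl(\mathcal{N}(\rho)\bigr) \;-\; S\!\bigl(\mathcal{N}^{c}(\rho)\bigr),
% Quantum capacity (regularized over many channel uses):
Q(\mathcal{N}) \;=\; \lim_{n \to \infty} \frac{1}{n}\, \max_{\rho}\, I_{c}\!\bigl(\rho, \mathcal{N}^{\otimes n}\bigr).
```

The regularization over n channel uses is what makes the quantum capacity hard to compute in general: the coherent information can be strictly superadditive.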
I will report on efforts to implement a new method for simulating concatenated quantum error correction, where many levels of concatenation are simulated together explicitly. That is, the approach involves a Monte Carlo simulation of a noisy circuit involving many thousands of qubits, rather than the tens of qubits simulated previously. The new approach allows the threshold and resource usage of concatenated quantum error correction to be determined more accurately than before.
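The flavor of such Monte Carlo estimates can be illustrated on a toy case far simpler than the concatenated simulations described above: sampling the logical failure rate of a 3-qubit repetition code under independent bit flips, where majority-vote decoding fails whenever two or more qubits flip.

```python
import random

# Toy illustration only (not the method of the talk): Monte Carlo
# estimate of the logical error rate of a 3-qubit repetition code
# under independent bit-flip noise at physical rate p.
def logical_failure_rate(p, trials=200_000, seed=1):
    random.seed(seed)
    failures = 0
    for _ in range(trials):
        flips = sum(random.random() < p for _ in range(3))
        if flips >= 2:          # majority-vote decoding fails
            failures += 1
    return failures / trials

p = 0.1
estimate = logical_failure_rate(p)
exact = 3 * p**2 - 2 * p**3     # closed form, for comparison
print(f"MC estimate {estimate:.4f}, exact {exact:.4f}")
```

For concatenated codes with thousands of qubits no such closed form exists, which is why accurate threshold estimates require the large-scale sampling described in the talk.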
One of the quintessential features of quantum information is its exclusivity, the inability of strong quantum correlations to be shared by many physical systems. Likewise, complementarity has a similar status in quantum mechanics as the sine qua non of quantum phenomena. We show that this is no coincidence, and that the central role of exclusivity in quantum information theory stems from the phenomenon of complementarity.
We discuss two methods to encode one qubit into six physical qubits. Each of our two examples corrects an arbitrary single-qubit error. Our first example is a degenerate six-qubit quantum error-correcting code. We explicitly provide the stabilizer generators, encoding circuits, codewords, logical Pauli operators, and logical CNOT operator for this code. We also show how to convert this code into a non-trivial subsystem code that saturates the subsystem Singleton bound.
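Assuming the standard form of the subsystem Singleton bound (the specific code parameters below are the natural reading of the abstract, not stated in it), saturation works out as follows:

```latex
% Subsystem Singleton bound for an [[n, k, r, d]] subsystem code
% (k logical qubits, r gauge qubits):
n - k - r \;\ge\; 2(d - 1).
% A six-qubit subsystem code with k = 1, r = 1, d = 3 saturates it:
6 - 1 - 1 \;=\; 4 \;=\; 2(3 - 1).
```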
In an effort to better understand the class of operations on a bipartite system which preserve positivity of partial transpose (PPT operations), we have investigated the (non-asymptotic) transformation of pure states to pure states by operations in this class. Under local operations and classical communication (LOCC) Nielsen's majorization criterion provides a necessary and sufficient condition for such a transformation.
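Nielsen's criterion is easy to check numerically. In its standard form (assumed here, not restated in the abstract), a pure state can be converted to another by LOCC if and only if its vector of squared Schmidt coefficients is majorized by that of the target:

```python
# Sketch of Nielsen's LOCC criterion: |psi> -> |phi> is possible iff
# the squared Schmidt coefficients of |psi| are majorized by those of
# |phi| (every partial sum of the target, sorted in decreasing order,
# dominates the corresponding partial sum of the source).
def majorized_by(x, y, tol=1e-12):
    """True if x is majorized by y (x ≺ y), for equal-length vectors."""
    xs = sorted(x, reverse=True)
    ys = sorted(y, reverse=True)
    sx = sy = 0.0
    for a, b in zip(xs, ys):
        sx += a
        sy += b
        if sy < sx - tol:
            return False
    return True

# A maximally entangled state (0.5, 0.5) can be converted by LOCC into
# any less entangled state such as (0.8, 0.2), but not the reverse.
print(majorized_by([0.5, 0.5], [0.8, 0.2]))  # True
print(majorized_by([0.8, 0.2], [0.5, 0.5]))  # False
```

The question the abstract raises is what replaces this criterion when LOCC is relaxed to the strictly larger class of PPT operations.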
What do fractional quantum Hall liquids and Kitaev's proposals for topological quantum computation have in common? It turns out that they are physical systems that exhibit degenerate ground states with properties seemingly different from ordinary (Landau-type) vacua, such as the ground states of a Heisenberg magnet. For example, those (topologically quantum ordered) states cannot be characterized by (local) order parameters such as magnetization. How does one characterize this new order?