This series consists of weekly discussion sessions on foundations of quantum theory and quantum information theory. The sessions start with an informal exposition of an interesting topic, research result or important question in the field. Everyone is strongly encouraged to participate with questions and comments.
I will report on efforts to implement a new method for simulating concatenated quantum error correction, in which many levels of concatenation are simulated together explicitly. That is, the approach involves a Monte Carlo simulation of a noisy circuit on many thousands of qubits, rather than the tens of qubits that were previously feasible. The new approach allows the threshold and resource usage of concatenated quantum error correction to be determined more accurately than before.
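The talk's full circuit-level simulations are far beyond a short sketch, but the underlying Monte Carlo idea can be illustrated with a toy model: estimating the logical error rate of a concatenated three-qubit repetition code under independent bit flips. Everything here (function names, the repetition-code model, perfect recovery at each level) is an illustrative assumption, not the speaker's actual method.

```python
import random

def block_flips(p, level):
    """True if a level-`level` concatenated block suffers a logical bit flip.

    Level 0 is a single physical qubit that flips with probability p; each
    higher level majority-votes three sub-blocks (idealized perfect decoding).
    """
    if level == 0:
        return random.random() < p
    flips = sum(block_flips(p, level - 1) for _ in range(3))
    return flips >= 2  # majority vote fails when 2 or 3 sub-blocks flip

def logical_error_rate(p, level, trials=100_000):
    """Monte Carlo estimate of the logical error rate at a given level."""
    return sum(block_flips(p, level) for _ in range(trials)) / trials
```

Below the toy threshold (p = 1/2 for this model), the estimated logical rate shrinks rapidly with each added level of concatenation, which is the qualitative behaviour the real simulations quantify.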
One of the quintessential features of quantum information is its exclusivity, the inability of strong quantum correlations to be shared by many physical systems. Complementarity, likewise, stands in quantum mechanics as the sine qua non of quantum phenomena. We show that this is no coincidence, and that the central role of exclusivity in quantum information theory stems from the phenomenon of complementarity.
We discuss two methods to encode one qubit into six physical qubits. Each of our two examples corrects an arbitrary single-qubit error. Our first example is a degenerate six-qubit quantum error-correcting code. We explicitly provide the stabilizer generators, encoding circuits, codewords, logical Pauli operators, and logical CNOT operator for this code. We also show how to convert this code into a non-trivial subsystem code that saturates the subsystem Singleton bound.
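A minimal sketch of the kind of check involved: a nondegenerate stabilizer code corrects an arbitrary single-qubit error exactly when every such error produces a distinct, nonzero syndrome. The six-qubit generators from the talk are not reproduced here, so the well-known [[5,1,3]] five-qubit code is used purely as an illustration; a degenerate code, like the six-qubit example, relaxes distinctness (errors may share a syndrome when their product lies in the stabilizer).

```python
# Cyclic stabilizer generators of the standard [[5,1,3]] five-qubit code.
GENERATORS = ["XZZXI", "IXZZX", "XIXZZ", "ZXIXZ"]

def anticommutes(a, b):
    # Two single-qubit Paulis anticommute iff both are non-identity and differ.
    return a != "I" and b != "I" and a != b

def syndrome(error):
    # Bit b is 1 iff the error anticommutes with generator b overall.
    return tuple(sum(anticommutes(e, g) for e, g in zip(error, gen)) % 2
                 for gen in GENERATORS)

# All 15 single-qubit Pauli errors on 5 qubits.
single_qubit_errors = ["I" * i + p + "I" * (4 - i)
                       for i in range(5) for p in "XYZ"]
syndromes = [syndrome(e) for e in single_qubit_errors]
```

For this perfect code the 15 errors exhaust the 15 nontrivial syndromes, so every single-qubit error is identified and correctable.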
In an effort to better understand the class of operations on a bipartite system which preserve positivity of partial transpose (PPT operations), we have investigated the (non-asymptotic) transformation of pure states to pure states by operations in this class. Under local operations and classical communication (LOCC), Nielsen's majorization criterion provides a necessary and sufficient condition for such a transformation.
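Nielsen's criterion is easy to test numerically: |ψ⟩ can be converted to |φ⟩ by LOCC iff the vector of squared Schmidt coefficients of |ψ⟩ is majorized by that of |φ⟩. A minimal sketch (the example vectors are illustrative, not from the talk):

```python
import numpy as np

def majorizes(y, x):
    """True if x is majorized by y: every partial sum of y, sorted in
    decreasing order, dominates the corresponding partial sum of x."""
    xs = np.cumsum(np.sort(x)[::-1])
    ys = np.cumsum(np.sort(y)[::-1])
    return bool(np.all(ys >= xs - 1e-12))

# Squared Schmidt coefficients of two hypothetical bipartite pure states.
psi = [0.50, 0.25, 0.25]   # more entangled (flatter spectrum)
phi = [0.70, 0.20, 0.10]   # less entangled
```

Here `majorizes(phi, psi)` holds while `majorizes(psi, phi)` does not, so LOCC can take |ψ⟩ to |φ⟩ but not back, matching the intuition that LOCC cannot increase entanglement.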
What do a fractional quantum Hall liquid and Kitaev's proposals for topological quantum computation have in common? It turns out that they are physical systems that exhibit degenerate ground states with properties seemingly different from ordinary (Landau-type) vacua, such as the ground states of a Heisenberg magnet. For example, those (topologically quantum ordered) states cannot be characterized by (local) order parameters such as magnetization. How does one characterize this new order?
Wigner-Dirac relativistic quantum theory is applied to decay laws of an unstable particle in different reference frames. It is shown that decay slows down from the point of view of the moving observer, as expected. However, small deviations from Einstein's time dilation formula are also found. The origin of these deviations is discussed, as well as possibilities for their experimental detection.
The Hamiltonian of traditionally adopted detector models features off-diagonal elements between the vacuum and the one-particle states of the field to be detected. We argue that reasonably good detectors, when written in terms of fundamental fields, have a more trivial response on the vacuum. In particular, the model configuration "detector in its ground state + vacuum of the field" generally corresponds to a stable bound state of the underlying theory (e.g.
The talk concerns a generalization of the concept of a minimum uncertainty state to the finite-dimensional case. Instead of considering the product of the variances of two complementary observables, we consider an uncertainty relation involving the quadratic Rényi entropies summed over a full set of mutually unbiased bases (MUBs).
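For orientation, one standard form of this relation (a commonly cited bound, not necessarily the exact statement of the talk): for a state $\rho$ in dimension $d$ admitting a full set of $d+1$ MUBs, with outcome distributions $p_{b,k}$ in basis $b$, the identity $\sum_{b=1}^{d+1}\sum_{k} p_{b,k}^{2} = \operatorname{tr}(\rho^{2}) + 1$ yields, via convexity of $-\log$,
$$\sum_{b=1}^{d+1} H_2(b) \;\ge\; (d+1)\,\log_2\!\frac{d+1}{\operatorname{tr}(\rho^2)+1} \;\ge\; (d+1)\,\log_2\!\frac{d+1}{2},$$
where $H_2(b) = -\log_2 \sum_k p_{b,k}^2$ is the quadratic (collision) Rényi entropy. Minimum uncertainty states are then those saturating such a bound.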
Coin flipping by telephone (Blum '81) is one of the most basic cryptographic tasks of two-party secure computation. In a quantum setting, it is possible to realize (weak) coin flipping with information theoretic security. Quantum coin flipping has been a longstanding open problem, and its solution uses an innovative formalism developed by Alexei Kitaev for mapping quantum games into convex optimization problems.
Decoherence attempts to explain the emergent classical behaviour of a quantum system interacting with its quantum environment. In order to formalize this mechanism we introduce the idea that the information preserved in an open quantum evolution (or channel) can be characterized in terms of observables of the initial system. We use this approach to show that information which is broadcast into many parts of the environment can be encoded in a single observable. This