This series consists of weekly discussion sessions on the foundations of quantum theory and quantum information theory. Each session starts with an informal exposition of an interesting topic, research result, or important question in the field. Everyone is strongly encouraged to participate with questions and comments.
I will describe some connections between the Eigenstate Thermalization Hypothesis (ETH), the entanglement structure of generic excited eigenstates of chaotic quantum systems ("EE", arXiv:1906.04295), and the "bound on chaos" limiting the growth rate of the out-of-time-order four-point correlator in such systems ("OTOC", arXiv:1906.10808).
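For reference, the "bound on chaos" mentioned above is usually stated as a bound on the Lyapunov exponent extracted from the OTOC; this is the standard Maldacena–Shenker–Stanford form, not a formula quoted from the abstract:

```latex
\lambda_L \;\le\; \frac{2\pi k_B T}{\hbar},
```

where $T$ is the temperature of the chaotic system and $\lambda_L$ governs the exponential growth of the out-of-time-order four-point correlator.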
In quantum error correcting codes, there is a distinction between coherent and incoherent noise. Coherent noise can cause the average infidelity to accumulate quadratically when a fixed channel is applied many times in succession, rather than linearly as in the case of incoherent noise. I will present a proof that unitary single-qubit noise in the 2D toric code with minimum-weight decoding is mapped to less coherent logical noise, and as the code size grows, the coherence of the logical noise channel is suppressed. In the process, I will
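The quadratic-versus-linear contrast above can be seen in a toy calculation. The following sketch (my own illustration, not from the talk) compares the state infidelity of $|+\rangle$ under $n$ repetitions of a small coherent $Z$-rotation against $n$ repetitions of a dephasing channel matched to the same single-step infidelity:

```python
import numpy as np

# Toy comparison of coherent vs. incoherent noise accumulation.
# Coherent: a fixed over-rotation Rz(theta) applied n times composes to Rz(n*theta).
# Incoherent: dephasing with probability p per step shrinks the off-diagonal
# coherence of |+><+| by a factor (1 - 2p) each step.

theta = 0.01                    # over-rotation angle per step (assumed small)
p = np.sin(theta / 2) ** 2      # dephasing probability matching the 1-step infidelity

def coherent_infidelity(n):
    # Fidelity of |+> after n rotations: F = cos^2(n*theta/2),
    # so infidelity = sin^2(n*theta/2) ~ (n*theta/2)^2  -> quadratic in n.
    return np.sin(n * theta / 2) ** 2

def incoherent_infidelity(n):
    # Fidelity after n dephasing steps: F = (1 + (1-2p)^n)/2,
    # so infidelity ~ n*p for small n*p  -> linear in n.
    return (1 - (1 - 2 * p) ** n) / 2

for n in (1, 10, 100):
    print(f"n={n:4d}  coherent={coherent_infidelity(n):.2e}  "
          f"incoherent={incoherent_infidelity(n):.2e}")
```

At $n=1$ the two infidelities agree by construction; by $n=100$ the coherent channel's infidelity has grown roughly quadratically and far exceeds the incoherent one.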
The out-of-time-ordered correlator (OTOC) and entanglement are two physically motivated and widely used probes of the ``scrambling'' of quantum information, which has drawn great interest recently in quantum gravity and many-body physics. By proving upper and lower bounds for OTOC saturation on graphs with bounded degree and a lower bound for entanglement on general graphs, we show that the time scales of scrambling as given by the growth of OTOC and entanglement entropy can be asymptotically separated in a random quantum circuit model defined on graphs with a tight bottleneck.
We study approximate quantum low-density parity-check (QLDPC) codes, which are approximate quantum error-correcting codes specified as the ground space of a frustration-free local Hamiltonian, whose terms do not necessarily commute. Such codes generalize stabilizer QLDPC codes, which are exact quantum error-correcting codes with sparse, low-weight stabilizer generators (i.e. each stabilizer generator acts on a few qubits, and each qubit participates in a few stabilizer generators).
One of the central problems in the study of quantum resource theories is to provide a given resource with an operational meaning, characterizing physical tasks relevant to information processing in which the resource can give an explicit advantage over all resourceless states. We show that this can always be accomplished for all convex resource theories. We establish in particular that any resource state enables an advantage in a channel discrimination task, allowing for a strictly greater success probability than any state without the given resource.
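As a sketch of the kind of quantitative statement involved (my paraphrase of the standard robustness-based result, with notation assumed rather than taken from the abstract): the maximal advantage of a state $\rho$ over all free states in channel discrimination is governed by its generalized robustness $R_F(\rho)$ relative to the free set $F$,

```latex
\max_{\text{discrimination tasks}}\;
\frac{p_{\mathrm{succ}}(\rho)}{\displaystyle\max_{\sigma \in F} p_{\mathrm{succ}}(\sigma)}
\;=\; 1 + R_F(\rho),
```

so any $\rho \notin F$ (where $R_F(\rho) > 0$) strictly outperforms every resourceless state in some task.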
How violently do two quantum operators disagree? Different subfields of physics feature different notions of incompatibility: i) In quantum information theory, uncertainty relations are cast in terms of entropies. These entropic uncertainty relations constrain measurement outcomes. ii) Condensed matter and high-energy physics feature interacting quantum many-body systems, such as spin chains. A local perturbation, such as a Pauli operator on one side of a chain, spreads through many-body entanglement.
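A representative entropic uncertainty relation of the kind referred to in (i) is the Maassen–Uffink bound (a standard statement, quoted here for context rather than from the abstract): for measurements in two orthonormal bases $\{|x\rangle\}$ and $\{|z\rangle\}$,

```latex
H(X) + H(Z) \;\ge\; \log\frac{1}{c},
\qquad
c = \max_{x,z} \,\bigl|\langle x | z \rangle\bigr|^2,
```

where $H$ is the Shannon entropy of the outcome distribution and $c$ quantifies the overlap (incompatibility) of the two bases.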
Optimally encoding classical information in a quantum system is one of the oldest and most fundamental challenges of quantum information theory. Holevo’s bound places a hard upper limit on such encodings, while the Holevo-Schumacher-Westmoreland (HSW) theorem addresses the question of how many classical messages can be “packed” into a given quantum system. In this article, we use Sen’s recent quantum joint typicality results to prove a one-shot multiparty quantum packing lemma generalizing the HSW theorem.
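Holevo's bound, as invoked above, is usually written as follows (the standard form, included for the reader's convenience): if a classical message $X$ is encoded as states $\rho_x$ with probabilities $p_x$, the accessible information is bounded by the Holevo quantity

```latex
I(X{:}B) \;\le\; \chi
\;=\; S\!\Big(\sum_x p_x \rho_x\Big) - \sum_x p_x\, S(\rho_x),
```

where $S$ is the von Neumann entropy; for $n$ qubits this caps the extractable classical information at $n$ bits.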
In this talk, I will discuss some interesting connections between Hamiltonian complexity, error correction, and quantum circuits. First, motivated by the Quantum PCP Conjecture, I will describe a construction of a family of local Hamiltonians where the complexity of ground states — even when subject to large amounts of noise — is superpolynomial (under plausible complexity assumptions). The construction is simple, making use of the well-known Feynman-Kitaev circuit Hamiltonian construction.
In the usual paradigm of quantum error correction, the information to be protected can be encoded in a system of abstract qubits or modes. But how does this work for physical information, which cannot be described in this way? Just as direction information cannot be conveyed using a sequence of words if the parties involved do not share a reference frame, physical quantum information cannot be conveyed using a sequence of qubits or modes without a shared reference frame. Covariant quantum error correction is a procedure for protecting such physical information against noise in such a way that no shared reference frame is required.
The Leggett-Garg (LG) inequalities were introduced, as a temporal parallel of the Bell inequalities, to test macroscopic realism -- the view that a macroscopic system evolving in time possesses definite properties which can be determined without disturbing the future or past state. The talk will begin with a review of the LG framework. Unlike the Bell inequalities, the original LG inequalities are only a necessary condition for macrorealism, and are therefore not a decisive test.
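For orientation, the simplest LG inequality takes the following standard form (quoted for context, not from the abstract): for a dichotomic observable $Q = \pm 1$ measured at times $t_1 < t_2 < t_3$, macrorealism implies

```latex
K \;=\; \langle Q_1 Q_2 \rangle + \langle Q_2 Q_3 \rangle - \langle Q_1 Q_3 \rangle \;\le\; 1,
```

where $\langle Q_i Q_j \rangle$ are two-time correlators; quantum systems can violate this bound, paralleling the violation of Bell inequalities by spatially separated measurements.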