This series consists of weekly discussion sessions on foundations of quantum theory and quantum information theory. The sessions start with an informal exposition of an interesting topic, research result or important question in the field. Everyone is strongly encouraged to participate with questions and comments.
How violently do two quantum operators disagree? Different subfields of physics feature different notions of incompatibility: i) In quantum information theory, uncertainty relations are cast in terms of entropies. These entropic uncertainty relations constrain measurement outcomes. ii) Condensed matter and high-energy physics feature interacting quantum many-body systems, such as spin chains. A local perturbation, such as a Pauli operator on one side of a chain, spreads through many-body entanglement.
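For reference, a standard example of an entropic uncertainty relation (a well-known result, not specific to this talk) is the Maassen-Uffink bound for measurements in two orthonormal bases $\{|x_j\rangle\}$ and $\{|z_k\rangle\}$:

```latex
H(X) + H(Z) \;\ge\; \log \frac{1}{c},
\qquad
c = \max_{j,k} \, |\langle x_j | z_k \rangle|^2 ,
```

where $H(X)$ and $H(Z)$ are the Shannon entropies of the two outcome distributions and $c$ measures the overlap (incompatibility) of the bases.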
Optimally encoding classical information in a quantum system is one of the oldest and most fundamental challenges of quantum information theory. Holevo’s bound places a hard upper limit on such encodings, while the Holevo-Schumacher-Westmoreland (HSW) theorem addresses the question of how many classical messages can be “packed” into a given quantum system. In this article, we use Sen’s recent quantum joint typicality results to prove a one-shot multiparty quantum packing lemma generalizing the HSW theorem.
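As background, Holevo's bound (a standard result the abstract refers to) limits the information accessible from an ensemble $\{p_x, \rho_x\}$ by the Holevo quantity:

```latex
I(X : Y) \;\le\; \chi
\;=\; S\!\Big(\sum_x p_x \rho_x\Big) \;-\; \sum_x p_x \, S(\rho_x),
```

where $S$ is the von Neumann entropy and $I(X:Y)$ is the mutual information between the encoded message $X$ and the outcome $Y$ of any measurement on the system.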
In this talk, I will discuss some interesting connections between Hamiltonian complexity, error correction, and quantum circuits. First, motivated by the Quantum PCP Conjecture, I will describe a construction of a family of local Hamiltonians where the complexity of ground states — even when subject to large amounts of noise — is superpolynomial (under plausible complexity assumptions). The construction is simple, making use of the well-known Feynman-Kitaev circuit Hamiltonian construction.
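For context, the Feynman-Kitaev construction mentioned here encodes the run of a circuit $U_T \cdots U_1$ on input $|\psi_0\rangle$ into a "history state" (standard form, not specific to this construction's noisy variant):

```latex
|\psi\rangle \;=\; \frac{1}{\sqrt{T+1}} \sum_{t=0}^{T} |t\rangle \otimes U_t \cdots U_1 |\psi_0\rangle ,
```

which is the ground state of a local Hamiltonian whose terms check that consecutive time steps $t$ and $t+1$ are related by the correct gate.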
In the usual paradigm of quantum error correction, the information to be protected can be encoded in a system of abstract qubits or modes. But how does this work for physical information, which cannot be described in this way? Just as direction information cannot be conveyed using a sequence of words if the parties involved do not share a reference frame, physical quantum information cannot be conveyed using a sequence of qubits or modes without a shared reference frame. Covariant quantum error correction is a procedure for protecting such physical information against noise.
The Leggett-Garg (LG) inequalities were introduced, as a temporal parallel of the Bell inequalities, to test macroscopic realism -- the view that a macroscopic system evolving in time possesses definite properties which can be determined without disturbing the future or past state. The talk will begin with a review of the LG framework. Unlike the Bell inequalities, the original LG inequalities are only a necessary condition for macrorealism, and are therefore not a decisive test.
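For reference, the simplest three-time LG inequality (a standard result, not specific to this talk) involves a dichotomic observable $Q(t) = \pm 1$ measured at times $t_1 < t_2 < t_3$:

```latex
K \;=\; C_{12} + C_{23} - C_{13} \;\le\; 1,
\qquad
C_{ij} = \langle Q(t_i)\, Q(t_j) \rangle ,
```

which any macrorealistic theory must satisfy, but which quantum mechanics can violate.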
We study correlations in fermionic systems with long-range interactions in thermal equilibrium. We prove an upper bound on the correlation decay between anticommuting operators based on long-range Lieb-Robinson type bounds. Our result shows that correlations between such operators in fermionic long-range systems of spatial dimension $D$, with at most two-site interactions decaying algebraically with distance with an exponent $\alpha \geq 2\,D$, decay at least algebraically with an exponent arbitrarily close to $\alpha$.
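Schematically (a restatement of the abstract's claim, with constants and technical conditions suppressed), the bound says that for anticommuting operators $A$ and $B$ supported on regions a distance $r$ apart:

```latex
\big| \langle A B \rangle_\beta - \langle A \rangle_\beta \langle B \rangle_\beta \big|
\;\le\; \frac{C(\alpha')}{r^{\alpha'}}
\qquad \text{for any } \alpha' < \alpha ,
```

where $\langle \cdot \rangle_\beta$ denotes the thermal expectation value and $C(\alpha')$ is a constant depending on $\alpha'$ but not on $r$.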
The precise relationship between post-selected classical and post-selected quantum computation is an open problem in complexity theory. Post-selection has proven to be a useful tool in uncovering some of the differences between quantum and classical theories, in foundations and elsewhere. This is no less true in the area of computational complexity -- quantum computations augmented with post-selection are thought to be vastly more powerful than their classical counterparts. However, the precise reasons why this might be the case are not fully understood.
How does classical chaos affect the generation of quantum entanglement? What signatures of chaos exist at the quantum level and how can they be quantified? These questions have puzzled physicists for a couple of decades now. We answer these questions in spin systems by analytically establishing a connection between entanglement generation and a measure of delocalization of a quantum state in such systems. While delocalization is a generic feature of quantum chaotic systems, it is more nuanced in regular systems.
The spectral gap problem consists in deciding, given a local interaction, whether the corresponding translationally invariant Hamiltonian on a lattice has a spectral gap independent of the system size or not. In the simplest case of nearest-neighbour frustration-free qubit interactions, there is a complete classification. At the other extreme, for two (or higher) dimensional models with nearest-neighbour interactions this problem can be reduced to the Halting Problem, and it is therefore undecidable.
Ubiquitous in the behavior of physical systems is the competition between an energy term E and an entropy term S of their free energy F = E - beta S. These concepts are also relevant for error correction, where the `energy' E is the number of qubits afflicted by an error, the `entropy' S(E) is the logarithm of the number of energy-E failing errors, and beta relates to the probability of each qubit's error. Error-correction schemes with larger minimum free energy have better performance.
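The competition described above can be sketched numerically. The toy function below is an illustration only: it uses the binomial count of all weight-E error patterns on n qubits as a stand-in for the entropy (the abstract's S(E) counts only the *failing* weight-E errors, which depends on the specific code), and `beta` is treated as a free parameter.

```python
from math import comb, log

def free_energy(n, E, beta):
    """Toy free-energy analogue F = E - beta * S for errors of weight E
    on n qubits. 'Energy' is the error weight; as a simple illustration,
    the 'entropy' S is log of the number of weight-E patterns, an upper
    bound on the count of failing errors used in the abstract."""
    S = log(comb(n, E))
    return E - beta * S
```

For beta = 0 only the energy matters (rare errors dominate), while larger beta weights the entropy term more heavily, mirroring the trade-off the abstract describes.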