This series consists of weekly discussion sessions on the foundations of quantum theory and quantum information theory. The sessions start with an informal exposition of an interesting topic, research result or important question in the field. Everyone is strongly encouraged to participate with questions and comments.
In the usual paradigm of quantum error correction, the information to be protected can be encoded in a system of abstract qubits or modes. But how does this work for physical information, which cannot be described in this way? Just as direction information cannot be conveyed using a sequence of words if the parties involved do not share a reference frame, physical quantum information cannot be conveyed using a sequence of qubits or modes without a shared reference frame. Covariant quantum error correction is a procedure for protecting such physical information against noise in such a way
The Leggett-Garg (LG) inequalities were introduced, as a temporal parallel of the Bell inequalities, to test macroscopic realism -- the view that a macroscopic system evolving in time possesses definite properties which can be determined without disturbing the future or past state. The talk will begin with a review of the LG framework. Unlike the Bell inequalities, the original LG inequalities are only a necessary condition for macrorealism, and are therefore not a decisive test.
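For reference, the simplest of the original LG inequalities constrains the two-time correlators $C_{ij} = \langle Q(t_i) Q(t_j) \rangle$ of a dichotomic quantity $Q = \pm 1$ measured at times $t_1 < t_2 < t_3$: macrorealism implies

$-3 \leq C_{12} + C_{23} - C_{13} \leq 1,$

a bound that quantum systems can violate. (This standard form is included here for orientation; the talk's precise framework may differ.)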
We study correlations in fermionic systems with long-range interactions in thermal equilibrium. We prove an upper bound on the correlation decay between anti-commuting operators based on long-range Lieb-Robinson type bounds. Our result shows that correlations between such operators in fermionic long-range systems of spatial dimension $D$, with at most two-site interactions decaying algebraically with distance with an exponent $\alpha \geq 2\,D$, decay at least algebraically with an exponent arbitrarily close to $\alpha$.
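Schematically (a paraphrase of the statement above, not the theorem's precise form), the result says that for anti-commuting operators $A_x$, $B_y$ supported near sites $x$ and $y$, the thermal correlator obeys

$|\langle A_x B_y \rangle_\beta| \leq C_{\alpha'}\, d(x,y)^{-\alpha'}$ for any $\alpha' < \alpha$,

where $d(x,y)$ is the distance between the supports and $C_{\alpha'}$ is a model- and exponent-dependent constant.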
The precise relationship between post-selected classical and post-selected quantum computation is an open problem in complexity theory. Post-selection has proven to be a useful tool in uncovering some of the differences between quantum and classical theories, in foundations and elsewhere. This is no less true in the area of computational complexity -- quantum computations augmented with post-selection are thought to be vastly more powerful than their classical counterparts. However, the precise reasons why this might be
How does classical chaos affect the generation of quantum entanglement? What signatures of chaos exist at the quantum level and how can they be quantified? These questions have puzzled physicists for a couple of decades now. We answer these questions in spin systems by analytically establishing a connection between entanglement generation and a measure of delocalization of a quantum state in such systems. While delocalization is a generic feature of quantum chaotic systems, it is more nuanced in regular systems.
The spectral gap problem consists of deciding, given a local interaction, whether the corresponding translationally invariant Hamiltonian on a lattice has a spectral gap independent of the system size. In the simplest case of nearest-neighbour frustration-free qubit interactions, there is a complete classification. At the other extreme, for two- (or higher-)dimensional models with nearest-neighbour interactions, the Halting Problem can be reduced to this problem, which is therefore undecidable.
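Roughly, in the standard formulation `gapped' means there is a constant $\gamma > 0$ such that the Hamiltonian $H_N$ on the lattice of size $N$ satisfies

$\lambda_1(H_N) - \lambda_0(H_N) \geq \gamma$ for all $N$,

where $\lambda_0, \lambda_1$ are its two lowest eigenvalues; if instead the gap closes as $N \to \infty$, the system is gapless. (The formal definitions used in the undecidability results are somewhat more careful than this sketch.)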
Ubiquitous in the behavior of physical systems is the competition between an energy term E and an entropy term S of their free energy F = E - beta S. These concepts are also relevant for error correction, where the `energy' E is the number of qubits afflicted by an error, the `entropy' S(E) is the logarithm of the number of energy-E failing errors, and beta relates to the probability of each qubit's error. Error-correction schemes with larger minimum free energy have better performance.
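As a toy illustration of this analogy (the code, the choice of p, and the convention relating beta to the error probability are assumptions for this sketch, not taken from the talk), one can enumerate the failing error classes of the 3-qubit bit-flip repetition code under i.i.d. noise and compute a free energy for each:

```python
import math

# 3-qubit bit-flip repetition code, decoded by majority vote,
# under i.i.d. bit-flip noise with per-qubit probability p.
n = 3
p = 0.01                               # illustrative value
beta = 1.0 / math.log((1 - p) / p)     # relates to the per-qubit error probability

# A weight-E flip pattern defeats majority vote iff E > n // 2.
for E in range(n + 1):
    failing = math.comb(n, E) if E > n // 2 else 0
    if failing:
        S = math.log(failing)          # `entropy': log-count of failing weight-E errors
        F = E - beta * S               # free energy F = E - beta S
        prob = failing * p**E * (1 - p)**(n - E)   # total probability of this class
        print(f"E={E}: count={failing}, F={F:.3f}, class probability={prob:.2e}")
```

With this convention a failing class's probability scales as exp(-F/beta), so the minimum-F failing class (weight 2 here) dominates the failure probability, matching the statement that larger minimum free energy means better performance.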
Despite considerable effort to find alternatives, magic state distillation remains one of the leading candidates for achieving universal fault-tolerant quantum computation. However, when analyzing magic state distillation schemes, it is often assumed that gates belonging to the Clifford group can be implemented perfectly. In many current quantum technologies, two-qubit Clifford gates are amongst the noisiest components of quantum computers. In this talk I will present a new scheme for preparing magic states with very low overhead that uses flag qubits.
I will present a method for the implementation of a universal set of fault-tolerant logical gates using homological product codes. In particular, I will show how one can fault-tolerantly map between different encoded representations of a given logical state, enabling the application of different classes of transversal gates belonging to the underlying quantum codes. This allows for the circumvention of no-go results pertaining to universal sets of transversal gates and provides a general scheme for fault-tolerant computation while keeping the stabilizer generators of the code sparse.
Performing a quantum adiabatic optimization (AO) algorithm with the time-dependent Hamiltonian H(t) requires one to have some idea of the spectral gap γ(t) of H(t) at all times t.
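For orientation, a standard sufficient form of the adiabatic condition (up to constants, and stated here only as background to the abstract) is that the total runtime $T$ satisfy

$T \gg \max_{s \in [0,1]} \|\partial_s H(s)\| \,/\, \gamma(s)^2,$

where $s = t/T$ is the rescaled time; this dependence on $\gamma$ is why AO requires some a priori knowledge of the spectral gap when choosing the annealing schedule.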