This series consists of weekly discussion sessions on the foundations of quantum theory and quantum information theory. Each session starts with an informal exposition of an interesting topic, research result, or important question in the field. Everyone is strongly encouraged to participate with questions and comments.
The gauge color code is a quantum error-correcting code with local syndrome measurements that, remarkably, admits a universal transversal gate set without the need for resource-intensive magic state distillation. A recent result by Bombin shows that the subsystem structure of the gauge color code admits an error-correction protocol that tolerates noisy measurements without repeated measurement rounds, so-called single-shot error correction.
In this talk I address the problem of simultaneously inferring unknown quantum states and unknown quantum measurements from empirical data. This task goes beyond state tomography because we do not assume anything about the measurement devices. I will discuss the time and sample complexity of inferring states and measurements, as well as the robustness of the minimal Hilbert space dimension. Moreover, I will describe a simple heuristic algorithm (alternating optimization) to fit states and measurements to empirical data.
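To make the alternating-optimization heuristic concrete, here is a minimal sketch, not the speaker's implementation: given empirical probabilities P[i, j] ≈ Tr(ρ_i E_j), alternately take projected gradient steps on the states and on the effects, projecting back onto the positive semidefinite cone after each update. All function names, step sizes, and iteration counts are illustrative assumptions; the completeness constraint on the effects is omitted for simplicity.

```python
import numpy as np

def proj_psd(M):
    """Project a Hermitian matrix onto the positive semidefinite cone."""
    M = (M + M.conj().T) / 2
    w, V = np.linalg.eigh(M)
    return (V * np.clip(w, 0, None)) @ V.conj().T

def alternating_fit(P, d, iters=300, lr=0.05, seed=1):
    """Heuristic fit of states rho_i and effects E_j to P[i, j] ~= Tr(rho_i E_j)."""
    rng = np.random.default_rng(seed)
    ns, ne = P.shape
    rhos = [np.eye(d) / d for _ in range(ns)]            # start at maximally mixed
    Es = [proj_psd(rng.standard_normal((d, d))) for _ in range(ne)]
    Es = [E / max(np.linalg.eigvalsh(E)[-1], 1e-9) for E in Es]  # keep 0 <= E <= I
    errs = []
    for _ in range(iters):
        R = np.array([[np.trace(r @ E).real - P[i, j]
                       for j, E in enumerate(Es)]
                      for i, r in enumerate(rhos)])
        errs.append(float((R ** 2).sum()))
        # state step: gradient of the squared error wrt rho_i is sum_j 2 R[i,j] E_j
        for i in range(ns):
            g = sum(2 * R[i, j] * Es[j] for j in range(ne))
            r = proj_psd(rhos[i] - lr * g)
            rhos[i] = r / max(np.trace(r).real, 1e-9)    # renormalize to trace 1
        # effect step: clip eigenvalues into [0, 1] so 0 <= E <= I
        for j in range(ne):
            g = sum(2 * R[i, j] * rhos[i] for i in range(ns))
            E = Es[j] - lr * g
            E = (E + E.conj().T) / 2
            w, V = np.linalg.eigh(E)
            Es[j] = (V * np.clip(w, 0, 1)) @ V.conj().T
    return rhos, Es, errs
```

As with most alternating schemes, there is no global optimality guarantee, but the squared residual typically decreases rapidly from a random start.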
I will introduce the unitarity, a parameter quantifying the coherence of a channel, and show that it is useful for two reasons. First, it can be efficiently estimated via a variant of randomized benchmarking. Second, it captures useful information about the channel, such as the optimal fidelity achievable with unitary corrections and an improved bound on the diamond distance.
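For concreteness, the unitarity of a qubit channel can be computed directly from its Pauli transfer matrix as u(E) = Tr(E_u^T E_u)/(d^2 − 1), where E_u is the unital (lower-right) block; a unitary channel has u = 1, and depolarizing noise of strength p has u = (1 − p)^2. A small numerical sketch, assuming the channel is given by Kraus operators (function names are my own):

```python
import numpy as np

# Single-qubit Pauli operators I, X, Y, Z
PAULIS = [np.eye(2, dtype=complex),
          np.array([[0, 1], [1, 0]], dtype=complex),
          np.array([[0, -1j], [1j, 0]], dtype=complex),
          np.array([[1, 0], [0, -1]], dtype=complex)]

def ptm(kraus):
    """Pauli transfer matrix R[a, b] = Tr(P_a E(P_b)) / 2 of a qubit channel."""
    R = np.zeros((4, 4))
    for b, Pb in enumerate(PAULIS):
        out = sum(K @ Pb @ K.conj().T for K in kraus)
        for a, Pa in enumerate(PAULIS):
            R[a, b] = np.trace(Pa @ out).real / 2
    return R

def unitarity(kraus, d=2):
    """u(E) = Tr(Eu^T Eu) / (d^2 - 1), with Eu the unital block of the PTM."""
    Eu = ptm(kraus)[1:, 1:]
    return float(np.trace(Eu.T @ Eu)) / (d ** 2 - 1)
```

For example, the Hadamard gate gives unitarity 1, while depolarizing noise with p = 0.3 gives 0.49 = (1 − 0.3)^2.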
I’ll present new approaches to the problems of quantum control and quantum tomography wherein no classical simulation is required. The experiment itself performs the simulation (in situ) and, in a sense, guides itself to the correct solution. The algorithm is iterative and makes use of ideas from stochastic optimization theory.
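A standard stochastic optimizer for such in-situ, simulation-free schemes is simultaneous perturbation stochastic approximation (SPSA), which estimates a descent direction from only two (possibly noisy) evaluations of the objective per iteration. A minimal generic sketch, with Spall's standard gain schedules; treating an actual experiment as the objective function is left abstract, and all names and parameters here are illustrative:

```python
import numpy as np

def spsa_minimize(f, x0, iters=500, a=0.2, c=0.1, seed=0):
    """Minimal SPSA: estimate a gradient from two evaluations of f along a
    random +/-1 perturbation, then take a step with decaying gains."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for k in range(1, iters + 1):
        ak = a / k ** 0.602                 # step-size schedule
        ck = c / k ** 0.101                 # perturbation-size schedule
        delta = rng.choice([-1.0, 1.0], size=x.shape)
        # two-sided finite difference along the random direction
        g = (f(x + ck * delta) - f(x - ck * delta)) / (2 * ck) * delta
        x = x - ak * g
    return x
```

In self-guided schemes of this kind, f would be an infidelity estimated directly from experimental counts, so no classical simulation of the system is ever needed.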
Inspired by quantum information approaches to thermodynamics, we introduce a general framework for resource theories from the perspective of subjective agents. First, we formalize a way to think of subjective knowledge through what we call specification spaces, where states of knowledge (or specifications) are represented by sets whose elements are the possible states of reality admitted by an observer. We explore how to reconcile different views of reality via embeddings between specification spaces.
For isolated quantum systems, fluctuation theorems are commonly derived within the two-time energy measurement approach. In this talk we will discuss recent developments and generalizations of this approach. We will show that the concept of fluctuation theorems is not only of thermodynamic relevance but also of interest in quantum information theory. In the second part we will show that the quantum fluctuation theorem generalizes to PT-symmetric quantum mechanics with unbroken PT-symmetry.
We study the separability of quantum states in bosonic systems. Our main tool here is the "separability witness", and we draw a connection between separability witnesses and a new kind of matrix positivity, "Power Positive Matrices". This connection is employed to demonstrate that multi-qubit quantum states whose eigenvectors are Dicke states are separable if and only if two related Hankel matrices are positive semidefinite.
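The criterion thus reduces to checking positive semidefiniteness of two Hankel matrices. The precise map from the Dicke-basis spectrum to those matrices is part of the speaker's result, but once the defining sequences are in hand the check itself is elementary. A generic sketch (function names and example sequences are my own, not taken from the talk):

```python
import numpy as np

def hankel_from_seq(seq, rows):
    """Hankel matrix H[i, j] = seq[i + j] with the given number of rows."""
    cols = len(seq) - rows + 1
    return np.array([[seq[i + j] for j in range(cols)] for i in range(rows)],
                    dtype=float)

def is_psd(M, tol=1e-10):
    """Positive semidefiniteness of a real symmetric matrix via eigenvalues."""
    return bool(np.linalg.eigvalsh((M + M.T) / 2).min() >= -tol)
```

As a sanity check, moments of a nonnegative measure always produce a PSD Hankel matrix (the classical Hamburger moment criterion), e.g. the moments 2, 0, 2, 0, 2 of the measure δ₊₁ + δ₋₁, whereas an arbitrary sequence such as 1, 2, 1 generally does not.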
Given the general difficulty of simulating quantum systems with classical ones, and in particular the existence of an efficient quantum algorithm for factoring, quantum computation is likely intrinsically more powerful than classical computation. At present, the best known upper bound on the power of quantum computation is that BQP is contained in AWPP, where AWPP is a classical complexity class (known to be included in PP, hence in PSPACE). This work investigates limits on computational power that are imposed by simple physical, or information-theoretic, principles.
We introduce a technique for applying quantum expanders in a distributed fashion, and use it to solve two basic questions: testing whether a bipartite quantum state shared by two parties is the maximally entangled state, and disproving a generalized area law. In the process, these two seemingly unrelated questions turn out to be two sides of the same coin. Strikingly, in both cases a constant amount of resources is used to verify a global property.
The Elitzur-Vaidman bomb tester allows the detection of a photon-triggered bomb with a photon, without setting the bomb off. This seemingly impossible task can be tackled using the quantum Zeno effect. Inspired by the EV bomb tester, we define the notion of "bomb query complexity". This model modifies the standard quantum query model by measuring each query immediately after its application, and ends the algorithm if a 1 is measured.
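The Zeno-based version of the EV test admits a one-line probability calculation: the photon is rotated by π/(2N) per step, and a live bomb projectively measures it after every step, so the probability of never triggering the bomb is cos^{2N}(π/(2N)) → 1 as N grows; a dud lets the rotations accumulate, so the photon ends up fully rotated. A small numerical sketch under the assumption of ideal optics and N identical steps (names are my own):

```python
import numpy as np

def zeno_bomb_tester(N, live=True):
    """Elitzur-Vaidman bomb test with N Zeno steps of rotation pi/(2N).

    live=True:  the bomb measures the photon after every step; returns the
                probability that it never explodes, in which case the photon
                remains in its original arm, signaling 'live'.
    live=False: a dud never measures, so rotations accumulate and the photon
                is certainly found in the other arm, signaling 'dud'.
    """
    theta = np.pi / (2 * N)
    if live:
        return np.cos(theta) ** (2 * N)   # survive all N projective steps
    return np.sin(N * theta) ** 2         # total rotation pi/2: probability 1
```

Increasing N drives the explosion probability to zero, which is exactly the Zeno effect the bomb-query-complexity model formalizes at the level of queries.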