This series consists of weekly discussion sessions on the foundations of quantum theory and quantum information theory. Each session starts with an informal exposition of an interesting topic, research result, or important question in the field. Everyone is strongly encouraged to participate with questions and comments.
In recent years, the characterization of many-body ground states via the entanglement of their wave functions has attracted a lot of attention. One useful measure of entanglement is provided by the entanglement entropy S.
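As a concrete illustration of the entanglement entropy S (a generic textbook computation, not taken from this talk), the sketch below computes S = -Tr(ρ_A log₂ ρ_A) for one qubit of a Bell pair using NumPy:

```python
import numpy as np

# Entanglement entropy of one qubit of the Bell state |phi+> = (|00> + |11>)/sqrt(2).
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)   # amplitudes in basis |00>,|01>,|10>,|11>
rho = np.outer(bell, bell.conj())            # full two-qubit density matrix

# Partial trace over qubit B: index as rho[a, b, a', b'] and contract b = b'.
rho_A = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

# S = -Tr(rho_A log2 rho_A), computed from the eigenvalues of rho_A.
eigvals = np.linalg.eigvalsh(rho_A)
S = -sum(p * np.log2(p) for p in eigvals if p > 1e-12)
print(S)  # 1.0 — one ebit, the maximum for a qubit pair
```

For a product state the reduced density matrix is pure and S = 0; S > 0 signals entanglement between the subsystem and the rest.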
Are Quantum Mechanics and Special Relativity unrelated theories? Is Quantum Field Theory an additional theoretical layer on top of them? Where do the quantization rules and the Planck constant come from? All these questions can find an answer in the computational paradigm: "the universe is a huge quantum computer".
Dualities appear in nearly all disciplines of physics and play a central role in statistical mechanics and field theory. I will discuss in a pedagogical way our recent findings motivated by a quest for a simple unifying framework for the detection and treatment of dualities.
A recent breakthrough in quantum computing has been the realization that quantum computation can proceed solely through single-qubit measurements on an appropriate quantum state. One exciting prospect is that the ground or low-temperature thermal state of an interacting quantum many-body system can serve as such a resource state for quantum computation. The system would simply need to be cooled sufficiently and then subjected to local measurements.
Adiabatic quantum optimization has attracted a lot of attention because small-scale simulations gave hope that it could solve NP-complete problems efficiently. Later, negative results proved the existence of specially designed hard instances on which adiabatic optimization requires exponential time. In spite of this, there was still hope that this would not happen for random instances of NP-complete problems.
At NIST we are engaged in an experiment whose goal is to create superpositions of optical coherent states (such superpositions are sometimes called "Schroedinger cat" states). We use homodyne detection to measure the light, and we apply maximum likelihood quantum state tomography to the homodyne data to estimate the state that we have created.
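For readers unfamiliar with maximum-likelihood state tomography, the sketch below runs the standard iterative R·ρ·R algorithm on a single qubit with a six-outcome Pauli POVM. The state and data are invented for illustration; this is not the NIST homodyne setup, which estimates a state from continuous quadrature measurements:

```python
import numpy as np

# Minimal sketch of iterative maximum-likelihood tomography (R*rho*R algorithm).
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1.0, -1.0]).astype(complex)

# Informationally complete POVM: +/- eigenprojectors of X, Y, Z, each weighted 1/3.
povm = [(I + s * P) / 6 for P in (X, Y, Z) for s in (+1, -1)]

true_rho = (I + 0.6 * Z + 0.3 * X) / 2  # state we pretend to have measured
freqs = np.array([np.trace(E @ true_rho).real for E in povm])  # ideal outcome data

rho = I / 2  # start from the maximally mixed state
for _ in range(2000):
    probs = np.array([np.trace(E @ rho).real for E in povm])
    R = sum(f / p * E for f, p, E in zip(freqs, probs, povm))
    rho = R @ rho @ R
    rho /= np.trace(rho).real  # renormalize to unit trace

# rho now approximates true_rho (the maximum-likelihood estimate for ideal data)
```

The fixed point of the iteration (R = identity) is the state whose predicted probabilities match the observed frequencies, which for ideal data is the true state.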
We review situations under which standard quantum adiabatic conditions fail. We reformulate the problem of adiabatic evolution as the problem of Hamiltonian eigenpath traversal, and give cost bounds in terms of the length of the eigenpath and the minimum energy gap of the Hamiltonians. We introduce a randomized evolution method that can be used to traverse the eigenpath and show that a standard adiabatic condition is recovered. We then describe more efficient methods for the same task and show that their implementation complexity is close to optimal.
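For reference, the "standard adiabatic condition" invoked above is commonly written in the following textbook form (this is a generic statement, not necessarily the exact bound used in the talk):

```latex
% Standard adiabatic condition for an interpolating Hamiltonian H(s), s = t/T:
\[
  T \;\gg\; \max_{s \in [0,1]} \frac{\lVert \partial_s H(s) \rVert}{\Delta(s)^{2}},
\]
% where \Delta(s) is the spectral gap of H(s). The eigenpath-traversal bounds
% discussed in the talk are instead expressed in terms of the length L of the
% eigenstate path and the minimum gap.
```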
This talk will present an overview of work done in the past decade on quantum state and process tomography, describing the basic notions at an introductory level, and arguing for a pragmatic approach for data reconstruction. The latest results include recent numerical comparison of different reconstruction techniques, aimed at answering the question: "is 'the best' the enemy of 'good enough'?"
A fully general strong converse for channel coding states that when the rate of sending classical information exceeds the capacity of a quantum channel, the probability of correctly decoding goes to zero exponentially in the number of channel uses, even when we allow code states which are entangled across several uses of the channel. Such a statement was previously only known for classical channels and the quantum identity channel.
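Schematically, the strong converse stated here can be written as follows (the exponent α(R) is illustrative notation, not the talk's):

```latex
% Strong converse for classical communication over a quantum channel:
\[
  p_{\mathrm{succ}}(n) \;\le\; 2^{-n\,\alpha(R)}
  \quad \text{for some } \alpha(R) > 0 \text{ whenever } R > C,
\]
% where R is the coding rate, C the classical capacity of the channel, and
% n the number of channel uses; the bound holds even for code states
% entangled across the n uses.
```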