This event provides an opportunity for top young physicists to enjoy a multidisciplinary conference, and interact with resident scientists. In addition, participants will have an opportunity to learn more about Perimeter Institute.
Inferring a quantum system's state from repeated measurements is critical for verifying theories and designing quantum hardware. It's also surprisingly easy to do wrong, as illustrated by maximum likelihood estimation (MLE), the current state of the art. I'll explain why MLE yields unreliable and rank-deficient estimates, why you shouldn't be a quantum frequentist, and why we need a different approach. I'll show how operational divergences -- well-motivated metrics designed to evaluate estimates -- follow from quantum strictly proper scoring rules.
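The rank-deficiency problem mentioned above can be seen in a toy single-qubit example. The sketch below is illustrative and not the actual MLE optimization from the talk: it simulates finite-statistics Pauli measurements, forms the linear-inversion Bloch vector, and then projects onto the Bloch ball when the raw estimate is unphysical. For a true state near the boundary, the projected estimate lands exactly on the sphere, i.e. a rank-1 (pure) density matrix, which is the kind of boundary-pinned estimate MLE produces.

```python
import numpy as np

rng = np.random.default_rng(0)

# True state: a pure qubit state, so its Bloch vector lies on the sphere.
r_true = np.array([0.6, 0.0, 0.8])
n_shots = 100  # finite statistics per Pauli axis

# Simulate Pauli X, Y, Z measurements: outcome +1 occurs with prob (1 + r_i)/2.
counts = rng.binomial(n_shots, (1 + r_true) / 2)
r_est = 2 * counts / n_shots - 1  # linear inversion

# With finite data, |r_est| frequently exceeds 1, i.e. the raw estimate is
# not a valid state.  Projecting back into the Bloch ball (the closest
# physical state in Frobenius norm) pins the estimate to the boundary:
# a rank-deficient, pure-state estimate.
norm = np.linalg.norm(r_est)
r_mle = r_est / norm if norm > 1 else r_est
print("raw |r| =", norm, " projected |r| =", np.linalg.norm(r_mle))
```

A rank-deficient estimate assigns probability zero to some outcomes, which is why such estimates are unreliable for prediction even when they fit the observed data well.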
In stochastic treatments of the EPRB set-up, it is equivalent to impose Bell's inequalities, a local causality condition, or a certain "non-contextual hidden variables" condition. But these conditions are violated by quantum mechanics. On the other hand, it is possible to view quantum mechanics as part of "quantum measure theory", a generalization of probability measure theory that allows pairwise interference between histories whilst banning higher-order interference. In this setting, it may be possible to find quantum analogues of the three stochastic conditions.
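The distinction between pairwise and higher-order interference is usually stated via Sorkin's sum rules (my notation here is the standard one from that literature, not taken from the abstract). For disjoint sets of histories $A$, $B$, $C$, define the interference functionals

```latex
I_2(A,B) \;\equiv\; \mu(A \sqcup B) - \mu(A) - \mu(B),
\qquad
I_3(A,B,C) \;\equiv\; \mu(A \sqcup B \sqcup C)
  - \mu(A \sqcup B) - \mu(B \sqcup C) - \mu(A \sqcup C)
  + \mu(A) + \mu(B) + \mu(C).
```

A classical probability measure satisfies $I_2 = 0$ (additivity). A quantum measure permits $I_2 \neq 0$ (pairwise interference, as in the double slit) but demands $I_3 = 0$, which is the ban on higher-order interference referred to above.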
Quantum information theory has two equivalent mathematical conjectures concerning quantum channels, which are also equivalent to other important conjectures concerning entanglement. In this talk I will explain these conjectures and present recent results.
It is a fundamental property of quantum mechanics that non-orthogonal pure states cannot be distinguished with certainty, which leads to the following problem: given a state picked at random from some ensemble, what is the maximum probability of correctly identifying which state we actually have? I will discuss two recently obtained analytic lower bounds on this optimal probability. An interesting case to which these bounds can be applied is that of ensembles consisting of states that are themselves picked at random.
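For the simplest instance of this problem, two pure states, the optimal success probability is known exactly (the Helstrom bound); the abstract's bounds concern more general ensembles, but the two-state case makes the trade-off concrete. A minimal sketch, with the function name `helstrom_two_pure` my own:

```python
import numpy as np

def helstrom_two_pure(psi, phi, p=0.5):
    """Optimal probability of correctly identifying which of |psi>, |phi>
    was prepared, given prior p for |psi> (Helstrom bound for pure states)."""
    overlap = abs(np.vdot(psi, phi)) ** 2  # |<psi|phi>|^2
    return 0.5 * (1 + np.sqrt(1 - 4 * p * (1 - p) * overlap))

psi = np.array([1, 0], dtype=complex)
phi = np.array([1, 1], dtype=complex) / np.sqrt(2)  # non-orthogonal to psi
print(helstrom_two_pure(psi, phi))  # ~0.854: better than guessing, below 1
```

Orthogonal states give success probability 1, identical states give 1/2 (pure guessing), and everything in between interpolates via the overlap, illustrating the "cannot be distinguished with certainty" statement quantitatively.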
The Everett (many-worlds) interpretation has made great progress over the past 20-30 years, largely due to the role of decoherence in providing a solution to the preferred basis problem. This makes it a serious candidate for a realist solution to the measurement problem. A remaining objection to the Everett interpretation, and one that is often considered fatal, is that it cannot make adequate sense of quantum probabilities.
Hints from quantum gravity suggest that a preferred frame may actually exist. One way to accommodate such a frame in general relativity without sacrificing diffeomorphism invariance is to couple the metric to a dynamical, timelike, unit-norm vector field--the "aether". I will discuss properties and observational tests of a class of such theories, including post-Newtonian effects and radiation from binary pulsar systems.
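The class of theories described above is usually written as an action for the metric coupled to the aether field $u^a$, with the unit-norm constraint enforced by a Lagrange multiplier. In one common convention (signs and signature vary between papers, so take this as a sketch of the structure rather than the talk's exact form):

```latex
S = \frac{1}{16\pi G} \int d^4x \, \sqrt{-g}\,
  \Big[ R - K^{ab}{}_{mn}\, \nabla_a u^m \, \nabla_b u^n
        + \lambda \left( g_{ab} u^a u^b - 1 \right) \Big],
\qquad
K^{ab}{}_{mn} = c_1\, g^{ab} g_{mn} + c_2\, \delta^a_m \delta^b_n
              + c_3\, \delta^a_n \delta^b_m + c_4\, u^a u^b g_{mn}.
```

The four coupling constants $c_{1\ldots4}$ parametrize the class of theories; the post-Newtonian effects and binary-pulsar radiation mentioned above constrain combinations of these coefficients.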