This series consists of talks in the area of Foundations of Quantum Theory. Seminar and group meetings will alternate.
Landauer's erasure principle states that there is an inherent work cost associated with every irreversible operation, such as the erasure of data stored in a system. The necessary work is determined by our uncertainty: the more we know about the system, the less it costs to erase it.
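As a rough numerical sketch of the bound behind this claim (a standard statement of Landauer's limit, $W \ge k_B T \ln 2$ per bit of entropy; the numbers and function below are illustrative, not taken from the talk):

```python
import math

# Landauer bound: erasing a memory whose contents carry S bits of entropy
# costs at least W = k_B * T * ln(2) * S of work.  A bit we already know
# (zero entropy) costs nothing -- "the more we know, the less it costs".
K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def landauer_cost(temperature_kelvin: float, entropy_bits: float) -> float:
    """Minimum erasure work in joules for a memory with the given entropy."""
    return K_B * temperature_kelvin * math.log(2) * entropy_bits

# One fully unknown bit at room temperature (~300 K): about 2.9e-21 J.
print(landauer_cost(300.0, 1.0))
# A perfectly known bit can be erased for free, in principle.
print(landauer_cost(300.0, 0.0))
```

The tiny scale of $k_B T \ln 2$ explains why the bound is invisible in everyday computing, yet it is the quantity at stake in any exorcism of Maxwell's demon.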
I revisit an example of stronger-than-quantum correlations that was discovered by Ernst Specker in 1960. The example was introduced as a parable wherein an over-protective seer sets a simple prediction task to his daughter's suitors. The challenge cannot be met because the seer asks the suitors for a noncontextual assignment of values but measures a system for which the statistics are inconsistent with such an assignment. I will show how, by generalizing these sorts of correlations, one is led naturally to some well-known proofs of nonlocality and contextuality, as well as to some new ones.
In this talk we quickly review the basics of the modal "toy model" of quantum theory described by Schumacher in his September 22 colloquium at PI. We then consider how the theory addresses more general open systems. Because the modal theory has a more primitive mathematical structure than actual quantum mechanics, it lacks analogues of density operators, positive operator-valued measures, and completely positive maps.
The question of the existence of gravitational stress-energy in general relativity has exercised investigators in the field since the very inception of the theory. Folklore has it that no adequate definition of a localized gravitational stress-energetic quantity can be given. Most arguments to that effect invoke one version or another of the Principle of Equivalence. I argue that such arguments are not only of necessity vague and hand-waving but, worse, beside the point: they do not address the heart of the issue.
The uncertainty principle bounds the uncertainties about the outcomes of two incompatible measurements, such as position and momentum, on a particle. It implies that one cannot predict the outcomes for both possible choices of measurement to arbitrary precision, even if information about the preparation of the particle is available in a classical memory. However, if the particle is prepared entangled with a quantum memory, it is possible to predict the outcomes for both measurement choices precisely. I will explain a recent extension of the uncertainty principle to incorporate this case.
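The extension referred to is presumably the entropic uncertainty relation in the presence of quantum memory (Berta et al.); for orientation, its standard form is (a sketch, with names of symbols as usually defined, not taken from the talk):

```latex
% X and Z are the two incompatible measurements on system A; B is the
% (possibly quantum) memory; c = max_{x,z} |<x|z>|^2 measures the
% incompatibility of the two measurement bases.
H(X \mid B) + H(Z \mid B) \;\ge\; \log_2 \frac{1}{c} + H(A \mid B)
```

When $A$ and $B$ are entangled, the conditional entropy $H(A\mid B)$ can be negative, weakening the bound; for a maximally entangled pair it cancels the $\log_2(1/c)$ term entirely, which is exactly how both measurement outcomes become precisely predictable.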
A brief review of some recent work on the causal set approach to quantum gravity. Causal sets are a discretisation of spacetime that allows the symmetries of GR to be preserved in the continuum approximation. One proposed application of causal sets is to use them as the histories in a quantum sum-over-histories, i.e. to construct a quantum theory of spacetime. It is expected by many that quantum gravity will introduce some kind of fuzziness, uncertainty, and perhaps discreteness into spacetime, and generic effects of this fuzziness are currently being sought.
Quantum states are not observables, as in wave mechanics, but co-observables: they describe reality as possible knowledge about the statistics of all quantum events, such as quantum jumps, quantum decays, quantum diffusions, and quantum trajectories.
Many results have been obtained recently regarding the power of hypothetical closed time-like curves (CTCs) in quantum computation. Most of them have been derived using Deutsch's influential model for quantum CTCs [D. Deutsch, Phys. Rev. D 44, 3197 (1991)]. Deutsch's model demands self-consistency for the time-travelling system, but in the absence of (hypothetical) physical CTCs, it cannot be tested experimentally. In this paper we show how the one-way model of measurement-based quantum computation (MBQC) can be used to test Deutsch's model for CTCs.
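For reference, the self-consistency demand in Deutsch's model is the fixed-point condition below (a standard statement of the model; $U$ is the unitary coupling the chronology-respecting (CR) input to the CTC system):

```latex
% The density operator entering the CTC must equal the one emerging from
% it after the interaction with the chronology-respecting system:
\rho_{\mathrm{CTC}}
  \;=\;
  \operatorname{Tr}_{\mathrm{CR}}\!\left[\,
    U \left( \rho_{\mathrm{in}} \otimes \rho_{\mathrm{CTC}} \right) U^{\dagger}
  \right]
```

A fixed point $\rho_{\mathrm{CTC}}$ always exists, but the induced evolution of the CR system is in general nonlinear in $\rho_{\mathrm{in}}$, which is the source of the model's unusual computational power.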
It has long been recognized that there are two distinct laws that go by the name of the Second Law of Thermodynamics. The original says that there can be no process resulting in a net decrease in the total entropy of all bodies involved. A consequence of the kinetic theory of heat is that this law will not be strictly true; statistical fluctuations will result in small spontaneous transfers of heat from a cooler to a warmer body.
We first discuss quantum measure and integration theory. We then consider various anhomomorphic logics. Finally, we present some connections between the two theories. One connection is transferring a quantum measure to a measure on an anhomomorphic logic. Another is the creation of a reality filter that is stronger than Sorkin's preclusivity. This is accomplished by generating a preclusive coevent from a quantum measure. No prior knowledge of quantum measure theory or anhomomorphic logics will be assumed.
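For orientation, a quantum measure is a set function $\mu$ that relaxes classical additivity to Sorkin's grade-2 sum rule (a standard statement of the definition, not taken from the talk): for pairwise disjoint events $A$, $B$, $C$,

```latex
% Sorkin's grade-2 sum rule: three-way unions are determined by
% pairwise unions and singletons, even though pairwise additivity fails.
\mu(A \sqcup B \sqcup C)
  \;=\; \mu(A \sqcup B) + \mu(A \sqcup C) + \mu(B \sqcup C)
        - \mu(A) - \mu(B) - \mu(C)
```

An ordinary (classical) measure also satisfies the stronger pairwise rule $\mu(A \sqcup B) = \mu(A) + \mu(B)$; quantum interference violates that rule while still obeying the grade-2 identity, which is what makes $\mu$ "quantum".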