This series consists of talks in the area of Foundations of Quantum Theory. Seminar and group meetings will alternate.
Many results have recently been obtained regarding the power of hypothetical closed time-like curves (CTCs) in quantum computation. Most of them have been derived using Deutsch’s influential model for quantum CTCs [D. Deutsch, Phys. Rev. D 44, 3197 (1991)]. Deutsch’s model demands self-consistency for the time-travelling system, but in the absence of (hypothetical) physical CTCs, it cannot be tested experimentally. In this paper we show how the one-way model of measurement-based quantum computation (MBQC) can be used to test Deutsch’s model for CTCs.
It has long been recognized that there are two distinct laws that go by the name of the Second Law of Thermodynamics. The original says that there can be no process resulting in a net decrease in the total entropy of all bodies involved. A consequence of the kinetic theory of heat is that this law will not be strictly true; statistical fluctuations will result in small spontaneous transfers of heat from a cooler to a warmer body.
We first discuss quantum measure and integration theory. We then consider various anhomomorphic logics. Finally, we present some connections between the two theories. One connection is transferring a quantum measure to a measure on an anhomomorphic logic. Another is the creation of a reality filter that is stronger than Sorkin's preclusivity. This is accomplished by generating a preclusive coevent from a quantum measure. No prior knowledge of quantum measure theory or anhomomorphic logics will be assumed.
Non-relativistic quantum mechanics is derived as an example of entropic inference. The basic assumption is that the position of a particle is subject to an irreducible uncertainty of unspecified origin. The corresponding probability distributions constitute a curved statistical manifold. The probability for infinitesimally small changes is obtained from the method of maximum entropy and the concept of time is introduced as a book-keeping device to keep track of how they accumulate. This requires introducing appropriate notions of instant and of duration.
Many putative explanations in physics rely on idealized models of physical systems. These explanations are inconsistent with standard philosophical accounts of explanation. A common view holds that idealizations can underwrite explanation nonetheless, but only when they are what have variously been called Galilean, approximative, traditional or controllable. Controllability is the least vague of these categories, and this paper focuses on the relation between controllability and explanation. Specifically, it argues that the common view is an untenable half-measure.
One might have hoped that philosophers had sorted out what ‘truth’ is supposed to be by now. After all, Aristotle offered what seems to be a clear and simple characterization in his Metaphysics. So perhaps it is surprising (and then again perhaps it isn’t) that contemporary philosophers have not settled on a consensus regarding the nature of truth to this day.
Our starting point is a particular `canvas' aimed to `draw' theories of physics, which has symmetric monoidal categories as its mathematical backbone. With very little structural effort (i.e. in very abstract terms) and in a very short time this categorical quantum mechanics research program has reproduced a surprisingly large fragment of quantum theory. Philosophically speaking, this framework shifts the conceptual focus from `material carriers' such as particles, fields, or other
The Wheeler delayed-choice experiment, the Elitzur-Vaidman interaction-free measurement, and the Hosten-Kwiat counterfactual computation will be discussed to answer Bohr's forbidden question: "Where is a quantum particle while it is inside a Mach-Zehnder interferometer?". I will argue that the naive application of Wheeler's approach fails to explain a weak trace left by the particle and that the two-state vector description is required.
In the Scholium of Newton's Principia which contains the discussions of absolute space, time, and the bucket experiment, Newton also posed a problem that Julian Barbour has called the "Scholium problem". Newton writes there: "But how we are to obtain the true motions from their causes, effects, and apparent differences, and the converse, shall be explained more at large in the following treatise. For to this end it was that I composed it". Newton clearly considered this problem very important, claiming that he composed the Principia for this very end.
In this talk, I will demonstrate that correlations inconsistent with any locally causal description can be a generic feature of measurements on entangled quantum states. Specifically, spatially separated parties who perform local measurements on a maximally entangled state using randomly chosen measurement bases can, with significant probability, generate nonclassical correlations that violate a Bell inequality. For n parties sharing a Greenberger-Horne-Zeilinger state, this probability of violation tends rapidly to unity as n increases.
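The two-party case of the claim above is easy to check numerically. The following is a minimal sketch (not the speaker's calculation): it samples random measurement directions for two parties sharing a singlet state, for which the quantum correlation along unit vectors a and b is E(a, b) = -a·b, and estimates how often the resulting CHSH value exceeds the local bound of 2.

```python
import numpy as np

# Sketch: estimate the probability that randomly chosen measurement
# directions on a two-qubit singlet state violate the CHSH inequality
# |S| <= 2, where S = E(a,b) + E(a,b') + E(a',b) - E(a',b') and, for
# the singlet, E(a, b) = -a . b.

def random_unit_vectors(rng, n):
    """Sample n unit vectors uniformly on the sphere."""
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

rng = np.random.default_rng(0)
n = 20_000

# Each party picks two random measurement directions per trial.
a, a2, b, b2 = (random_unit_vectors(rng, n) for _ in range(4))

def E(x, y):
    """Singlet-state correlation for measurements along x and y."""
    return -np.sum(x * y, axis=1)

S = E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2)
violation_fraction = np.mean(np.abs(S) > 2)
print(f"fraction of random settings violating CHSH: {violation_fraction:.3f}")
```

A substantial fraction of uniformly random settings already violates the inequality for two parties; the abstract's claim is that the analogous fraction approaches one for n-party GHZ states. Note that |S| never exceeds the Tsirelson bound 2√2, consistent with quantum mechanics.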