This series consists of talks in the area of Foundations of Quantum Theory. Seminar and group meetings will alternate.
We first discuss quantum measure and integration theory. We then consider various anhomomorphic logics. Finally, we present some connections between the two theories. One connection is transferring a quantum measure to a measure on an anhomomorphic logic. Another is the creation of a reality filter that is stronger than Sorkin's preclusivity. This is accomplished by generating a preclusive coevent from a quantum measure. No prior knowledge of quantum measure theory or anhomomorphic logics will be assumed.
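As background for readers unfamiliar with the formalism mentioned above: a quantum measure in Sorkin's sense is a set function $\mu$ that need not be additive but satisfies the grade-2 (quantum) sum rule, which for mutually disjoint events $A$, $B$, $C$ reads

```latex
\mu(A \sqcup B \sqcup C) \;=\; \mu(A \sqcup B) + \mu(A \sqcup C) + \mu(B \sqcup C)
\;-\; \mu(A) - \mu(B) - \mu(C).
```

Ordinary (classical) measures satisfy this automatically, since for them each pairwise term decomposes additively; quantum measures retain the grade-2 rule while allowing pairwise interference, i.e. $\mu(A \sqcup B) \neq \mu(A) + \mu(B)$.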
Non-relativistic quantum mechanics is derived as an example of entropic inference. The basic assumption is that the position of a particle is subject to an irreducible uncertainty of unspecified origin. The corresponding probability distributions constitute a curved statistical manifold. The probability of infinitesimally small changes is obtained from the method of maximum entropy, and the concept of time is introduced as a book-keeping device to keep track of how these changes accumulate. This requires introducing appropriate notions of instant and of duration.
Many putative explanations in physics rely on idealized models of physical systems. These explanations are inconsistent with standard philosophical accounts of explanation. A common view holds that idealizations can underwrite explanation nonetheless, but only when they are what have variously been called Galilean, approximative, traditional or controllable. Controllability is the least vague of these categories, and this paper focuses on the relation between controllability and explanation. Specifically, it argues that the common view is an untenable half-measure.
One might have hoped that philosophers had sorted out what ‘truth’ is supposed to be by now. After all, Aristotle offered what seems to be a clear and simple characterization in his Metaphysics. So perhaps it is surprising (and then again perhaps it isn’t) that contemporary philosophers have not settled on a consensus regarding the nature of truth to this day.
Our starting point is a particular `canvas' for `drawing' theories of physics, one that has symmetric monoidal categories as its mathematical backbone. With very little structural effort (i.e. in very abstract terms) and in a very short time, this categorical quantum mechanics research program has reproduced a surprisingly large fragment of quantum theory. Philosophically speaking, this framework shifts the conceptual focus from `material carriers' such as particles, fields, or other
The Wheeler delayed-choice experiment, the Elitzur-Vaidman interaction-free measurement, and the Hosten-Kwiat counterfactual computation will be discussed to answer Bohr's forbidden question: "Where is a quantum particle while it is inside a Mach-Zehnder interferometer?" I will argue that the naive application of Wheeler's approach fails to explain a weak trace left by the particle, and that the two-state vector description is required.
In the Scholium in Newton's Principia which contains the discussion of absolute space, time, and the bucket experiment, Newton also posed a problem that Julian Barbour has called the "Scholium problem". Newton writes there: "But how we are to obtain the true motions from their causes, effects, and apparent differences, and the converse, shall be explained more at large in the following treatise. For to this end it was that I composed it". Newton clearly considered this problem very important, claiming that it was for this purpose that he composed the Principia.
In this talk, I will demonstrate that correlations inconsistent with any locally causal description can be a generic feature of measurements on entangled quantum states. Specifically, spatially-separated parties who perform local measurements on a maximally-entangled state using randomly chosen measurement bases can, with significant probability, generate nonclassical correlations that violate a Bell inequality. For n parties using a Greenberger-Horne-Zeilinger state, this probability of violation rapidly tends to unity as the number of parties increases.
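The two-party case described above can be illustrated numerically. The sketch below (not part of the abstract) estimates the probability that four randomly chosen measurement directions violate the CHSH inequality $|S| \le 2$, assuming the standard singlet-state correlation $E(\mathbf{a},\mathbf{b}) = -\mathbf{a}\cdot\mathbf{b}$; the sample size and random seed are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unit_vectors(n):
    """Draw n directions uniformly on the unit sphere (isotropic Gaussian trick)."""
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

N = 100_000  # number of random setting choices to sample

# Each party picks two random measurement directions: (a, a') and (b, b').
a, a2, b, b2 = (random_unit_vectors(N) for _ in range(4))

# Singlet-state correlation: E(a, b) = -a . b
def E(x, y):
    return -np.sum(x * y, axis=1)

# CHSH combination; local hidden-variable models require |S| <= 2.
S = E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2)

frac = np.mean(np.abs(S) > 2)
print(f"Fraction of random settings violating CHSH: {frac:.3f}")
```

The estimated fraction is well above zero (a sizeable minority of uniformly random settings violate the inequality), and all sampled values of $|S|$ stay below the Tsirelson bound $2\sqrt{2}$, as the quantum correlation function requires.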
How can we describe a device that takes two unknown operational boxes and, conditionally on some input variable, connects them in different orders? To answer this question, I will introduce maps from transformations to transformations within operational probabilistic theories with purification, and show their characterisation in terms of operational circuits. I will then proceed to explore the hierarchy of maps on maps. A particular family of maps in the hierarchy are the
In this talk I will report on recent work [arXiv:0908.1583], which investigates general probabilistic theories where every mixed state has a purification, unique up to reversible channels on the purifying system. The purification principle is equivalent to the existence of a reversible realization for every physical process, namely, to the fact that every physical process can be regarded as arising from a reversible interaction of the input system with an environment that is eventually discarded.