This series consists of talks in the area of Foundations of Quantum Theory. Seminar and group meetings will alternate.
Conventional quantum mechanics answers the question of what a wavefunction is by specifying the required mathematical properties of wavefunctions and invoking the Born postulate. The ontological question remains unanswered. There is one exception to this. A variation of the Feynman chessboard model allows a classical stochastic process to assemble a wavefunction, based solely on the geometry of spacetime paths. A direct comparison of how a related process assembles a probability density function reveals both how and why PDFs and wavefunctions differ from the perspective of an underlying kinetic theory.
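The path-assembly idea can be illustrated with a toy checkerboard-style sum. This is only a minimal sketch, not the speaker's actual construction: a particle hops one lattice site left or right per time step, and each path is weighted by (i·eps) per direction reversal, where `eps` stands in for the mass-dependent corner weight of the Feynman chessboard model.

```python
from itertools import product
from collections import defaultdict

def chessboard_amplitudes(n_steps, eps=0.1):
    """Sum toy chessboard path weights on a 1+1D lattice.

    Each path moves one site left or right per time step; a path
    with k direction reversals ("corners") carries weight
    (1j * eps)**k.  Summing these weights of purely classical
    paths, grouped by endpoint, yields a complex amplitude-like
    function of position.
    """
    amps = defaultdict(complex)
    for steps in product((-1, 1), repeat=n_steps):
        # count corners: adjacent steps in opposite directions
        corners = sum(1 for a, b in zip(steps, steps[1:]) if a != b)
        amps[sum(steps)] += (1j * eps) ** corners
    return dict(amps)
```

For two time steps the corner-free paths reach the light-cone endpoints ±2 with weight 1, while the two one-corner paths meet at the origin and add to 0.2i, showing how imaginary components emerge from pure path counting.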
Many statistics problems involve predicting the joint strategy that will be chosen by the players in a noncooperative game. Conventional game theory predicts that the joint strategy will satisfy an "equilibrium concept". The relative probabilities of the joint strategies satisfying the equilibrium concept are not given, and all joint strategies that do not satisfy it are given probability zero. As an alternative, I view the prediction problem as one of statistical inference, where the "data" includes the details of the noncooperative game.
I show that physical devices that perform observation, prediction, or recollection share an underlying mathematical structure. I call devices with that structure "inference devices". I present a set of existence and impossibility results concerning inference devices. These results hold independent of the precise physical laws governing our universe. In a limited sense, the impossibility results establish that Laplace was wrong: even in a classical, non-chaotic universe, the future cannot be unerringly predicted, no matter how complete one's knowledge of the present.
The unparalleled empirical success of quantum theory strongly suggests that it accurately captures fundamental aspects of the workings of the physical world. The clear articulation of these aspects is of inestimable value --- not only for the deeper understanding of quantum theory in itself, but for its further development, particularly for the development of a theory of quantum gravity.
TBA
We start by studying the non-computational geometry of fractionally-dimensioned, measure-zero, dynamically-invariant subsets of phase space associated with certain deterministic nonlinear dissipative dynamical systems. Then, by studying the asymptotic states of the Hawking Box, the existence of such invariant subsets is conjectured for gravitationally-bound systems. The argument hinges on the phase-space properties of black holes. Following Penrose, it is assumed that phase-space volumes shrink when the contents of the Hawking Box contain black holes.
A simple theorem of Dirac identifies primary first-class constraints as generators of transformations 'that do not affect the physical state'. This result has profound implications for the definition of physical states and observables in the quantization of constrained systems, and leads to one aspect of the infamous 'problem of time' in quantum gravity. As I will discuss, a close look at the theorem reveals that it depends crucially on the assumption of an absolute time.
After using the complex Hilbert space formalism for quantum theory for so long, it is very easy to take for granted features like projection operators and the projection postulate, the algebra of observables, symmetric transition probabilities, linear evolution, and so on. Over the past 50 years there have been many attempts to gain a better understanding of this formalism by reconstructing it from different kinds of (sometimes) physically motivated assumptions.
Theoretical and experimental results on the Quantum Injected Optical Parametric Amplification (QI-OPA) of optical qubits in the high gain regime (g > 6) are reported. The entanglement of the related Schroedinger Cat-State (SCS) is demonstrated, as well as the establishment of phase-covariant quantum cloning for a macrostate consisting of about 10^6 particles. In addition, the violation of the CHSH inequality has been realized experimentally.
Domains were introduced in computer science in the late 1960s by Dana Scott to provide a semantics for the lambda calculus (the lambda calculus is the basic prototype for a functional programming language, e.g. ML). The study of domains with measurements was initiated in the speaker's thesis: a domain provides a qualitative view of information, expressed in part by an 'information order', while a measurement on a domain expresses a quantitative view of information with respect to the underlying qualitative aspect.