This series consists of talks in the area of Quantum Gravity.
I comment on significant recent developments relevant to proposals I presented in previous PI seminars. The Fermi/GLAST space telescope has reported observations that would naturally fit previous formalizations of Planck-scale-induced in-vacuo dispersion (though they also admit several other explanations). And the unexplained excess noise found at the GEO600 interferometer is just of the type previously described by phenomenological models of spacetime foam (but may well have quite different causes).
We derive geometric correlation functions in the new spinfoam model using coherent-state techniques, making contact with quantum Regge calculus and perturbative quantum gravity. In particular, we recover the expected scaling with distance for all components of the propagator. We expect the same technique to be well suited to other spinfoam models.
In this talk I will review how ideas borrowed from perturbative Quantum Gravity and Effective Field Theory (EFT) in Particle Physics can be applied to problems in General Relativity (GR), such as calculating gravitational wave emission by inspiralling spinning binary systems, including finite size effects and absorption. I will discuss in somewhat more detail how to account for dissipative effects, where the GR/EFT duality is used to predict the power loss due to absorption in the dynamics of binary spinning Black Holes.
In this talk I will describe a topos formulation of consistent histories obtained using the topos reformulation of standard quantum mechanics put forward by Doering and Isham. Such a reformulation leads to a novel type of logic with which to represent propositions. In the first part of the talk I will introduce the topos reformulation of quantum mechanics. I will then explain how this reformulation can be extended so as to include temporally-ordered collections of propositions, as opposed to single-time propositions.
The handling of the constraints on initial data is a major issue in most canonical formulations of general relativity. Since the 1960s, unconstrained initial data for GR living on null hypersurfaces have been known, but no canonical formulation based on these data was developed, due to conceptual and technical difficulties. I will explain how these difficulties have been overcome and outline the resulting canonical framework.
Using 2-dimensional CGHS black holes, I will argue that information is not lost in the Hawking evaporation because the quantum space-time is significantly larger than the classical one. I will begin with a discussion of the conceptual underpinnings of the problem and then introduce a general, non-perturbative framework to describe quantum CGHS black holes. I will show that the Hawking effect emerges from it in a first approximation.
As is well known, cosmic ray experiments can put strong constraints on possible Lorentz Invariance Violations. In particular, the presence of the so-called GZK 'cut-off' may indicate that protons do propagate in the Universe as expected from relativistic invariance. The presence of this feature in the spectrum has been convincingly indicated by the HiRes and Auger experiments, while the Auger Observatory has provided evidence for the correlation of Ultra High Energy Cosmic particles with nearby sources, as predicted by the GZK feature.
Motivated by the analogy proposed by Witten between Chern-Simons theories and Wess-Zumino-Witten conformal field theories, we explore a new way of computing the entropy of a black hole, starting from the isolated horizon framework in Loop Quantum Gravity. The results seem to indicate that this analogy works in this particular case. This could be a good starting point for the search for a deeper connection between the description of black holes in LQG and a conformal field theory.
I discuss a class of compact objects ('monsters') with more entropy than a black hole of the same ADM mass. Such objects are problematic for AdS/CFT duality and for the conventional interpretation of black hole entropy as a count of microstates. Nevertheless, monster initial data can be constructed in semi-classical general relativity without requiring large curvatures or energy densities.