Computational Methods at Perimeter

Conference Date: Thursday, February 9, 2012 (All day)
Scientific Areas: Other

 

Computational methods are tools used in virtually all areas of science. This one-day workshop aims to bring together Perimeter researchers from different areas to discuss the computational tools they use in their everyday work -- even when research fields differ, computational methods are often similar. At Perimeter, researchers employ a wide range of algorithms, from traditional number crunching to novel graph-processing methods, and these may be implemented either in large collaborative codes or in exploratory Maple/Mathematica/Python scripts.

 

We invite participants from all areas of research at Perimeter. Speakers should aim to give an overview of the computational methods they employ, without requiring in-depth knowledge of a particular field. The workshop also aims to create a user community for the new computing hardware available at Perimeter.

 

 

Speakers:

Marcus Appleby, Perimeter Institute

Avery Broderick, Perimeter Institute

Lukasz Cincio, Perimeter Institute

Bianca Dittrich, Perimeter Institute

Chad Hanna, Perimeter Institute

Matt Johnson, Perimeter Institute

David Rideout, UCSD

Erik Schnetter, Perimeter Institute

Rafael Sorkin, Perimeter Institute

Pedro Vieira, Perimeter Institute

Itay Yavin, Perimeter Institute

 

Participants:

Marcus Appleby, Perimeter Institute

Enrico Barausse, University of Guelph

Oliver Buerschaper, Perimeter Institute

Lukasz Cincio, Perimeter Institute

Bianca Dittrich, Perimeter Institute

Nikolay Gromov, Kings College London

Chad Hanna, Perimeter Institute

Matt Johnson, Perimeter Institute

Luis Lehner, Perimeter Institute

Steve Liebling, Long Island University

Mozhgan Mir, Perimeter Institute

Rob Myers, Perimeter Institute

Marcelo Ponce, University of Guelph

Sohrab Rahvar, Perimeter Institute

Sayeh Rajabi, Perimeter Institute

David Rideout, UCSD

Erik Schnetter, Perimeter Institute

Rafael Sorkin, Perimeter Institute

Pedro Vieira, Perimeter Institute

Guifre Vidal, Perimeter Institute

Itay Yavin, Perimeter Institute

Miguel Zilhao, Universidade do Porto

 

Talk Abstracts:

Marcus Appleby

Galois calculations using Magma

 

Avery Broderick

The Tyranny of Scale: Making Simple Problems Hard

Large-scale computational resources have made complicated, non-linear PDEs tractable, as evidenced by their application in hydrodynamic, MHD, plasma, and even GR simulations. However, these facilities are also proving crucial for solving deceptively complicated systems of coupled ODEs. I will discuss the challenges posed by radiative transfer in accreting black hole environments and how some of these are solved via a combination of algorithms, tricks, hacks, and lots of CPUs.
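
As a toy illustration of the coupled-ODE side of the problem, the radiative transfer equation along a single ray, dI/ds = j - alpha*I, can be integrated with an off-the-shelf solver. This is a minimal sketch: the constant emissivity j and absorption coefficient alpha are illustrative stand-ins, not the talk's accretion model, and a real calculation integrates enormous numbers of such rays with frequency-dependent coefficients, which is where the "lots of CPUs" come in.

    # Toy radiative transfer along one ray: dI/ds = j - alpha * I.
    # Constant j and alpha are illustrative only; the intensity should
    # approach the source function j/alpha = 2.0.
    import numpy as np
    from scipy.integrate import solve_ivp

    j, alpha = 1.0, 0.5

    def rhs(s, I):
        return j - alpha * I

    sol = solve_ivp(rhs, (0.0, 20.0), [0.0])
    print(sol.y[0, -1])   # ~2.0, the saturated intensity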

 

Lukasz Cincio

Tensor network algorithms

Tensor network algorithms provide highly competitive tools for analyzing ground state properties of quantum lattice models in one and two spatial dimensions. The most notable examples are matrix product states, projected entangled pair states, and the multiscale entanglement renormalization ansatz. The key idea underlying all these approaches is to decompose a quantum many-body state into a carefully chosen network of tensors.
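
As a minimal, numpy-only sketch of that key idea, a small state vector can be decomposed exactly into a matrix product state by sequential singular value decompositions. (Practical MPS codes additionally truncate the singular values to cap the bond dimension; that step is omitted here.)

    # Decompose a random 4-qubit state into an MPS by repeated SVDs.
    import numpy as np

    n, d = 4, 2                      # sites and local dimension
    psi = np.random.randn(d ** n)
    psi /= np.linalg.norm(psi)

    tensors, rest, chi = [], psi.reshape(1, -1), 1
    for site in range(n - 1):
        # group (left bond, physical index) and split one site off
        m = rest.reshape(chi * d, -1)
        u, s, vh = np.linalg.svd(m, full_matrices=False)
        tensors.append(u.reshape(chi, d, -1))   # (left, phys, right)
        rest = np.diag(s) @ vh
        chi = rest.shape[0]
    tensors.append(rest.reshape(chi, d, 1))
    print([t.shape for t in tensors])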

I will present techniques for the efficient manipulation of tensor networks and show how to make use of clusters with shared as well as distributed memory architectures. I will also demonstrate how automatic code generators can reduce the effort involved in coding tensor network algorithms.
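
To give a flavor of what such manipulations look like, the sketch below computes the (unnormalized) norm <psi|psi> of a small MPS by sweeping an environment matrix through the chain with numpy's einsum; contractions of exactly this repetitive shape are what make automatic code generation attractive. The tensors here are random placeholders.

    # Contract <psi|psi> for a 3-site MPS, left to right.
    import numpy as np

    d, chi = 2, 3
    A = [np.random.randn(1, d, chi),     # (left, phys, right)
         np.random.randn(chi, d, chi),
         np.random.randn(chi, d, 1)]

    env = np.ones((1, 1))                # (bra bond, ket bond)
    for t in A:
        # absorb one site of both bra and ket into the environment
        env = np.einsum('ab,aic,bid->cd', env, t, t)
    print(env.item())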

 

Bianca Dittrich

Coarse graining spin nets with tensor networks

I will briefly describe the challenges of extracting large-scale behavior from spin foams, which are candidate models for quantum gravity, and motivate the space of simplified models considered in this talk. I will then describe a coarse-graining method for these models based on a tensor network algorithm and briefly summarize the results.
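
The workhorse move in such coarse-graining algorithms is a truncated singular value decomposition of a tensor. The sketch below shows this step in isolation for a random rank-4 tensor, in the style of the tensor renormalization group; it is a generic illustration, not the spin net algorithm of the talk.

    # Split a rank-4 tensor T = S1 . S2 with a truncated SVD.
    import numpy as np

    D, chi = 4, 2                     # bond dimension, truncation cut
    T = np.random.randn(D, D, D, D)   # T[up, left, down, right]

    m = T.reshape(D * D, D * D)       # group (up,left) vs (down,right)
    u, s, vh = np.linalg.svd(m)
    S1 = (u[:, :chi] * np.sqrt(s[:chi])).reshape(D, D, chi)
    S2 = (np.sqrt(s[:chi])[:, None] * vh[:chi]).reshape(chi, D, D)

    # relative weight of the discarded singular values
    print(np.linalg.norm(s[chi:]) / np.linalg.norm(s))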

 

Chad Hanna

Computational methods: (Now with data)

In this informal presentation I will describe a few concepts and best practices that I have encountered working with large data sets and analyses.  I will point out a few tools that have helped me get organized and accomplish my goals.

 

Matt Johnson

Feature-finding in the cosmic microwave background

A number of theoretically well-motivated extensions of the standard cosmological model predict detectable secondary signals in the CMB. One example is the eternal inflation scenario, which in many cases predicts a set of features due to cosmic bubble collisions. Other examples include theories with topological defects, such as cosmic strings and textures, which also predict a set of features. The most unambiguous way to test such extensions of the standard cosmological model is to utilize the most general predictions for the population of sources on the full sky, and to determine the posterior probability distribution over the global parameters defining the theory (such as the total number of features expected, their intrinsic amplitude, etc.). The enormous size of modern CMB datasets, such as those obtained by the Wilkinson Microwave Anisotropy Probe (WMAP) and those currently being obtained by the Planck satellite, poses a unique challenge for such an analysis. In this talk, I will present a strategy for overcoming this challenge, and discuss the implementation of a Bayesian source detection algorithm that is currently being used to test such theories.
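
The Bayesian step can be sketched in one dimension with a toy Gaussian template and noise level (nothing below is CMB-specific, and the real pipeline works over far larger parameter spaces): given data containing one feature, compute the posterior over its amplitude on a grid, assuming a flat prior.

    # Posterior over the amplitude A of a known template in Gaussian noise.
    import numpy as np

    x = np.linspace(-5, 5, 200)
    template = np.exp(-x ** 2)              # toy feature profile
    sigma, A_true = 0.3, 0.8
    data = A_true * template + sigma * np.random.randn(x.size)

    A_grid = np.linspace(0.0, 2.0, 400)     # flat prior on A
    chi2 = ((data - A_grid[:, None] * template) ** 2).sum(axis=1) / sigma ** 2
    post = np.exp(-0.5 * (chi2 - chi2.min()))
    post /= post.sum()                      # unit-sum posterior on the grid
    print(A_grid[np.argmax(post)])          # should land near A_true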

 

David Rideout

Frameworks for Scientific Computing

A computational framework is software which facilitates the interoperability of modules. Modules are pieces of code which perform some specific task without having to know the internal details of the other modules with which they operate. The purpose of a framework is to transform the software development process from writing a monolithic code in a specific programming language, such as C++, Python, or Mathematica, into writing a collection of modules. The modules can be written in a variety of languages, often by different people. The result of the software development process is a large collection of modules, often called a community toolkit, which enables a wide spectrum of computations within a given scientific field. I will describe this process with regard to the Cactus framework, and illustrate it with examples from Loop Quantum Gravity, Causal Sets, and Coastal Modeling.
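
A hypothetical Python miniature of this idea (an illustration only, not the Cactus API): modules announce what they provide to a central registry, and a separately declared schedule wires them together, so no module refers to any other module directly.

    # A toy module registry and scheduler.
    registry = {}

    def provides(name):
        def wrap(fn):
            registry[name] = fn
            return fn
        return wrap

    @provides("grid")
    def make_grid(state):
        state["grid"] = [i * 0.1 for i in range(10)]

    @provides("field")
    def init_field(state):
        state["field"] = [x ** 2 for x in state["grid"]]

    state = {}
    for step in ["grid", "field"]:   # the schedule lives outside the modules
        registry[step](state)
    print(state["field"][:3])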

 

Erik Schnetter

Computational Relativistic Astrophysics

Solving the Einstein equations and the relativistic magneto-hydrodynamics equations numerically requires a surprising amount of mathematical preparation, a staggering amount of software engineering, and a healthy dose of raw computing power. With this extreme set of equations in mind, I will show some of the tricks and tools that are used in this field. I will also try to generalize, describing how one might approach formulating and discretizing a new PDE-based physics problem for which no good computational methods are known yet.
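
To make the "formulate and discretize" step concrete for the simplest possible case, here is a sketch for the 1-D wave equation u_tt = u_xx with fixed endpoints, using second-order centered differences in space and a leapfrog time step obeying the CFL condition dt <= dx; nothing in it is specific to the Einstein equations.

    # Leapfrog evolution of the 1-D wave equation.
    import numpy as np

    nx, dx = 201, 0.01
    dt = 0.5 * dx                             # CFL-stable time step
    x = np.arange(nx) * dx
    u = np.exp(-((x - 1.0) / 0.1) ** 2)       # Gaussian initial data
    u_prev = u.copy()                         # zero initial velocity

    for step in range(500):
        lap = np.zeros_like(u)
        lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx ** 2
        u, u_prev = 2 * u - u_prev + dt ** 2 * lap, u
    print(np.max(np.abs(u)))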

 

Rafael Sorkin

A Lisp "scratch-pad" for working with posets

Causal-set theory postulates that the deep structure of spacetime is that of a discrete partially ordered set, or "poset". But unlike a rectangular grid or even a simplicial complex, a poset does not in general break up into "local components". The Lisp programming language provides data types flexible enough to handle such objects, including posets whose structure changes during the course of a simulation. For example, one can represent an element of the poset as a Lisp "symbol" and the element's past as the "value" of that symbol. I have written a suite of Lisp functions and macros using this representation, which makes possible a large range of causal set simulations. The library is available for use by anyone.
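
For readers who do not speak Lisp, the representational idea translates directly into Python (this is an analogue of, not an excerpt from, the library described above): each element carries its own causal past, so the poset can change structure as a simulation grows it.

    # Grow a random causal set; each element stores its full past.
    import random

    pasts = {}                     # element -> set of elements to its past

    def add_element(new, parents):
        # the past is the parents plus their pasts (transitivity)
        pasts[new] = set(parents).union(*(pasts[p] for p in parents)) if parents else set()

    for n in range(20):
        parents = [m for m in range(n) if random.random() < 0.3]
        add_element(n, parents)
    print(sorted(pasts[19]))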

 

Pedro Vieira

Mathematica and Symbolic Computations

 

Itay Yavin

Computing in High Energy Physics

I will give a short survey of the types of computations and software packages used in high energy physics. I will start with computations of the most basic scattering cross-sections and end with a mention of simulations of the complex interactions of fundamental particles as they pass through a detector. If time permits, I will mention our own efforts with RECAST, a framework in which these different software packages come together to allow for a powerful recreation of searches in HEP.
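
At the "most basic" end of that spectrum, a scattering cross-section can be a one-line numerical evaluation. As a sketch, here is the tree-level QED result sigma(e+ e- -> mu+ mu-) = 4*pi*alpha^2 / (3 s), valid in the high-energy limit where the muon mass is negligible:

    # Tree-level e+ e- -> mu+ mu- cross-section at sqrt(s) = 10 GeV.
    import math

    alpha = 1 / 137.035999          # fine-structure constant
    s = 10.0 ** 2                   # squared centre-of-mass energy, GeV^2
    GEV2_TO_NB = 0.3894e6           # (hbar c)^2 in GeV^2 * nanobarn

    sigma = 4 * math.pi * alpha ** 2 / (3 * s) * GEV2_TO_NB
    print(f"sigma = {sigma:.2f} nb")   # about 0.87 nb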