To improve the EHT's sensitivity, resolution, and imaging fidelity, one might enhance station collecting area, add stations, or increase bandwidth. The last of these, bandwidth, is probably the most accessible. For many stations, including the Submillimeter Array (SMA), the bandwidth is limited not by the receiver but by the digital signal processing. This talk will describe digital development at the SMA. The upgraded SMA will contribute directly to the EHT, and the technology under development has the potential to benefit other EHT stations as well.
The EHT Wiki is the primary vehicle for communication with the project. The site contains pages on science investigations, algorithmic development, new hardware, and staging information for observations and data processing. This talk will introduce the Wiki and walk through its various sections to show how it is used to help organize the EHT.
You've just finished running your code, and you're certain that you (and only you) know exactly what the region around a supermassive black hole looks like. You could lie back and wait for the accolades to roll in, but why not take an extra moment to make testable predictions that even the observers can understand? Synthetic data can help.
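One way to make such testable predictions is to turn a model image into synthetic observables. A minimal sketch, assuming a toy point-source model and invented (u,v) sampling points and noise level (none of these come from the abstract):

```python
import numpy as np

# Hedged sketch: sample the Fourier transform of a model image at a few
# (u, v) points and add Gaussian thermal noise, producing synthetic
# visibilities an observer could compare against. The model, baselines,
# and noise level are illustrative inventions.

def synthetic_visibilities(image, uv, sigma=0.01, seed=0):
    """Sample the DFT of `image` at fractional (u, v) cycles/pixel, plus noise."""
    rng = np.random.default_rng(seed)
    n = image.shape[0]
    y, x = np.mgrid[0:n, 0:n]
    vis = np.array([(image * np.exp(-2j * np.pi * (u * x + v * y))).sum()
                    for u, v in uv])
    noise = sigma * (rng.standard_normal(len(uv))
                     + 1j * rng.standard_normal(len(uv)))
    return vis + noise

model = np.zeros((32, 32))
model[16, 16] = 1.0                       # toy "model": a unit point source
uv = [(0.0, 0.0), (0.1, 0.05), (0.25, 0.2)]
vis = synthetic_visibilities(model, uv)
```

The zero-spacing visibility recovers the total flux, and a point source keeps unit visibility amplitude on every baseline, which makes the toy case easy to sanity-check.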
Scattering in the tenuous interstellar plasma blurs the image of Sgr A*. This effect decreases steeply with increasing frequency and becomes subdominant to the intrinsic emission structure at wavelengths close to a millimeter. I will discuss recent work that demonstrates how we can invert the blurring when properties of the scattering are known. With this technique, we can reconstruct the unscattered image of Sgr A* using EHT data. I will also show why some EHT observables -- such as closure phase and fractional polarization -- are largely immune to scattering.
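The key property behind this inversion is that ensemble-average scattering multiplies each visibility by a deterministic kernel, so dividing the observed visibilities by the kernel recovers the intrinsic ones. A minimal sketch, assuming an illustrative axis-aligned Gaussian kernel (not the measured Sgr A* scattering law) and invented baselines:

```python
import numpy as np

# Hedged sketch of visibility-domain "deblurring": observed visibility =
# intrinsic visibility x scattering kernel, so division inverts the blurring
# when the kernel is known. All parameters below are illustrative.

def gaussian_kernel(u, v, fwhm_maj, fwhm_min):
    """Visibility-domain Gaussian scattering kernel (axes aligned, for brevity)."""
    sig = lambda fwhm: np.pi * fwhm / np.sqrt(2.0 * np.log(2.0))
    return np.exp(-((u * sig(fwhm_maj))**2 + (v * sig(fwhm_min))**2) / 2.0)

u = np.array([1.0e9, 3.0e9])                  # baseline lengths in wavelengths
v = np.array([0.5e9, 2.0e9])
v_intrinsic = np.array([0.8 + 0.1j, 0.3 - 0.2j])

# Illustrative scattering FWHMs of ~20 and ~10 microarcseconds, in radians
kernel = gaussian_kernel(u, v, fwhm_maj=1.0e-10, fwhm_min=0.5e-10)

v_observed = v_intrinsic * kernel             # blurring attenuates long baselines
v_recovered = v_observed / kernel             # division inverts the blurring
```

Because this kernel is real and positive, it cancels identically in closure phases and in fractional polarization, which is one way to see why those observables are largely immune to scattering.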
Angular resolution is fundamental to imaging the event-horizon-scale structure of supermassive black holes, and to achieve the highest angular resolution ever realized, the EHT has been developing a worldwide (sub)mm VLBI array through international collaboration. To boost the imaging capability of the EHT, we have been developing a new imaging method based on the technique known as "sparse modeling," which allows us to solve directly the ill-posed Fourier-transform equations that arise from incomplete sampling of visibilities. We show that the image
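A common concrete form of sparse modeling is the LASSO, solved here by iterative soft thresholding (ISTA) on a toy 1-D problem; the 2-D EHT case replaces the toy matrix with the undersampled 2-D Fourier operator. This is a generic sketch of the technique, not the authors' implementation:

```python
import numpy as np

# Hedged sketch of sparse-modeling reconstruction: solve the ill-posed
# visibility equation y = A x (A = undersampled Fourier matrix) by LASSO,
#   min_x  (1/2)||y - A x||^2 + lam ||x||_1,
# via iterative soft thresholding (ISTA). Toy 1-D stand-in for the 2-D case.

rng = np.random.default_rng(0)
n, m = 64, 20                               # 64 image pixels, 20 sampled visibilities
x_true = np.zeros(n)
x_true[[5, 30, 47]] = [1.0, 0.6, 0.8]       # sparse "image"

freqs = rng.choice(n, size=m, replace=False)          # incomplete (u,v) sampling
F = np.exp(-2j * np.pi * np.outer(freqs, np.arange(n)) / n) / np.sqrt(n)
y = F @ x_true                                        # observed visibilities

def ista(A, y, lam=0.01, steps=500):
    L = np.linalg.norm(A.conj().T @ A, 2)             # Lipschitz constant of gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        g = x - (A.conj().T @ (A @ x - y)).real / L   # gradient step on the data term
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold
    return x

x_hat = ista(F, y)                                    # sparse reconstruction
```

Even with only 20 of 64 Fourier coefficients sampled, the L1 penalty selects the sparse solution that an unregularized least-squares fit cannot single out.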
Sgr A* regularly flares in the X-ray and near-IR on ~hour timescales, and the EHT has already detected interday variability in the 1.3 mm emission on both long and short baselines. The addition of highly sensitive long baselines in 2015 will allow the resolution of time-variable structure on sub-minute timescales. This opportunity to observe dynamical processes on event-horizon scales comes with the challenge of sparse visibility coverage, but several strategies can recover rich information from the limited samples.
Maximizing the science return on the Event Horizon Telescope project requires fitting models for spatially resolved black hole images to the data. These images can be calculated from accretion and jet theory, but theoretical uncertainties lead to systematic errors in the predicted images. In many cases, however, the images are dominated by the combined effects of Doppler beaming and light bending, leading to a characteristic “crescent” shape. I will discuss a geometric crescent model for black hole images based on these effects.
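A simple geometric realization of such a crescent is a uniform disk minus a smaller, offset disk: the displacement of the hole mimics the brightness asymmetry from Doppler beaming, and the ring of remaining emission mimics light bending. A minimal sketch with illustrative parameters (not a fit to any data):

```python
import numpy as np

# Hedged sketch of a geometric "crescent" image: outer uniform disk minus a
# smaller, offset inner disk. Radii and offset (in pixels) are illustrative.

def crescent(n=128, r_out=40.0, r_in=25.0, offset=10.0):
    """Return an n x n crescent image normalized to unit total flux."""
    yy, xx = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
    outer = (xx**2 + yy**2) <= r_out**2
    inner = ((xx - offset)**2 + yy**2) <= r_in**2   # hole shifted toward one side
    img = (outer & ~inner).astype(float)            # thin bright side, thick dim side
    return img / img.sum()                          # normalize total flux to 1

img = crescent()
```

With only four parameters (size, thickness, offset, orientation once rotation is added), such a model can be fit to visibility data far more cheaply than full accretion-flow simulations.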
We are now two years into the development of hardware and software for phasing ALMA, with emphasis on Band 6 (1.2 mm) but applicability to Bands 3 and 7. Central to the effort is software that continuously calculates and applies a phasing solution, together with the phasing interface cards (PICs) and other correlator upgrades that apply the solution, format the data, and extract the correlated data stream. Upstream of the correlator is a hydrogen maser, which is the new ALMA time standard.
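The core phasing operation can be sketched simply: rotate each antenna's voltage stream by the current phase solution and sum coherently, so the array behaves as a single large aperture. The per-antenna phases below are invented for illustration; in the real system the solution is derived continuously from the correlator:

```python
import numpy as np

# Hedged sketch of coherent phasing: apply a per-antenna phase solution to
# each voltage stream, then sum. Signals and phases here are synthetic.

rng = np.random.default_rng(1)
n_ant, n_samp = 8, 1024
signal = rng.standard_normal(n_samp)                  # common astronomical signal
phases = rng.uniform(-np.pi, np.pi, n_ant)            # unknown per-antenna phases
streams = signal[None, :] * np.exp(1j * phases)[:, None]   # corrupted streams

phase_solution = -phases                              # what the phasing software solves for
phased_sum = (streams * np.exp(1j * phase_solution)[:, None]).sum(axis=0)
# After correction, the n_ant streams add coherently: the summed amplitude
# scales with the number of antennas rather than its square root.
```

Without the phase correction the streams add incoherently and the sensitivity gain of the full array is lost, which is why the solution must track the atmosphere continuously.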
Over the next few years, the Event Horizon Telescope will greatly expand its capabilities, from the current 8 Gbps at a few sites to 64 Gbps at a much larger number of sites. The good news is that this processing can proceed through four independent 16 Gbps processing stages, each covering 2 GHz of bandwidth. The bad news is that EHT stations come in multiple "flavors," each posing its own issues for correlation with DiFX (the current correlation option). This talk will discuss some of the issues and the road ahead.
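The arithmetic linking 2 GHz of bandwidth to 16 Gbps per stage follows from standard VLBI recording conventions; a minimal sketch, assuming Nyquist sampling, 2-bit quantization, and dual polarization (typical values, not parameters stated in the abstract):

```python
# Hedged sketch: how 64 Gbps decomposes into four 16 Gbps stages, assuming
# Nyquist sampling, 2-bit quantization, and dual polarization. These are
# common VLBI conventions; exact EHT recording parameters may differ.

def vlbi_data_rate_gbps(bandwidth_ghz, bits_per_sample=2, n_pol=2):
    """Recorded bit rate for one processing stage, in Gbps."""
    sample_rate_gsps = 2 * bandwidth_ghz   # Nyquist: 2 samples per Hz of bandwidth
    return sample_rate_gsps * bits_per_sample * n_pol

per_stage = vlbi_data_rate_gbps(2.0)       # one 2 GHz stage
total = 4 * per_stage                      # four independent stages
```

Under these assumptions each 2 GHz stage records 16 Gbps, and the four stages together reach the 64 Gbps aggregate rate.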