Time Domain Astronomy at High Noon
Time domain astronomy (TDA) is at once one of the oldest areas of astronomy and one of the newest. Historically, novae, through a relation between speed of decline and peak luminosity, were seen as a way to measure distances, particularly to what we now call other galaxies. However, the period-luminosity relation for Cepheids turned out to be superior [Leavitt & Pickering 1908] and paved the way to settling the size of the Galaxy [Shapley 1918] and determining the distance to our neighboring galaxy, M31 [Hubble 1925]. Fritz Zwicky is properly credited with starting the field of observational supernova research which, unlike the study of variable stars, required hurly-burly action on the part of astronomers at short notice (not to mention the subsequent follow-up activity). Zwicky’s progression in this field, from a simple telescope atop the Robinson Laboratory at Caltech (used to search for supernovae in the Virgo cluster) to the 48-inch Palomar telescope, is well known. What is far less known is Zwicky’s excellent grasp of the foundations of time domain astronomy methodology: he pioneered the use of wide-field telescopes, automation, advanced detectors, real-time detection, massively multiplexed spectroscopy (via large objective gratings) and event classification (which we would now call Machine Learning, not very different from his Morphological method). In short, he understood what it takes to undertake time domain astronomy. A serious student of TDA methodology would benefit from reading F. Zwicky’s article in Supernovae and Supernova Remnants [Space Science Library, 45, 1 (1974)].
The second half of the last century was an era of sky surveys. It began with the Palomar Observatory Sky Survey and is still continuing with ground-based facilities (optical, NIR, radio) and space missions (long-wavelength IR, X-ray, gamma-ray). At the turn of the century, ambitious optical surveys began undertaking time domain surveys: repeated visits to the same piece of sky. Optical TDA is now the most mature, and with some reserve I would say that optical surveys are at “high noon”. Down to about 21st magnitude, the optical sky is receiving considerable attention every usable night. In about two years, the Vera C. Rubin Observatory will push to fainter magnitudes, every usable night.
We should bear in mind that TDA really comprises three distinct methodologies: transient objects, moving objects and variable objects (stars, AGN). Each methodology has its own demands for cadence, analysis and follow-up resources. My primary expertise is transient objects, and so the rest of this note will largely focus on that topic. Let me start the discussion by considering the most common transients in the sky: supernovae. Supposedly, there is a supernova every second somewhere in our Universe. The number of candidates observed by ongoing surveys is easily in excess of ten thousand per year. So, the zeroth-order question: what fraction of this deluge needs classification (i.e. determining that a candidate is a supernova and classifying it at a gross level, say Ia, II, Ibc)? The current global capacity for this exercise is 2,000 per year. This number seems adequate for undertaking demographic studies in a robust manner. Next, we need to determine what fraction of these need detailed studies. Such studies necessarily require large telescopes equipped with sensitive spectrometers. The number of identified supernovae is so large that the limiting factor is the ability to secure observing time on Gemini, Keck, the VLT, etc. In fact, as we go to press, my associates received a grim verdict from the Gemini Time Allocation Committee. The allocation process is, in the mean, fair, but on any given occasion it is subject to fluctuation noise. Understandably, my (and other) graduate students and post-doctoral fellows are quite stressed by the inability to count on a steady allocation to pursue their theses in a deterministic fashion. This situation will only be exacerbated once the Legacy Survey of Space and Time (LSST) begins routine operation.
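The bookkeeping above can be made explicit. A minimal sketch, using only the figures quoted in the text (the candidate count is a lower bound, so the resulting fraction is an upper limit):

```python
# Rough supernova-classification bookkeeping, with the figures quoted above.
candidates_per_year = 10_000      # candidates from ongoing surveys (a lower bound)
classification_capacity = 2_000   # current global classifications per year

frac = classification_capacity / candidates_per_year
print(f"Fraction of candidates that can be classified: {frac:.0%}")  # → 20%
```

At most one in five candidates can be classified even at a gross level, which is why triage (deciding which fraction merits detailed study) matters so much.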
Now let me switch to gamma-ray bursts (GRBs), a field I worked in some two decades ago. I looked around for a one-stop shop for all inquiries, and found many websites that were either specific to a mission or whose authors, presumably drenched (hopefully not drowned) by the deluge of GRBs, had stopped updating them. So, in the absence of sound data, I resorted to modeling and theory.
A useful number is that there are about one thousand GRBs per year that are potentially detectable. Even if one ignores all the GRBs found before the start of the BeppoSAX revolution (1997), some 33,000 GRBs have occurred since. Of these, Swift detects and localizes about 100 per year and Fermi/GBM detects about 235 per year. In short, at first blush, one would conclude that GRBs are a mature field and that progress will be linear. However, what surprised me is the amazing (practically exponential) growth of meter-class telescopes, many robotic and some with large fields of view (FoV), that jump in with every GRB localization. This growth in hardware and observing capacity is a harbinger of good news for the forthcoming Chinese-French GRB mission, SVOM (Space Variable Objects Monitor).
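Putting the quoted rates side by side makes the point sharper. A back-of-envelope sketch (all numbers are the ones quoted above, not new data):

```python
# GRB detection bookkeeping, using the rates quoted in the text.
all_sky_rate = 1000   # potentially detectable GRBs per year
swift_rate = 100      # detected and localized by Swift, per year
gbm_rate = 235        # detected by Fermi/GBM, per year

frac = (swift_rate + gbm_rate) / all_sky_rate
print(f"Fraction of potentially detectable GRBs captured: {frac:.1%}")
```

Current missions capture only about a third of the potentially detectable bursts, which is part of why a new wide-field mission such as SVOM still has room to contribute.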
This brings me to the important topic of rapid follow-up. Observers have long been acutely aware of the wonderful diagnostics and science returns from rapid follow-up. The circum-burst region is a direct clue to the progenitor. For instance, in one of the proposed channels for Ia supernovae (the “double-degenerate” model), there is no expectation of a rich circumstellar medium. On the other hand, for another proposed channel (the “single-degenerate” model, in which a white dwarf accretes matter from a donor star, reaches a critical mass and then explodes), the circumstellar medium is expected to be rich, with a density profile decreasing as the inverse square of the distance to the progenitor. Early observations are the key to distinguishing the two models.
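For the reader who wants the one-line justification of the inverse-square profile (standard wind physics, not spelled out in the text): if the donor star loses mass at a steady rate $\dot{M}$ in a wind of speed $v_w$, then the same mass flux crosses every spherical shell, so

```latex
\dot{M} = 4\pi r^{2} \rho(r)\, v_w
\quad\Longrightarrow\quad
\rho(r) = \frac{\dot{M}}{4\pi r^{2} v_w} \propto r^{-2}.
```

The earlier a transient is observed, the smaller the radius the ejecta have reached and the denser the medium being probed, which is why the first hours carry so much diagnostic weight.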
It appears to me that there has been fantastic growth in the global capacity to attend to early-time photometric observations, judging from even a casual reading of GCN circulars. There are great opportunities for individuals and observatories to capitalize on the ongoing optical TDA revolution by building rapid-response spectrographs. ZTF provides alerts within minutes of images being obtained at the Palomar 48-inch Schmidt telescope. Sophisticated users and brokers are in a position to arrive at rapid conclusions, and increasingly the science returns will go, in part, to those who can respond rapidly to interesting events. This is a good time not only to profit from low-hanging fruit but also to build a system to respond to alerts from LSST.
ZTF Principal Investigator and Professor of Astronomy & Planetary Science at the California Institute of Technology