LPipe - LRIS automated reduction pipeline
LPipe is a fully automated, end-to-end, high-level IDL pipeline for processing single-object longslit and imaging data taken with the Low Resolution Imaging Spectrograph (LRIS) on Keck I.
The pipeline is designed to produce complete reductions (flux-calibrated 1D spectra and stacked imaging frames) from raw LRIS data without any user input (in standard cases).
Download complete tarball
Installation guide
Detailed usage instructions
Submitted paper
Reduced data archive
Details:
The pipeline was developed largely to secure immediate classifications of optical transients discovered by the Palomar Transient Factory while at the telescope (to inform real-time decisions about further follow-up), and to uniformly process deep imaging of large numbers of separate fields from a multi-year host-galaxy survey. However, it should be useful to any LRIS user observing individual sources (multi-slit modes are not yet supported): frequent users who need to reduce large volumes of survey data quickly, as well as new users looking to reduce data without investing large amounts of time in developing their own tools.
The codebase is written entirely in IDL, with the exception of an astrometric routine written in Python (used for imaging alignment). SWarp and SExtractor are also required for the imaging pipeline.
LPipe takes approximately 15 minutes to process a single night of spectroscopic observations on a standard modern workstation (depending on the binning modes used and the nature of the observations, it can be faster or slower). Processing a full night of imaging takes longer (up to 2 hours).
The code is under continuous development to improve the quality of the output and the robustness of the data processing, so check back on occasion for updates and bug fixes.
Features:
- End-to-end reductions: converts raw FITS files to fully calibrated and stacked images/spectra.
- Very user-friendly (simple installation and one-line reduction; see the sketch after this list).
- All intermediate data products are in a standard, intelligible format. Users desiring a custom reduction can still use the pipeline for the early steps (such as bias subtraction, flat-fielding, and cropping) without running the later steps.
- Interruptible - if halted during reduction, the program can later resume where it left off.
- Recognizes many common user/telescope errors and data quality issues; excludes or corrects these frames.
- Support for arbitrarily windowed imaging data, and for standard binning modes for spectroscopic data.
- Complete photometric calibration of imaging fields using off-field Landolt/SDSS standards.
- Complete flux-calibration and red+blue connection/scaling for spectroscopic observations.
- Provides refined wavelength solutions using night-sky lines.
- Provides extensive header information / metadata for all files.
- Provides a GUI for precision control of spectroscopic extraction if desired.
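As a rough illustration of the one-line reduction mentioned above, the sketch below assumes that the top-level routine shares the pipeline's name (lpipe) and is run from an IDL session started in the directory containing the raw data; the data directory and the step keyword shown are hypothetical placeholders, and the actual calling sequence and options are described in the detailed usage instructions linked above.

    ; Minimal sketch of a one-line reduction (assumed calling sequence --
    ; see the usage instructions above for the real options).
    cd, '/data/lris/20190101'   ; hypothetical directory of raw LRIS FITS files
    lpipe                       ; process everything, end to end

    ; To stop after the early steps (e.g. flat-fielding) and continue with
    ; custom routines, a step list might be given (keyword name hypothetical):
    ; lpipe, step=['prepare', 'flatten']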
Limitations:
- Not yet compatible with any multi-slit modes or with spectropolarimetry.
- Support for historical LRIS-R1 (pre-upgrade) data, and for spectroscopy of sources landing on the secondary CCD, is limited.
- The imaging reduction does not yet solve for distortion parameters - only a basic rectilinear solution is provided. If highly accurate astrometry is important (e.g. for mosaicking), images should be passed to another program for a more refined solution (see the sketch after this list).
- When confronted with spectra of faint sources, or sources in the presence of complex backgrounds (e.g. faint objects in nearby galaxies or the Galactic plane), the program may extract the wrong object, or the background subtraction may be affected by contamination from an unrelated source. (If this happens, there are ways to help guide it, described in the documentation above.)
- Since no step requires human input or verification, the code can fail or produce highly (or subtly) inaccurate results if confronted with unexpected circumstances, such as erroneous header information or instrument problems. While it can identify (and exclude or compensate for) most forms of bad data, such as saturated standards or mixed-up instrument settings, users are encouraged to check each stage of the pipeline to verify expected behavior. Any "anomalous" behavior seen in pipeline results should be checked especially carefully.
- For similar reasons, pipeline spectra are unlikely to match the quality achievable with a careful, user-intensive reduction and extraction: the program may not always choose the correct apertures or determine the wavelength solution as precisely as is technically possible, and the extraction has been optimized for robust tracing of faint objects rather than for maximum S/N. Users interested in very weak features, the best possible sky-line subtraction, or maximum S/N may wish to incorporate other routines.
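Regarding the astrometric limitation above, one possible approach (a minimal sketch, not part of LPipe) is to pass a stacked image to an external distortion-aware solver such as astrometry.net's solve-field; the file name below is a placeholder.

    ; Hypothetical example: refine the astrometry of a pipeline-stacked image
    ; with astrometry.net's solve-field (assumes it is installed with suitable
    ; index files; 'coadd_R.fits' is a placeholder file name).
    spawn, 'solve-field --overwrite coadd_R.fits'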
Required software:
Needed for all tasks:
GSFC IDLastro library
Needed for imaging only:
autoastrometry.py (needed for astrometric calibration)
SExtractor (needed for astrometric/photometric calibration)
SWarp (needed to coadd images)
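As a quick sanity check before starting an imaging reduction, the external dependencies can be verified from within an IDL session on a Unix-like system; the sketch below assumes SExtractor and SWarp are installed under their usual executable names ('sex' and 'swarp'), which may differ on some installations.

    ; Minimal sketch: confirm that SExtractor and SWarp are on the path
    ; before running the imaging pipeline ('sex' and 'swarp' are the usual
    ; executable names, but may differ per installation).
    spawn, 'which sex', sexpath
    spawn, 'which swarp', swarppath
    if sexpath[0] eq '' then print, 'SExtractor (sex) not found in the path.'
    if swarppath[0] eq '' then print, 'SWarp not found in the path.'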
Feedback:
Please don't hesitate to e-mail any questions or comments to Daniel Perley (d.a.perley[at]ljmu.ac.uk).