9.5 Specific Tools for the Analysis of WFC3

This section describes existing tools and packages that can be used for the analysis of WFC3 data. Some of these tools are distributed as STScI affiliated packages, while others have been developed by STScI scientists for their own scientific projects, but have also been made available to the community. The latter type of software is not directly supported by the WFC3 team; thus users are directed to the software developers for assistance. For our latest software releases, refer to our Software Tools page. 

New tools for WFC3 analysis will be stored in WFC3 Notebooks, a child repository of HST Notebooks on GitHub. The goal of the repository is to organize and standardize our Jupyter notebooks by providing internal team reviews of documentation and code testing. This repository, aimed at the user community, is maintained and kept current with future software updates through continuous integration and continuous delivery (CI/CD).

9.5.1 wfc3tools

wfc3tools is a Python package containing several WFC3-specific tools. Online documentation for wfc3tools can be found on Read the Docs at:
http://wfc3tools.readthedocs.io/en/latest/index.html

The package is available on GitHub and is also distributed as part of the STScI-maintained Space Telescope Environment (stenv). wfc3tools contains the Python wrapper modules that call the calwf3 pipeline executables (whose source code is written in C), as well as other auxiliary functions. The pipeline modules (calwf3, wf3cte, wf3ccd, wf32d, wf3rej, wf3ir) are described in detail in Section 3.4 of this book, along with an example of calwf3 manual reprocessing in Section 3.5.2. Here we briefly describe the other tools. The boldface paragraph titles correspond to the module names and link to the Read the Docs pages, which contain more detailed documentation.

embedsub
Given a user-specified image containing a subarray readout, return a full-frame image with the subarray implanted at the appropriate location.
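
A minimal usage sketch (the subarray FLT file name is a placeholder):

    from wfc3tools import embedsub

    # Create a full-frame version of the subarray image, placed at its detector location
    embedsub('ibxxxxxxq_flt.fits')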

pstat
Plot statistics for a specified image section up the ramp of an IR MultiAccum image. Sections from any of the SCI, ERR, or DQ image extensions can be plotted. A choice of mean, median, mode, standard deviation, minimum, and maximum statistics is available.
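
A minimal usage sketch (the IMA file name and pixel section are placeholders; the returned arrays hold the sample times and the chosen statistic):

    from wfc3tools import pstat

    # Plot and return the default statistic for a small image section up the ramp
    time, stat = pstat('ibxxxxxxq_ima.fits[100:120,100:120]')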

pstack
Plot the stack of MultiAccum sample values for a specified pixel in an IR MultiAccum image. Pixels from any of the SCI, ERR, DQ, or TIME image extensions can be plotted.
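
A minimal usage sketch (the IMA file name and pixel coordinates are placeholders):

    from wfc3tools import pstack

    # Plot and return the sample values for a single pixel up the ramp
    time, counts = pstack('ibxxxxxxq_ima.fits', column=100, row=25)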

sampinfo
Prints information about a WFC3/IR MultiAccum image, including exposure time information for the individual samples (readouts). The global information listed (and the names of the header keywords from which it is retrieved) includes the following (a brief usage example is given after the list):

  • the total number of image extensions in the file (NEXTEND)
  • the name of the MultiAccum exposure sample sequence (SAMP_SEQ)
  • the total number of samples, including the “zeroth” read (NSAMP)
  • the total exposure time of the observation (EXPTIME).
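
A minimal usage sketch (the IMA file name is a placeholder):

    from wfc3tools import sampinfo

    # Print NEXTEND, SAMP_SEQ, NSAMP, EXPTIME, and the per-sample exposure times
    sampinfo('ibxxxxxxq_ima.fits')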

sub2full
Given a user-specified image containing a subarray readout, return the location of the corner of the subarray within a full-frame reference image (including the full physical extent of the chip), in 1-indexed pixels. If the user supplies an X and Y coordinate, the translated location of that point is returned.
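
A minimal usage sketch (the subarray FLT file name and pixel coordinates are placeholders):

    from wfc3tools import sub2full

    # Corner of the subarray in full-frame, 1-indexed pixel coordinates
    sub2full('ibxxxxxxq_flt.fits')

    # Translated full-frame location of a particular subarray pixel
    sub2full('ibxxxxxxq_flt.fits', x=200, y=200)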

9.5.2 Point Spread Function Modeling

A new WFC3 notebook, located on the HST Notebooks webpage, demonstrates how to generate PSF models for WFC3 observations. The new tool provides users with several workflows depending on their science goals and available data, including 1) downloading empirical library PSF models for high-precision stellar photometry and astrometry; 2) extracting and median-stacking stars to characterize extended PSF wing emission and diffraction spikes; and 3) querying the MAST PSF image library to retrieve and stack stars from archival observations when analyzing sparse fields with very few stars. The tool can model PSFs both for individual exposures (FLTs/FLCs) and for drizzled data products (DRZs/DRCs). While the majority of the examples are based on WFC3, the code can also model ACS and WFPC2 observations.

9.5.3 WFC3 Photometry Tools

WFC3 zeropoints          

A Jupyter Notebook (also linked in the WFC3 section of the HST Notebooks repository) shows how to use stsynphot to compute photometric keyword values such as the inverse sensitivity (PHOTFLAM), pivot wavelength (PHOTPLAM), and filter bandwidth (PHOTBW) for any WFC3 'obsmode', which is a combination of 'instrument, detector, filter, date, and aperture'. The tool also computes zeropoint values (STMAG, ABMAG, VEGAMAG) and is especially useful for VEGAMAG zeropoints, which require an input spectrum. The notebook may also be used to determine time-dependent WFC3/UVIS zeropoints for any observation date, as the values given in WFC3 ISR 2021-04 are defined for the June 2009 reference epoch (as of mid-2021, the WFC3/IR zeropoints are not time-dependent). The Python code works for both the UVIS and IR detectors, can loop over multiple filters, and optionally creates and plots the 'total system throughput' tables for each obsmode.
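
As a rough sketch of the underlying stsynphot calls (the obsmode string and observation date are placeholders; the notebook itself provides the full, supported workflow):

    import numpy as np
    import stsynphot as stsyn

    # Bandpass for a hypothetical UVIS1/F606W observation taken on MJD 59367
    bp = stsyn.band('wfc3,uvis1,f606w,mjd#59367')

    photplam = bp.pivot()                         # pivot wavelength (PHOTPLAM)
    photbw   = bp.photbw()                        # RMS bandwidth (PHOTBW)
    photflam = bp.unit_response(stsyn.conf.area)  # inverse sensitivity (PHOTFLAM)

    # STMAG and ABMAG zeropoints from the standard definitions;
    # VEGAMAG additionally requires folding the Vega spectrum through the bandpass
    stmag_zpt = -21.1 - 2.5 * np.log10(photflam.value)
    abmag_zpt = stmag_zpt - 5 * np.log10(photplam.value) + 18.6921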

UVIS time-dependent photometry           

A Jupyter Notebook (also linked in the WFC3 section of the HST Notebooks repository) shows how to work with the new time-dependent UVIS calibration for observations that span a range of dates and therefore have different photometric keyword values populated in their image headers. The notebook uses sample images of the standard star GD153 acquired at three epochs and shows how to compute aperture photometry, apply the new time-dependent PHOTFLAM keywords, and plot the corresponding countrates and magnitudes.

The unique zeropoint values must be accounted for before combining UVIS observations from multiple epochs with AstroDrizzle, and the notebook shows how to equalize the countrate values in the science array of each input FLC image prior to drizzling.
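
A minimal sketch of that equalization step, assuming hypothetical file names and adopting the first image's inverse sensitivities as the reference (the notebook gives the full, supported procedure):

    from astropy.io import fits

    # Hypothetical FLC file names from three epochs of the same target/filter
    flc_files = ['epoch1_flc.fits', 'epoch2_flc.fits', 'epoch3_flc.fits']

    # Adopt the per-chip inverse sensitivities of the first image as the reference values
    ref = {chip: fits.getval(flc_files[0], 'PHOTFLAM', extname='SCI', extver=chip)
           for chip in (1, 2)}

    for f in flc_files:
        with fits.open(f, mode='update') as hdu:
            for chip in (1, 2):
                hdr = hdu['SCI', chip].header
                ratio = hdr['PHOTFLAM'] / ref[chip]
                hdu['SCI', chip].data *= ratio     # equalize the countrates
                hdr['PHOTFLAM'] = ref[chip]        # record the adopted reference value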

WFC3 synthetic photometry examples    

A Jupyter Notebook (also linked in the WFC3 section of the HST Notebooks repository) replaces the pysynphot examples from the 2018 version of the Data Handbook and demonstrates how to use stsynphot for several use cases (a brief sketch of the second use case is given after the list):

  • Compute the inverse sensitivity, zeropoint, and encircled energy correction for any WFC3 'obsmode'
  • Renormalize a spectrum to 1 count/sec in a given bandpass and output the predicted magnitude or flux for a different bandpass
  • Determine the color transformation between two bandpasses for a given spectrum
  • Compute color terms for UV filters for a blue versus a red standard star observed on UVIS2.
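
A minimal sketch of the second use case, assuming a flat-spectrum source and placeholder bandpasses:

    import astropy.units as u
    import stsynphot as stsyn
    from synphot import SourceSpectrum, Observation, units
    from synphot.models import ConstFlux1D

    # Flat (constant f_lambda) source spectrum as a stand-in for a real SED
    sp  = SourceSpectrum(ConstFlux1D, amplitude=1 * units.FLAM)
    bp1 = stsyn.band('wfc3,ir,f125w')
    bp2 = stsyn.band('wfc3,ir,f160w')

    # Renormalize to 1 count/sec in F125W, then predict the AB magnitude in F160W
    sp_norm = sp.normalize(1 * u.ct, band=bp1, area=stsyn.conf.area)
    print(Observation(sp_norm, bp2).effstim(flux_unit=u.ABmag))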

WFC3 photometric conversion tool      

A Jupyter notebook (also linked in the WFC3 section of the HST Notebooks repository) demonstrates how to calculate photometric transformation coefficients between WFC3/UVIS wide-band filters and any other non-HST filter system for a given object spectrum. The new tool uses the latest WFC3 synthetic throughput tables and replaces functionality provided in the WFC3 Photometric Conversion Tool, which is no longer supported. For more detail on photometric transformations to other systems, see WFC3 ISR 2014-16.

Flux converter tool

A Jupyter notebook (also linked in the HST Notebooks repository) provides a framework for users to convert between multiple magnitude and flux unit systems based on a user-defined input spectrum. This tool is based on the NICMOS unit conversion form and replaces the HST Unit conversion tool, which is no longer supported. The notebook incorporates the latest WFC3/UVIS (WFC3 ISR 2021-04) and WFC3/IR (WFC3 ISR 2020-10) photometric calibration as well as recent changes in the Vega spectrum of up to ~1.5% (Bohlin et al. 2020).
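
A minimal sketch of a single unit conversion with synphot (the wavelength and flux values are placeholders; VEGAMAG conversions additionally require a Vega spectrum):

    import astropy.units as u
    from synphot import units

    # Convert a monochromatic flux density at 5500 Angstroms from FLAM to AB magnitude
    wave = 5500 * u.AA
    flux = 3.63e-9 * units.FLAM
    print(units.convert_flux(wave, flux, u.ABmag))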

9.5.4 Code for mitigation of variable IR background

Strategies for reprocessing images with variable background are described in WFC3 ISR 2016-16. Examples include 1) correcting for Helium I atmospheric emission at 1.083 microns and 2) excising reads impacted by scattered light from the bright Earth limb. This ISR provides a full description of the methods and a link to the original Python software. This software has since been implemented in WFC3 Jupyter notebooks (linked in the HST Notebooks repository) for reprocessing IR images with variable background, as described below. Additional examples are provided in the Appendix of WFC3 ISR 2021-01, which describes the reprocessing of archival observations impacted by variable background in order to compute sky flats. For more details, see Section 7.10.

To aid the user in identifying WFC3/IR images affected by variable background, the Jupyter notebook “WFC3/IR IMA Visualization Tools with an Example of Time Variable Background” found on the WFC3 Notebooks GitHub page (also linked in the HST Notebooks repository) provides visualization tools to inspect individual reads and plot the accumulated signal through the exposure. 
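
The notebook builds on straightforward inspection of the IMA extensions; a minimal sketch (the file name is a placeholder) of plotting the median signal of each read against its sample time:

    import matplotlib.pyplot as plt
    import numpy as np
    from astropy.io import fits

    ima = 'ibxxxxxxq_ima.fits'

    times, medians = [], []
    with fits.open(ima) as hdu:
        nsamp = hdu[0].header['NSAMP']
        for samp in range(1, nsamp + 1):
            sci = hdu['SCI', samp].data[5:-5, 5:-5]       # trim the reference pixels
            times.append(hdu['SCI', samp].header['SAMPTIME'])
            medians.append(np.nanmedian(sci))

    plt.plot(times, medians, 'o-')
    plt.xlabel('SAMPTIME (s)')
    plt.ylabel('Median signal (e-/s for a standard calibrated IMA)')
    plt.show()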

Example commands in Section 3.5.2 demonstrate how to diagnose calibrated WFC3/IR images with poor quality ramp fitting due to time-variable background during the exposure. Images are reprocessed using the 'Last-minus-first' technique described in WFC3 ISR 2016-16. This turns off calwf3's ramp fitting step (CRCORR) and treats the IR detector like a CCD that accumulates charge and is read out only at the end of the exposure. A step-by-step walkthrough of this method of reprocessing WFC3/IR images can be found in the Jupyter notebook “Manual Recalibration with calwf3: Turning off the IR Linear Ramp Fit” on the WFC3 Notebooks GitHub page. 
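
A minimal sketch of that reprocessing step, assuming a placeholder RAW file name (the notebook and Section 3.5.2 give the complete procedure, including removal of any pre-existing calibrated products for the same rootname):

    import shutil
    from astropy.io import fits
    from wfc3tools import calwf3

    raw = 'ibxxxxxxq_raw.fits'
    shutil.copy(raw, raw.replace('_raw', '_raw_orig'))   # keep a backup of the original

    # Switch off the up-the-ramp fitting step, then reprocess with calwf3
    with fits.open(raw, mode='update') as hdu:
        hdu[0].header['CRCORR'] = 'OMIT'

    calwf3(raw)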

Three other methods for correcting WFC3/IR images affected by variable background can be found in Jupyter notebooks on the WFC3 Notebooks GitHub page. Each notebook illustrates a unique method for correcting the IR images: 1) 'flattening' the ramp by subtracting the excess background per read, in the notebook “Correcting for Helium Line Emission Background in IR Exposures using the 'Flatten-Ramp' Technique”; 2) manually excluding reads impacted by scattered light, in the notebook “Correcting for Scattered Light in WFC3/IR Exposures: Manually Subtracting Bad Reads”; or 3) masking specific reads in the RAW image and reprocessing with calwf3, in the notebook “Correcting for Scattered Light in WFC3/IR Exposures: Using calwf3 to Mask Bad Reads”.

While time-variable background also impacts the IR grisms, the methods used for imaging data should not be used to correct G102 and G141 observations, which are affected by a combination of Helium I emission, zodiacal background, and scattered Earth light, each of which varies spatially across the detector. More detail on correcting grism data for time-variable background is provided in WFC3 ISR 2017-05 and WFC3 ISR 2020-04.

9.5.5 DASH (Drift-And-SHift) Reduction

Software is available to aid users in properly reducing IR DASH observations. While portions of the pipeline will need to be tailored to the specific observing strategy, the general outline of steps should be useful to all users in their reduction. WFC3 ISR 2021-02 walks users through this package and provides an accompanying Jupyter notebook that outlines a strategy based on best practices (Momcheva et al. 2017).

9.5.6 Grism Reduction Tools

HSTaXe

HSTaXe is a Python/C package for the calibration, extraction, and visualization of spectra from HST slitless spectroscopic observations. HSTaXe replaced aXe (now retired) as the official STScI-supported tool for the reduction of slitless grism/prism data from HST. HSTaXe provides similar functionality to aXe but does not require IRAF/PyRAF (no longer supported by STScI). HSTaXe can be obtained from its GitHub repository, which also hosts a collection of Jupyter Notebook tutorials showcasing cookbook-style WFC3 grism data reduction workflows. A description of the HSTaXe data reduction workflow, including various recommended data preprocessing procedures, is presented in WFC3 ISR 2023-07. Please see the WFC3 grism data analysis page for additional and the most current information on grism calibration and data analysis.

Slitlessutils 

Slitlessutils is a Python-only package for simulating and extracting slitless spectroscopy data from all active HST grism and prism modes (ACS/WFC, ACS/SBC, WFC3/IR, and WFC3/UVIS). It is planned to succeed HSTaXe as the STScI-supported slitless spectroscopy analysis tool. Slitlessutils implements the LINEAR algorithm (Ryan, Casertano, & Pirzkal 2018) for multi-oriented fields and the HSTaXe extraction methodology for single-orient data. It also includes several utilities to help prepare spectroscopic data for analysis. An early release version of Slitlessutils can be downloaded from GitHub or through PyPI, with active development and support for the package ongoing. Please see the WFC3 grism data analysis page for the most current information on grism calibration and data analysis.

Grizli

Grism redshift & line analysis software for space-based slitless spectroscopy (Grizli) is not official STScI software, but was developed by WFC3 scientists. It is intended to offer general techniques for manipulating HST slitless spectroscopic observations. Grizli provides software kernels for the end-to-end processing, quantitative and comprehensive modeling, and fitting of WFC3/IR grism data and is available, along with examples and documentation, on GitHub.

9.5.7 PandExo: A community tool for transiting exoplanet science with JWST & HST

PandExo is not official WFC3 software, but WFC3 scientists have contributed significantly to its development. Similar to an exposure time calculator (ETC), PandExo is a transiting exoplanet noise simulator that can be used to create simulated observations and estimate realistic transit depth precisions. It is based on Pandeia, the ETC for JWST, and has been expanded to include HST's WFC3 instrument. PandExo_HST can be called and used locally for detailed calculations, but it is also available as a web-based tool in the Exoplanet Characterization Tool Kit (ExoCTK) for one-off calculations. A description of PandExo and how it works can be found in Batalha et al. (2017) and the accompanying GitHub README.

9.5.8 Crosstalk correction software

Electronic crosstalk between the UVIS amplifiers during readout induces faint, negative, mirror-symmetric ghost images in the other quadrant of the same CCD chip at the ~10^-4 level (see Section 5.5.3 for more details). Standalone software for crosstalk removal (WFC3 ISR 2012-02) can be downloaded from: http://www.stsci.edu/hst/instrumentation/wfc3/software-tools/crosstalk.

9.5.9 Satellite trail detection

The WFC3 team performs a visual inspection of all images acquired and flags those containing satellite trails (as well as other artifacts). The resulting database is available for download from the WFC3 Performance page; please see the accompanying report (WFC3 ISR 2020-02) for details.

The ACS instrument team has developed code for finding satellite trails in their data and flagging the affected pixels in the DQ extension accordingly. The software is part of the acstools package, with detailed documentation at: http://acstools.readthedocs.io/en/latest/satdet.html.

The ACS instrument team is also developing upgrades to this package (see, e.g., ACS ISR 2022-08), which will be included in future versions of acstools. Flagging such trails can be useful, e.g., when combining multiple images with AstroDrizzle.

Disclaimer: the software was developed and tested only on ACS data, but should work on WFC3 data as well.

9.5.10 IDL procedures for simulating trajectories of multi-lined spatial scans

WFC3 ISR 2017-06 describes simulations of spatial scans using a simple physical model of HST motions during rarely-used multi-lined spatial scans (single-line scans are much more common, e.g., for time-series spectrophotometry of exoplanet transits). The document contains IDL code in its appendix that can be used both for designing multi-lined spatial scans and for analyzing existing ones (e.g., PID 16983).