Closed: hbushouse closed this issue 6 years ago.
Chris Heller indicates that each exposure segment will contain integration time table entries covering at least the integrations contained in that segment, and often a few more at the beginning or end of the table (because science data is downlinked in packets whose boundaries do not always match the start or end of an integration). So any pipeline task that needs the time stamps must first query the INTSTART and INTEND keywords to determine which integrations are contained in the segment being worked on, and then use the corresponding entries in the integration times table.
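As a concrete illustration, a task working on one segment would keep only the INT_TIMES rows for its own integrations. A minimal sketch, assuming the table carries an integration-number column and that `intstart`/`intend` come from the INTSTART and INTEND keywords (the helper name is hypothetical, not part of the actual pipeline API):

```python
def select_segment_rows(integration_numbers, intstart, intend):
    """Return indices of INT_TIMES rows whose integration number lies
    within [intstart, intend] for the current exposure segment.

    The table may contain a few extra rows before intstart or after
    intend, because packet boundaries need not align with integrations.
    """
    return [i for i, n in enumerate(integration_numbers)
            if intstart <= n <= intend]

# Example: the segment covers integrations 5-8, but the downlinked
# table also includes rows for integrations 4 and 9.
rows = select_segment_rows([4, 5, 6, 7, 8, 9], intstart=5, intend=8)
# rows == [1, 2, 3, 4]
```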
I'm closing this for the time being, but it may be necessary to re-open later if keywords or column names need to be changed.
Given that the DMSWG has decided that the INT_TIMES table does NOT need to be propagated beyond level-1b products for exposures taken in non-TSO modes, we don't need to update nearly as many of the data models or steps to propagate it. We only need to propagate it for TSO exposures, which means we only need to propagate it through the following stages:

- `calwebb_detector1`: the table must be carried into the `rateints` product. The corresponding data model is `CubeModel`.
- `calwebb_image2`: the steps simply pass or copy the input `CubeModel` as a `CubeModel`, without creating a new model type for the output, so the output `CubeModel` that is saved to the `calints` product should naturally contain the INT_TIMES table from the input `rateints` product. Hence no steps or data models need to be updated.
- `calwebb_spec2`: the level-2b `calints` product is the output of the `photom` step, and the level-2b `x1dints` product is the output of the `extract_1d` step. Data from all modes enter `calwebb_spec2` as a `CubeModel` (the `rateints` product). Most modes (MIRI LRS slitless, NIRISS SOSS, NIRCam TSGRISM) retain that form coming out of `photom`, because they do not go through the `extract_2d` step. The only mode that does go through `extract_2d` is NIRSpec BrightObj, so the `extract_2d` step needs to be updated to propagate the INT_TIMES table to its output `SlitModel`, and the `SlitModel` schema needs to be updated to define the INT_TIMES table. The `extract_1d` step needs to be updated to make use of the INT_TIMES table as input and to propagate the table to the output `MultiSpecModel`.
- `calwebb_tso3`: the inputs are the `calints` products from `calwebb_image2` and `calwebb_spec2`, which are in the form of `CubeModel` or `SlitModel`. For imaging, the `tso_photometry` step simply needs to be updated to make use of the INT_TIMES table in its input; the table does not get propagated (intact) to the output product. For spectra, the `extract_1d` step needs to be updated to make use of the input INT_TIMES table data and propagate it to its output `MultiSpecModel` (as already indicated for its use in `calwebb_spec2`). The `white_light` step needs to be updated to make use of the INT_TIMES data in its input, without doing an explicit (intact) copy of the table to its output.

In summary, the following data models need to be updated to define the INT_TIMES table: `CubeModel`, `SlitModel`, and `MultiSpecModel`.
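For the steps that do create a new output model type, the update amounts to an explicit copy of the table from input to output. A minimal sketch, assuming the data models expose the table as an `int_times` attribute (the helper and the attribute-access pattern are illustrative assumptions, not the actual `jwst` datamodels API):

```python
def propagate_int_times(input_model, output_model):
    """Copy the INT_TIMES table from an input data model to an output
    model created as a new type (e.g. a SlitModel from a CubeModel).

    Assumes both model schemas define an 'int_times' table attribute;
    does nothing if the input has no table to copy.
    """
    table = getattr(input_model, "int_times", None)
    if table is not None:
        output_model.int_times = table
```

A step such as `extract_2d` would call this after constructing its output model, so the table survives the model-type change.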
SDP is adding a new table extension to all level-1b products that will contain time stamps for each integration within an exposure (see https://jira.stsci.edu/browse/JSDP-387). While primarily intended for use with Time Series exposures, the table will be added to all exposures and is expected to be carried along through all stages of cal pipeline processing.
In order to propagate the table through all stages of pipeline processing we will need to update all relevant science data product data models (e.g. `RampModel`, `ImageModel`, `SlitModel`, etc.) to define this table extension. We will also need to update all calibration steps that create their output(s) as a new data model type so that they explicitly copy the integration times table over from the input model to the output model. This includes steps like `ramp_fit`, `extract_2d`, `cube_build`, `resample`, and `resample_spec`.

Propagating this table through the various pipeline stages is vital in order to make it available to the level-3 TSO steps that need it, like `tso_photometry` (#1619), `extract_1d` (#1620), and `white_light` (#1621). This is high priority for Build 7.2, in order to satisfy the TSO processing requirement.
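As an example of how a level-3 TSO step consumes the table, a `white_light`-style computation pairs one summed flux per integration with that integration's mid-point time stamp. A minimal sketch with invented names (the real step operates on `MultiSpecModel` spectra and the MJD columns of INT_TIMES):

```python
def white_light_curve(mid_times, spectra):
    """Build a white-light curve: one (time, total flux) point per
    integration, where spectra[i] is the 1-D flux array extracted from
    integration i and mid_times[i] is its INT_TIMES mid-point time.
    """
    if len(mid_times) != len(spectra):
        raise ValueError("need one time stamp per integration")
    return [(t, sum(flux)) for t, flux in zip(mid_times, spectra)]

curve = white_light_curve([59000.1, 59000.2], [[1.0, 2.0], [3.0, 4.0]])
# curve == [(59000.1, 3.0), (59000.2, 7.0)]
```

This is why the time stamps must survive to level 3: without the per-integration mid-times there is nothing to plot the integrated fluxes against.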