spacetelescope / jwst

Python library for science observations from the James Webb Space Telescope
https://jwst-pipeline.readthedocs.io/en/latest/

cube_build double counts exposure times in total #6303

Closed: stscijgbot-jp closed this issue 3 years ago

stscijgbot-jp commented 3 years ago

Issue JP-2263 was created on JIRA by David Law:

Data from LRE5 revealed a bug in how the effective total integration time (EFFEXPTM) of the MIRI MRS data cubes is being calculated. The values reported in the headers of the cubes are exactly double the correct amount.

It looks like this is because of a bug in ifu_cube.py, which computes this and other keywords around line 2042 via a call to the model blender. However, the model blender is being passed the headers for both the SHORT and LONG detectors at the same time, which effectively doubles the exposure time when the times of all the exposures are simply summed.

Either the model blender should be passed only the headers appropriate to a given cube, or the model blender should have some logic to avoid double-counting exposure times across different grating/detector configurations.

This also suggests that some logic would be necessary for the multi-band cube case in which 1A-4C data are all combined into the same cube.  At the moment it looks like the total exposure time here is incorrect by a factor of 6 (2 from detectors, and 3 from bands).  E.g., for a 4-point dither performed for each of the A/B/C settings, with 100 seconds per dither position, the current total time is reported as 4 × 100 × 2 × 3 = 2400 seconds.  It should instead be 4 × 100 = 400 seconds, giving the effective total exposure time at each wavelength in the cube.
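
As a quick illustration of the counting (a minimal sketch with made-up numbers, not pipeline code), the difference between summing every input header and the effective per-wavelength depth works out as follows:

```python
# Minimal illustration of the over-counting described above (not pipeline code).
# A 4-point dither at 100 s per position, repeated for the A/B/C band settings,
# with each pointing producing data on both the SHORT and LONG detectors.
dithers = 4
t_exp = 100.0                      # seconds per dither position
n_bands = 3                        # A, B, C
n_detectors = 2                    # SHORT, LONG

# Naive sum over every header handed to the model blender:
naive_total = dithers * t_exp * n_bands * n_detectors    # 2400 s

# Effective exposure time at any single wavelength in the cube,
# which only ever comes from one band on one detector:
effective_total = dithers * t_exp                        # 400 s

print(naive_total, effective_total)                      # 2400.0 400.0
```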

stscijgbot-jp commented 3 years ago

Comment by Jane Morrison on JIRA:

Update blend_output_metadata in ifu_cube.py to blend only the inputs that cover the IFU cube.
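
Presumably that means filtering the input models down to only those whose channel/band land in the cube before they are handed to the blender. A rough sketch of the idea (the attribute paths and the helper function here are assumptions for illustration, not the actual change in ifu_cube.py):

```python
# Rough sketch only: filter inputs to those that cover the cube before blending.
# The meta.instrument.channel / meta.instrument.band attribute paths and the
# helper below are assumptions for illustration, not the real ifu_cube.py code.

def models_covering_cube(input_models, cube_channels, cube_bands):
    """Keep only the input models whose channel and band overlap the cube's."""
    selected = []
    for model in input_models:
        channels = set(str(model.meta.instrument.channel))   # e.g. "12" -> {"1", "2"}
        band = str(model.meta.instrument.band)                # e.g. "SHORT" for band A
        if (channels & set(cube_channels)) and band in cube_bands:
            selected.append(model)
    return selected

# blend_output_metadata would then receive only the selected models, e.g.:
# blend_output_metadata(output_cube, models_covering_cube(all_models, {"1"}, {"SHORT"}))
```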

stscijgbot-jp commented 3 years ago

Comment by David Law on JIRA:

I tested this with #6360 in the latest master branch. It looks like standard per-band data cubes are now fixed, but multi-band cubes (e.g., 1ABC cubes) still have the problem.  We will need to introduce logic in the model blender to add up only the integration time within a given band.  I.e., compute the total integration time of the 1A exposures, the 1B exposures, the 1C exposures, etc., and then take the median of those numbers to determine the total integration time to report for the composite cube.  (The median gives some safety in the unexpected case that some bands have fewer exposures than others.)
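
A minimal sketch of that accounting (made-up inputs rather than the blender's real interface): sum the exposure times within each band, then take the median of the per-band totals.

```python
# Sketch of the per-band accounting described above: sum exposure times
# within each band, then take the median across bands.  The inputs here
# are a made-up list of (band, effexptm) pairs, not blender objects.
from collections import defaultdict
from statistics import median

def multiband_effective_exptime(exposures):
    """exposures: iterable of (band_label, exposure_time_seconds) pairs."""
    per_band = defaultdict(float)
    for band, exptime in exposures:
        per_band[band] += exptime          # total time accumulated in each band

    # Median across bands guards against one band having fewer exposures.
    return median(per_band.values())

# Example: 4 x 100 s dithers in each of bands 1A, 1B, 1C -> 400 s
exposures = [(band, 100.0) for band in ("1A", "1B", "1C") for _ in range(4)]
print(multiband_effective_exptime(exposures))   # 400.0
```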

stscijgbot-jp commented 3 years ago

Comment by Jane Morrison on JIRA:

Ah, I understand the issue for multi-band cubes now.

stscijgbot-jp commented 3 years ago

Comment by Jane Morrison on JIRA:

David Law Howard Bushouse

We had some discussion of this today in our sprint planning.  Changing how the blender works for cube_build with multi-band data would be inconsistent with how the total integration time is calculated for other (non-spectral) data.  In the pipeline, the total integration time is calculated from all the data used to make the output. This gets tricky for spectral data, because we add exposures to fully cover the wavelength region, unlike imaging-type data where the extra exposures are often dithers. However, take the imaging case where we have many dithers but some regions of the mosaic have only a few overlapping exposures while other regions have many more. The effective integration time is still determined from all the data; it is not different for regions with different amounts of overlap.  Before I make changes to the blender, I think we need to pull in a wider audience and find out what we should do in general for spectral data where extra exposures are taken not to increase the depth but to increase the wavelength coverage.  Is the problem that the effective integration time is being used in a data analysis tool?

stscijgbot-jp commented 3 years ago

Comment by Jane Morrison on JIRA:

David Law, do you agree with the above comment? If so, I will close this ticket. If you don't agree, I think you might want to take this to the JWST CAL WG and check how they want the total integration time for spectral data to be reported by the blender routine.

stscijgbot-jp commented 3 years ago

Comment by David Law on JIRA:

Hm, I see your point.  Arguably this is a similar case to a large extended mosaic in imaging mode, except that instead of mosaicing in spatial extent we're mosaicing in spectral extent.  In both cases 'total' exposure time is a misleading term: it can describe the time required to obtain all of the data that went into the final product, but that is not the effective time at any given location within the product.  Given that analogy, it's not clear that we should make any further changes at the present time, so I'd say let's close the ticket until there's reason to reassess.

stscijgbot-jp commented 3 years ago

Comment by Jane Morrison on JIRA:

I