spacetelescope / jwst

Python library for science observations from the James Webb Space Telescope
https://jwst-pipeline.readthedocs.io/en/latest/

Cycling Bias for Cycling Fast-Frame Resets #7163

Open stscijgbot-jp opened 1 year ago

stscijgbot-jp commented 1 year ago

Issue JP-2601 was created on JIRA by Everett Schlawin:

NIRCam has fast frame resets for certain subarray modes that cycle through the full 2048x2048 frame in chunks. This creates a cyclical bias pattern that changes with the integration number in an exposure when the number of integrations is >1 and the number of rows per fast frame reset (NFF_ROW_RESET) is less than 2048. Using a single bias file for all integrations results in undesirable cyclical patterns in the bias-subtracted data that can affect reference pixel correction, non-linearity, etc. NIRCam will deliver 2 different bias files.

The bias subtraction step would switch between two different bias values in a cyclical pattern depending on the integration number. For example, the pattern would look like A, B, B, B, A, B, B, B for NFF_ROW_RESET=512 used with the SUBGRISM128 subarray. Note: discussions are ongoing to finalize whether this can be fully addressed by delivering updated bias reference files for these specific modes, or whether code changes might also be needed. This description will be updated once this discussion is resolved.

This effect has also been noted in OTE data, where a periodic bias offset with a ~22 second period was observed, resulting in systematic photometry variations. The period corresponded to the passage of the row reset through the science aperture.

The value for NFF_ROWS_RESET for all NIRCam subarrays is provided at https://jwst-docs.stsci.edu/jwst-near-infrared-camera/nircam-instrumentation/nircam-detector-overview/nircam-detector-subarrays. Any value other than 2048 will result in this bias offset as the row reset overlaps the science aperture.

Copied some information from an email from Karl Misselt that explains it better:

we (NIRCam) are going to deliver 2 bias files for subarrays with NFF_ROW_RESET < 2048. This is due to the known bias disturbance when the row reset resets the block of rows that contain the science subarray.  The row reset pointer gets reset to row 0 at exposure boundaries, but will subsequently index up the array by NFF_ROW_RESET until the next exposure boundary.  So every integration in the series for which (INT #) mod (2048/NFF_ROW_RESET) == 0 will have a different bias than the other ints. For example, in SUBGRISM128 NFF_ROW_RESET = 512, so integrations (0,4,8,...) will get the RESET bias and the others would get the 'normal' bias.  It's pretty straightforward to implement in code and I'll deliver the relevant bias files once we have all our subarray darks and I've chunked through them.  We intentionally set as many subarrays as feasible (so that row reset overhead was not a substantial fraction of a frame time) to 2048 to avoid this phenomenon, but for subarrays with (nx,ny) < 320 and SUBGRISM < 256, NFF_ROW_RESET is smaller and will be impacted by this.
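The selection rule described in the email above can be sketched as follows; this is illustrative only (the function and constant names are hypothetical, not pipeline API):

```python
# Hypothetical sketch of the bias selection rule described above.
NFF_ROW_RESET = 512    # e.g. SUBGRISM128
FULL_FRAME_ROWS = 2048

def uses_reset_bias(integration):
    """Return True if this (0-indexed) integration sees the RESET bias."""
    cycle_period = FULL_FRAME_ROWS // NFF_ROW_RESET  # 4 for SUBGRISM128
    return integration % cycle_period == 0

pattern = ["RESET" if uses_reset_bias(i) else "normal" for i in range(8)]
# -> ['RESET', 'normal', 'normal', 'normal', 'RESET', 'normal', 'normal', 'normal']
```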

 

stscijgbot-jp commented 1 year ago

Comment by Everett Schlawin on JIRA:

Some flight data that would be useful for testing this is 

PID 1076 observation 9, 10, 12, 14

 

stscijgbot-jp commented 1 year ago

Comment by Anton Koekemoer on JIRA:

Note that discussions are ongoing to finalize whether this can be fully addressed by delivering updated bias reference files for these specific modes, or whether code changes might also be needed. This ticket will be updated once this discussion is resolved.

stscijgbot-jp commented 1 year ago

Comment by Anton Koekemoer on JIRA:

Important to note that this is per integration within an exposure. Some code change would then be needed inside calwebb_detector1, specific to NIRCam, for these particular subarrays.

In terms of the implementation details, it might still be achieved with a single bias reference file (probably different for each affected subarray) where the reffile would contain several bias frames, one for each integration. (MIRI does this with their darks, where they have one reference file containing multiple dark cubes, different for each integration) -- so the question for NIRCam would be whether to make a special multi-integration bias file, just for these modes, or instead try to track two separate bias files, which Alicia (and Howard) could think about, also adding Bryan here.

stscijgbot-jp commented 1 year ago

Comment by Howard Bushouse on JIRA:

The only way the Cal pipeline superbias step could "cycle" between multiple bias ref files would be to have multiple superbias reference file types defined in CRDS (e.g. superbias1, superbias2, ..., superbiasn), because the fetching of reference files from CRDS can only return one match at a time for a given reference file type (i.e. there's no way to return multiple ref files when just searching for type "superbias").

So the only practical way to accomplish this is to do as Anton Koekemoer mentioned above, which is to have multiple bias images stored within a single bias reference file. This could be done by having different bias images stored in separate extensions of the reference file, or (probably more practical) is to just stack them into a 3D cube in a single extension. The Cal pipeline code would still need some modification to have it apply the appropriate plane from that reference cube to each integration within a given exposure.

Will there be a one-to-one relationship between the integration number in the science exposure and the plane of the 3D bias cube? Or will it be something more like MIRI darks (and other corrections), where there's only integration-specific reference data for the first n integrations in an exposure and everything after that uses the same reference data?
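As a sketch of the MIRI-style option Howard describes (reference data for the first n integrations, then reuse), plane selection from a 3D bias cube might look like the following. The array shapes, names, and clamping convention are assumptions, since the reference file format is still under discussion in this ticket:

```python
import numpy as np

# Illustrative shapes only: (nints, ngroups, ny, nx) science cube and a
# 2-plane bias cube; the real multi-plane superbias format is TBD.
nints, ngroups, ny, nx = 8, 5, 128, 256
science = np.ones((nints, ngroups, ny, nx), dtype=np.float32)
bias_cube = np.zeros((2, ny, nx), dtype=np.float32)

corrected = science.copy()
for i in range(nints):
    # MIRI-dark-style clamping: integrations beyond the number of
    # stored planes reuse the last plane.
    plane = min(i, bias_cube.shape[0] - 1)
    corrected[i] -= bias_cube[plane]
```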

stscijgbot-jp commented 1 year ago

Comment by Karl Misselt on JIRA:

(&#^(&## JIRA. Keeps eating 'reply' and starting a new comment. Anyway.

In response to the last paragraph of Howard's comment @ 10/May/22 2:42 PM. In theory it will be a one-to-one mapping of integration to plane, though it still needs to be characterized for all the affected subarrays. But the root cause would imply a simple, deterministic mapping. It would be something like

if (scienceIntegration % (2048/NFF_ROW_RESET) == 0) biasPlane = 0; else biasPlane = 1;

So for example, in SUBGRISM128 NFF_ROW_RESET = 512, so integrations 0, 4, 8, ... would get biasPlane=0, the rest biasPlane=1

stscijgbot-jp commented 1 year ago

Comment by Anton Koekemoer on JIRA:

Thanks Howard Bushouse, it sounds like we're converging on having a single bias reference file with multiple biases in it, where there would presumably be a different such reference file for each of these subarray modes.

E.g., in the example Everett Schlawin gave above for the SUBGRISM128 mode (which has NFF_ROW_RESET=512), a single "cycle" consists of, e.g., A, B, B, B (where A and B here correspond to the two different bias images), and this would then repeat if the observer decides to add additional integrations (or would be cut short if they don't do the full cycle).

I'm guessing the reference file might only need to contain two "planes", and the code would then figure out how many times to repeat the second one, based on the logic presented by Karl Misselt in his original email which I copy here:

The row reset pointer gets reset to row 0 at exposure boundaries, but will subsequently index up the array by NFF_ROW_RESET until the next exposure boundary. So every integration in the series for which (INT #) mod (2048/NFF_ROW_RESET) == 0 will have a different bias than the other ints. For example, in SUBGRISM128 NFF_ROW_RESET = 512, so integrations (0,4,8,...) will get the RESET bias and the others would get the 'normal' bias.

Question for Everett Schlawin  and Karl Misselt  - could you give a list please of the names of the subarrays that are affected by this (each of which would presumably need its own reference file) just so we can have a sense of the scope?

stscijgbot-jp commented 1 year ago

Comment by Anton Koekemoer on JIRA:

...and Karl Misselt  basically just said the above more succinctly, and was quicker at pressing "send".

The logic for selecting according to "if (scienceIntegration % (2048/NFF_ROW_RESET) == 0) biasPlane = 0; else biasPlane = 1;" is what would need to be done within the pipeline code.

stscijgbot-jp commented 1 year ago

Comment by Karl Misselt on JIRA:

In all cases, it would be 2 planes in the ref file: one for when the row reset block includes the science subarray and one for when it doesn't.

Affected subarrays follow: so maybe 6 cases, or 10 if we process the TA data the same way.

Imaging
- SUB160 - NFF_ROWS_RESET=512 (cycle period 4) - this one is more complicated for the LW since the reset block bisects the science subarray
- SUB64P - NFF_ROWS_RESET=256 (cycle period 8)

Coron
- SUB128 - NFF_ROWS_RESET=512 (cycle period 4) (TA only, probably don't need to implement)
- SUB64 - NFF_ROWS_RESET=256 (cycle period 8) (TA only, probably don't need to implement)

Time-Series Imaging
- SUB160P - NFF_ROWS_RESET=512 (cycle period 4)
- SUB64P - NFF_ROWS_RESET=256 (cycle period 8)
- SUB32TATS - NFF_ROWS_RESET=256 (cycle period 8) (TA only, probably don't need to implement)

GRISM Time-Series
- SUBGRISM128 - NFF_ROWS_RESET=512 (cycle period 4)
- SUBGRISM64 - NFF_ROWS_RESET=512 (cycle period 4)
- SUB32TATSGRISM - NFF_ROWS_RESET=256 (cycle period 8) (TA only, probably don't need to implement)
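The list above can be summarized as a lookup table; this is just a restatement of Karl's list for convenience (the variable names are illustrative, not pipeline code):

```python
# Affected subarrays and their NFF_ROWS_RESET values, from Karl's list.
# The cycle period is simply 2048 // NFF_ROWS_RESET.
AFFECTED_SUBARRAYS = {
    "SUB160": 512,          # imaging; LW case is more complicated
    "SUB64P": 256,          # imaging / time-series imaging
    "SUB128": 512,          # coron, TA only
    "SUB64": 256,           # coron, TA only
    "SUB160P": 512,         # time-series imaging
    "SUB32TATS": 256,       # TA only
    "SUBGRISM128": 512,     # grism time series
    "SUBGRISM64": 512,      # grism time series
    "SUB32TATSGRISM": 256,  # TA only
}

cycle_periods = {name: 2048 // nff for name, nff in AFFECTED_SUBARRAYS.items()}
# e.g. cycle_periods["SUBGRISM128"] == 4, cycle_periods["SUB64P"] == 8
```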

stscijgbot-jp commented 1 year ago

Comment by Bryan Hilbert on JIRA:

Karl Misselt has pointed out in an email that for the SUBGRISM128 NRCA1 subarray, it's more difficult to predict at what integration within the exposure the cycle starts (e.g. it is not always integrations 4, 8, 12 that would require biasPlane=0; sometimes it will be integrations 2, 6, 10, or 3, 7, 11, etc.). But he pointed out that the beginning of the cycle can be found by looking at the mean bias level. Would it be possible to add code to the pipeline to use the reference pixels to calculate the mean bias level in each integration and from that, determine which superbias extension to use for each?
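A minimal sketch of the idea Bryan proposes, classifying each integration as high- or low-bias from its mean reference-pixel level and reading off the cycle phase. Everything here (array shapes, the naive mean-based threshold, the function name) is an assumption for illustration, not a proposed implementation:

```python
import numpy as np

def classify_integrations(refpix_means):
    """Flag low-bias ('RESET') integrations from per-integration
    reference-pixel means, using a naive mean-based threshold."""
    means = np.asarray(refpix_means, dtype=float)
    threshold = means.mean()  # real data would likely need a more robust split
    return (means < threshold).astype(int)  # 1 = low-bias integration

# Example: a period-4 cycle where the low bias lands on integrations 2 and 6
levels = [10.0, 10.1, 7.0, 10.0, 9.9, 10.0, 7.1, 10.1]
flags = classify_integrations(levels)
phase = int(np.argmax(flags))  # index of the first low-bias integration
# -> flags == [0, 0, 1, 0, 0, 0, 1, 0], phase == 2
```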

stscijgbot-jp commented 1 year ago

Comment by Karl Misselt on JIRA:

I think it would have to be something like this, as I don't see a deterministic way to figure out which bias to use; once the cycle starts it's deterministic, but there's ambiguity in the first cycle of integrations. This is true for all the subarrays I've looked at thus far. Documentation would imply that the block row pointer should reset to row 0 at exposure start (that always seems to happen), then cycle through blocks until it resets the full array, start back at 0, and continue to cycle as long as you are sending integrations within a single exposure. However, it seems that it indexes to row zero at the start of exposure, but then returns to where it was in the cycle prior to being interrupted by the exposure command and continues the cycle from there. If that's the behavior, there's no way that I've figured out yet to deterministically decide which bias to apply until you've established that 0 index by looking at the bias level in the first N integrations (where N <= Integrations_in_cycle). I'm looking at the microcode (and pinging those more expert than I on the microcode) to see what the actual code is, but empirically, there seems to be this ambiguity, contrary to what I expect from the documentation.

stscijgbot-jp commented 1 year ago

Comment by Bryan Hilbert on JIRA:

Just adding a note here to say that Karl Misselt has produced all of the superbias files in the current superbias reference file format. So we now have the data to test this correction. The only update that may have to be made would be to combine the superbias images for a given detector/subarray into a single file (the high and low superbias images are currently in 2 separate files).

stscijgbot-jp commented 1 year ago

Comment by Howard Bushouse on JIRA:

Yes, it's necessary to combine the reference data used for a given detector/subarray into a single reference file, because there's no way that the Cal pipeline can ask CRDS to return multiple ref files to be used for a given step. Each of the superbias images could be stored in their own tuple of SCI, ERR, DQ image extensions in the ref file. Although we might need to modify the superbias datamodel to accommodate that. Need to think about the optimal way to store them, with appropriate EXTNAME's, etc.

stscijgbot-jp commented 1 year ago

Comment by Bryan Hilbert on JIRA:

Do you want me to put together some multi-extension superbias files for developers to play with, or should I wait until they have agreed on a file format? Or wait until there's a new datamodel for creating the files?

stscijgbot-jp commented 1 year ago

Comment by Howard Bushouse on JIRA:

Let us do some more thinking about how best to package these, hopefully in a way that'll make the existing (simple) superbias ref file backwards compatible. We hope to work on this project in one of our next sprints.

stscijgbot-jp commented 1 year ago

Comment by Howard Bushouse on JIRA:

Karl Misselt any luck/progress on coming up with a way to examine the bias level in the ref pixels in each integration to determine in which integration the cycle starts?

stscijgbot-jp commented 1 year ago

Comment by Karl Misselt on JIRA:

Howard Bushouse - not really. It's one of those things that's very straightforward in principle, less so in practice. Have to account for subarrays without reference pixels; should it be more complicated, like a preliminary fit to get a zero point, especially in the case where one doesn't have reference pixels, so one would need to mitigate against sources/structures; can one simply do 'relative' levels within a sequence, or do you need a per-subarray table of high/low bias levels? If it's a single integration, just use bias0 (that will always be correct), but I'm not sure how dithers vs. multi-int sequences work at the commanding level... Anyway, a bunch of crap floating around in my head, some of it actually related to this issue, so I'm still sort of in the weeds/conceptual stage of thinking about how to approach it.

stscijgbot-jp commented 1 year ago

Comment by Karl Misselt on JIRA:

I am sort of leaning towards 'withdrawing' this, certainly putting it on the back burner. The low bias doesn't seem to be a global offset; rather, the bulk of pixels are in family with the 'higher' bias. The histogram of the low bias seems centered on the same value, but is much fatter and has a very significant plateau towards low values. If that's repeatable on a pixel-by-pixel basis, a separate bias file would still be optimal, but it makes determining which bias file to select for a given integration very challenging. So maybe not 'withdraw', but back burner it. I will deliver a single bias file for now and continue to look into how to deterministically separate out bias parameters from the raw data.

stscijgbot-jp commented 1 year ago

Comment by Karl Misselt on JIRA:

I've uploaded [^lohi_bias_stuff.pdf] a write up on a potential algorithm to be used to identify the correct bias frame to apply to a given data set.

stscijgbot-jp commented 1 year ago

Comment by Anton Koekemoer on JIRA:

thanks very much Karl Misselt , this looks very helpful. If you like, I'll add it as an agenda item for the upcoming CalWG (Tue Nov 1) if you'd like to go over it with the group.

stscijgbot-jp commented 1 year ago

Comment by Karl Misselt on JIRA:

Ewww, that sounds like work.

I suppose if there's nothing more pressing/important to cover, I can spend a few minutes on it.

stscijgbot-jp commented 1 year ago

Comment by Anton Koekemoer on JIRA:

thanks! (no further work needed really, the slides you put here are fine already)

stscijgbot-jp commented 1 year ago

Comment by Everett Schlawin on JIRA:

After determining low/high bias values, should an additional step be added to enforce periodicity in the cycle for integrations 1+, i.e., throwing out outliers in the regular pattern? Maybe not necessary if the identification of low/high is reliable.

stscijgbot-jp commented 1 year ago

Comment by Karl Misselt on JIRA:

Everett Schlawin - You would still need to determine the bias level for the first 2048/NFF_ROWS_RESET integrations to get into the cycle, since we don't seem to have a deterministic way of figuring out where we started. So at that point, I'm not sure it makes sense to switch between methods. Maybe as an 'inline' check on the algorithm? Or maybe what Eddie Bergeron and Michael Regan are coming up with/proposing will make this easier.

stscijgbot-jp commented 1 year ago

Comment by Kevin Volk on JIRA:

After the January meeting Andre Martel checked this for NIRISS.  We did see these sorts of effects in the ground testing when we used 512 row rolling resets.  We decided to go to 2048 row resets per integration, and that removed the effects we saw at the time.  The OTIS testing and now the on-orbit data do not show the type of effects that Karl has described above.

stscijgbot-jp commented 1 year ago

Comment by Karl Misselt on JIRA:

Kevin Volk on 04/07/2023 - Can you clarify? Do you only have NFF_ROW_RESETS=2048 for all possible subarrays? I would expect to only see offsets for cases where NFF_ROWS_RESETS < 2048, and would be very surprised to not see them in NFF_ROWS_RESETS<2048 cases. We would have liked to set everything to 2048, but with small subarrays, the duty cycle gets a bit much.

stscijgbot-jp commented 1 year ago

Comment by Kevin Volk on JIRA:

Yes, we have full 2048 row resets per integration, even for the SUB80 or SUB64 subarrays.