scipp / essreduce

Common functionality for ESS data reduction
https://scipp.github.io/essreduce/

Nexus workflow module: Handling time-dependent NXtransformations? #96

Open SimonHeybrock opened 2 months ago

SimonHeybrock commented 2 months ago

When component positions in a NeXus file are not fixed, this is represented as one or more transformations whose value dataset is replaced by a time series (NXlog). In this case it is not possible to compute, e.g., a unique sample_position or position coordinate when loading.

The current workflows simply fail, since scippnexus.compute_positions will not create a detector position coordinate; instead, the surrounding DataGroup will have a time-dependent position DataArray. This is deliberately not multiplied into the pixel position offsets, since doing so would lead to massive memory use (in addition to producing a time-dependent coordinate, which Scipp does not allow unless the data itself depends on time, which it does not).
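
As a minimal sketch of this failure mode, assuming a hypothetical file moving_detector.nxs whose detector transformation value is an NXlog (the file path and group names are illustrative):

```python
import scippnexus as snx

# Hypothetical file with a time-dependent detector transformation (NXlog).
with snx.File('moving_detector.nxs') as f:
    detector = f['entry/instrument/detector_0'][()]

detector = snx.compute_positions(detector)
# With a fixed transformation, compute_positions would fold the transformation
# into the pixel offsets and produce a position coordinate. With a
# time-dependent transformation it deliberately does not; the surrounding
# DataGroup instead holds a time-dependent position DataArray:
print(detector['position'])  # DataArray with a 'time' dimension (values from the NXlog)
```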

In practice, I think data reduction will have to be performed for various time intervals, with constant positions in each interval. We need to facilitate handling this, extracting the relevant information from the various parts of the file. This includes determining a unique position coordinate as well as loading the correct chunk of events. The current GenericNeXusWorkflow and related components do not support this. We need to figure out a convenient and safe mechanism to make this work. There are a number of cases to consider.

What many of these cases have in common is that intermediate workflow results may have to be combined into a single final result. For example, a single moving detector could be interpreted as an instrument with multiple detector banks.

For now, I think we should focus on the case of step-wise movement but keep the continuous scanning case in the back of our minds.

I think there are roughly two fundamentally different approaches that could be considered:

  1. Keep time-dependent positions; when performing coordinate transformations, look up each neutron event's time to find the corresponding position and sample_position. Conceptually this would mean changing the current def position(event_id): ... to def position(event_id, event_time_zero): .... As mentioned before, spelling the position out as an array can be prohibitively large, so this needs to be implemented as a kind of lookup function (see the sketch after this list).
  2. Split the run into "constant" sections, reduce each individually, and accumulate/combine in later steps, once the position dependence has been removed.
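
A minimal sketch of what the lookup in 1.) could look like, assuming the time-dependent transformation has been loaded as a DataArray pos_log with a 'time' dimension; the function name and signature are illustrative and not existing workflow API:

```python
import numpy as np
import scipp as sc


def position_at(event_time_zero: sc.Variable, pos_log: sc.DataArray) -> sc.Variable:
    """Look up the component position valid at each event_time_zero.

    Performs a 'previous value' lookup into the log instead of materializing
    a dense per-event (or per-pixel, per-time) position array.
    """
    log_times = pos_log.coords['time'].to(unit=event_time_zero.unit)
    # Index of the last log entry at or before each event_time_zero.
    idx = np.searchsorted(log_times.values, event_time_zero.values, side='right') - 1
    idx = np.clip(idx, 0, log_times.sizes['time'] - 1)
    return sc.vectors(
        dims=event_time_zero.dims, values=pos_log.values[idx], unit=pos_log.unit
    )
```

A provider with the extended signature could then call such a helper instead of carrying a dense position array.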

I believe that even if we want to do 1.), there will be cases where 2.) is required for scientific reasons, e.g., since the Q-resolution may differ and the results from different detector positions should not actually be merged directly. We should therefore consider fully focusing on 2.), to see how far that approach will get us.
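
For 2.), the splitting step could look roughly like the following sketch, again assuming a position log pos_log with a 'time' dimension and an illustrative tolerance (none of this exists in the current workflow):

```python
import numpy as np
import scipp as sc


def constant_intervals(pos_log: sc.DataArray, tolerance: sc.Variable) -> list[tuple]:
    """Split the run into sections where the position is constant within tolerance.

    Returns (begin, end) time pairs; the end of the last section is left open
    since the run end time is not known from the log alone.
    """
    values = pos_log.values  # (n_time, 3) positions
    times = pos_log.coords['time']
    tol = tolerance.to(unit=pos_log.unit).value
    intervals = []
    start = 0
    for i in range(1, len(values)):
        if np.linalg.norm(values[i] - values[start]) > tol:
            intervals.append((times['time', start], times['time', i]))
            start = i
    intervals.append((times['time', start], None))  # until end of run
    return intervals
```

Each section would then be reduced as if it were a separate run, with the results accumulated/combined afterwards.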

Relevant instruments:

Related:

SimonHeybrock commented 2 months ago

An easy subgoal that will likely cover many use cases is to improve the loaders to take the time interval into account. Currently ess.reduce.nexus supports a PulseSelection, but this is ignored when loading any NXlog. I propose to change (or extend) PulseSelection to a more generic time-interval selection. As motion logs may be noisy, we will likely also need to think about thresholds, etc., or a way to define, e.g., a static position from a noisy position log (raising an error if it cannot be considered static). A sketch of what such a check could look like is given below.
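
A rough sketch of the noisy-log handling, assuming a position log pos_log and a user-provided tolerance; the names and behavior are illustrative and not part of ess.reduce.nexus:

```python
import numpy as np
import scipp as sc


def static_position(pos_log: sc.DataArray, tolerance: sc.Variable) -> sc.Variable:
    """Return the mean position if the log is static within tolerance, else raise."""
    values = pos_log.values  # (n_time, 3) positions
    mean = values.mean(axis=0)
    # Largest deviation of any log entry from the mean position.
    deviation = np.linalg.norm(values - mean, axis=1).max()
    if deviation > tolerance.to(unit=pos_log.unit).value:
        raise ValueError(
            f'Position log varies by {deviation} {pos_log.unit} around its mean; '
            'it cannot be considered static.'
        )
    return sc.vector(value=mean, unit=pos_log.unit)
```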