Here is an incomplete list of notes for a rollout of Sciline:
Handling of binned or histogrammed data is currently intertwined, many functions handle both. Need to figure out if we should have dedicated domain types to disentangle this.
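One way the disentangling could look, sketched with plain `typing.NewType` (all names here are hypothetical, and the `dict` stands in for the real data container, e.g. a scipp DataArray): dedicated domain types let each provider declare in its signature whether it expects binned events or a histogram, instead of one function branching on both.

```python
from typing import NewType

# Illustrative domain types; names and the dict stand-in are assumptions,
# not existing API.
BinnedData = NewType('BinnedData', dict)
HistogrammedData = NewType('HistogrammedData', dict)

def histogram(data: BinnedData) -> HistogrammedData:
    """Provider converting binned events into a histogram (stub logic)."""
    return HistogrammedData({**data, 'representation': 'histogram'})

def rebin(data: HistogrammedData) -> HistogrammedData:
    """A provider that only makes sense for histogrammed data."""
    return data
```

With such types, a function that silently accepted either representation would instead fail to type-check, making the currently intertwined handling explicit.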
sans.to_I_of_Q: This function should essentially be removed; most of its content consists of mapping inputs to other functions, which is exactly what Sciline was made for.
sans.preprocess_monitor_data: Handling for a dict/group of monitors should probably be replaced by Sciline's mechanism for generic providers.
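A rough sketch of the generic-provider idea, using stdlib typing only (all names hypothetical): rather than iterating over a dict of monitors, one provider is written once and instantiated per monitor kind via a constrained type variable, which is the pattern Sciline's generic providers are built around.

```python
from dataclasses import dataclass
from typing import Generic, NewType, TypeVar

# Hypothetical monitor kinds; the type variable is constrained to them.
Incident = NewType('Incident', int)
Transmission = NewType('Transmission', int)
MonitorType = TypeVar('MonitorType', Incident, Transmission)

@dataclass
class RawMonitor(Generic[MonitorType]):
    counts: list

@dataclass
class PreprocessedMonitor(Generic[MonitorType]):
    counts: list

def preprocess_monitor(
    mon: RawMonitor[MonitorType],
) -> PreprocessedMonitor[MonitorType]:
    # One implementation serves both Incident and Transmission monitors,
    # replacing the per-dict-entry handling.
    total = sum(mon.counts)
    return PreprocessedMonitor(counts=[c / total for c in mon.counts])
```

A workflow could then request, e.g., `PreprocessedMonitor[Incident]` and `PreprocessedMonitor[Transmission]` separately, without the function itself knowing about the grouping.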
sans.preprocess_monitor_data performs a number of "optional" processing steps. When using Sciline, this may be entirely the wrong paradigm (or it may not). For example, it optionally subtracts a background. Should this function return a generic PreprocessedMonitor, or be more specific and return a BackgroundSubtractedMonitor, which would allow for (and enforce) a clearer workflow representation? Note that this is not the only example.
A number of functions take optional parameters. Consider whether scipp/sciline#40 can address this, or whether optional parameters are a design pattern that does not play nicely with Sciline.
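One workaround pattern, sketched with hypothetical names: instead of an optional keyword such as `def reduce(data, wavelength_mask=None)`, make the optional input an explicit domain type with a no-op default provider, so the workflow graph has the same shape whether or not the user overrides the default.

```python
from typing import NewType

# Hypothetical domain types; lists stand in for the real data container.
Data = NewType('Data', list)
WavelengthMask = NewType('WavelengthMask', list)  # indices to drop
MaskedData = NewType('MaskedData', list)

def no_mask() -> WavelengthMask:
    """Default provider: an empty mask, used unless the user overrides it."""
    return WavelengthMask([])

def apply_mask(data: Data, mask: WavelengthMask) -> MaskedData:
    """Always present in the graph; a no-op when the mask is empty."""
    return MaskedData([d for i, d in enumerate(data) if i not in mask])
```

Whether this is preferable to first-class optional-input support (the scipp/sciline#40 route) is exactly the open question here.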
sans.beam_center: This sets up and runs a small reduction workflow of its own. Would it benefit from using Sciline internally? It will probably be affected in any case when we refactor the other parts of the reduction. For example, the calls to convert_to_q_and_merge_spectra (which would be generic already, since it needs to support both data and norm) in iofq_in_quadrants also need to operate by quadrant, which can be handled with the same mechanism. In any case, the inputs to the function would be provided by the outer workflow. Overall, the entire file is likely to be restructured substantially, so it may be advisable to skip it until the workflow without beam-center finding has been refactored.