This adds `ess.reduce.streaming`, mainly providing `StreamProcessor`. It is intended for use with Beamlime, processing chunks of detector and monitor event data from a stream.
For context, here is how it could be used with a workflow from ESSsans:
```python
from ess.reduce import streaming

streaming_wf = streaming.StreamProcessor(
    base_workflow=workflow,
    dynamic_keys=(
        NeXusMonitorEventData[SampleRun, Incident],
        NeXusDetectorEventData[SampleRun],
    ),
    target_keys=(IofQ[SampleRun],),
    accumulators={
        ReducedQ[SampleRun, Numerator]: streaming.RollingAccumulator(window=5),
        ReducedQ[SampleRun, Denominator]: streaming.RollingAccumulator(window=5),
    },
)

# This LoKI data has different detector and monitor time scales.
det_stride = 60
mon_stride = 1
for i in range(100):
    # Assume we have loaded events manually somehow; this is not using an
    # actual stream of events.
    det_chunk = det_events[det_stride * i : det_stride * (i + 1)].copy()
    mon_chunk = mon_events[mon_stride * i : mon_stride * (i + 1)].copy()
    # Simulate temporary loss of detector data.
    if 20 < i < 30:
        det_chunk *= 0.0
    results = streaming_wf.add_chunk(
        {
            NeXusDetectorEventData[SampleRun]: det_chunk,
            NeXusMonitorEventData[SampleRun, Incident]: mon_chunk,
        }
    )
    fig.update({artist: results[IofQ[SampleRun]]})
    fig.fig.canvas.draw()
    fig.fig.canvas.flush_events()
```
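To illustrate the role of the accumulators, here is a minimal conceptual sketch of a windowed (rolling) accumulator. This is not the actual `ess.reduce.streaming.RollingAccumulator`; the class name, the `push` method, and the `value` property are hypothetical, and the real interface may differ:

```python
from collections import deque


class SketchRollingAccumulator:
    """Keep the last ``window`` chunks and expose their sum as the current value.

    Hypothetical sketch; the real accumulator's interface and behavior may differ.
    """

    def __init__(self, window: int) -> None:
        # A deque with maxlen drops the oldest chunk automatically once full.
        self._chunks = deque(maxlen=window)

    def push(self, chunk) -> None:
        # Called once per processed chunk with an intermediate result,
        # e.g. ReducedQ[SampleRun, Numerator] computed from a single event chunk.
        self._chunks.append(chunk)

    @property
    def value(self):
        # Sum over the retained window; with scipp DataArrays this is an
        # element-wise sum of the stored chunks.
        chunks = iter(self._chunks)
        total = next(chunks)
        for chunk in chunks:
            total = total + chunk
        return total
```

In the example above, a window of 5 means only the five most recent chunks contribute to the accumulated numerator and denominator, so the effect of the simulated detector dropout fades once the zeroed chunks leave the window.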