Currently, preliminary data reduction is managed by local Python processes running as cron tasks. These processes could instead be executed as APS Data Management (DM) workflows on demand from bluesky, either while a run is in progress or afterward, as the team decides.
Integration of data processing with DM has two principal parts:

1. the workflow script itself (needs assistance from the APS DM team)
2. the data processing code to be run in the workflow (the USAXS team's responsibility)
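As a sketch of how the two parts connect, the snippet below models a bluesky-side hook that hands a completed run to a DM workflow. This is a hypothetical stand-in: the real integration would go through the DM processing API (e.g. via apstools' `DM_WorkflowConnector`), and the workflow name, argument names, and file path here are placeholders, not actual conventions.

```python
# Hypothetical sketch of the bluesky -> DM handoff.
# start_dm_workflow() stands in for a real DM workflow submission call.

def start_dm_workflow(workflow_name, **args):
    """Stand-in for a DM workflow submission (hypothetical API)."""
    request = {"workflow": workflow_name, "args": args}
    # A real implementation would submit this request to the DM
    # processing service and return a job handle for monitoring.
    return request

def after_run_hook(run_uid, data_file):
    """Called when a run completes; kicks off reduction on demand."""
    return start_dm_workflow(
        "usaxs-flyscan-reduction",   # placeholder workflow name
        runUid=run_uid,
        filePath=data_file,
    )

req = after_run_hook("abc123", "/data/usaxs/scan_0001.h5")
```

The same hook could equally be called later, outside of data acquisition, which matches the "as a run is operating or afterwards" option above.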
Due to the differences in the type of data acquired, we'll need separate workflows and data processing for each of these types of measurement:
[ ] workflow for USAXS step scan
[ ] workflow for USAXS fly scan
[ ] workflow for SAXS or WAXS scan (area detector image acquisition)
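Since each measurement type gets its own workflow, the selection could be a simple dispatch table keyed on the technique recorded in the run metadata. The technique keys and workflow names below are illustrative placeholders, not agreed names.

```python
# Hypothetical mapping from measurement type to DM workflow name.
WORKFLOW_BY_TECHNIQUE = {
    "usaxs_step": "usaxs-step-reduction",
    "usaxs_fly": "usaxs-fly-reduction",
    "saxs": "pinhole-saxs-reduction",   # area detector image pipeline
    "waxs": "pinhole-waxs-reduction",   # area detector image pipeline
}

def pick_workflow(technique):
    """Select the DM workflow for a given scan technique."""
    try:
        return WORKFLOW_BY_TECHNIQUE[technique]
    except KeyError:
        raise ValueError(f"no DM workflow defined for {technique!r}")
```

Keeping the mapping in one table makes it easy to add or rename workflows as the DM side evolves.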
The code for these steps is available but probably needs adaptation to this new strategy:
[ ] data processing for USAXS step scan
[ ] data processing for USAXS fly scan
[ ] data processing for SAXS or WAXS scan (area detector image acquisition)
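One likely adaptation is wrapping the existing reduction code so a DM workflow step can invoke it as a command-line tool. The sketch below assumes nothing about the actual USAXS processing code; the `reduce_*` functions are placeholders for it, and the CLI shape is a guess at what a workflow step might call.

```python
# Hypothetical CLI wrapper letting a DM workflow step invoke the
# existing reduction code.  The reduce_* functions are placeholders.
import argparse

def reduce_usaxs_fly(path):        # stand-in for existing fly-scan code
    return f"reduced fly scan from {path}"

def reduce_usaxs_step(path):       # stand-in for existing step-scan code
    return f"reduced step scan from {path}"

REDUCERS = {"fly": reduce_usaxs_fly, "step": reduce_usaxs_step}

def main(argv=None):
    parser = argparse.ArgumentParser(description="USAXS reduction step")
    parser.add_argument("mode", choices=sorted(REDUCERS))
    parser.add_argument("data_file")
    args = parser.parse_args(argv)
    return REDUCERS[args.mode](args.data_file)
```

A workflow definition would then only need to run something like `python reduce.py fly /path/to/scan.h5`, keeping the DM script thin and the science code in one place.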
Decisions will need to be made about where raw and processed data are stored (locally or on Voyager) and in what formats (probably Bluesky databroker documents and HDF5). This needs consultation with the USAXS and APS DM teams; the decisions must then be implemented in the bluesky controls.
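Once made, those decisions could be captured in a single policy object that the bluesky controls read, so storage locations and formats are defined in one place. Every value below is a placeholder assumption, not an actual decision.

```python
# Hypothetical policy record for the storage/format decisions.
# All values shown are placeholders, not the team's actual choices.
from dataclasses import dataclass

@dataclass(frozen=True)
class DataPolicy:
    raw_location: str         # "local" or "voyager"
    processed_location: str   # "local" or "voyager"
    raw_format: str           # e.g. "databroker" (bluesky documents)
    processed_format: str     # e.g. "hdf5"

POLICY = DataPolicy(
    raw_location="voyager",
    processed_location="voyager",
    raw_format="databroker",
    processed_format="hdf5",
)
```

Centralizing the policy this way means changing a storage decision later touches one definition rather than every plan and workflow script.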