aps-8id-dys / bluesky

XPCS bluesky instrument configuration

Independent function for writing the nx hdf file after acquisition is complete #103

Open qzhang234 opened 1 week ago

qzhang234 commented 1 week ago

We would like an independent function for writing the NeXus file after the acquisition function (containing bp.count) is called. We would prefer a plain Python function, independent of the Bluesky RunEngine (and of apstools, if possible), so that it is easier for beamline folks to develop, maintain, and debug.

If we can have just a template showing how it's done, I can populate the rest of the metadata fields in the HDF file upon request from Suresh and users.
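A minimal sketch of the kind of function being asked for, independent of the RunEngine. The catalog name "8idi_xpcs", the function name, and the None/dict handling are illustrative assumptions, not the beamline's actual code:

```python
import h5py
from databroker import catalog


def write_nexus_metadata(uid, filename, catalog_name="8idi_xpcs"):
    """Write a run's start-document metadata to a new HDF5 file (sketch)."""
    run = catalog[catalog_name][uid]          # look up the finished run by uid
    md = dict(run.metadata["start"])          # start-document metadata

    with h5py.File(filename, "w") as nx:
        group = nx.create_group("/entry/instrument/bluesky/metadata")
        for key, value in md.items():
            if value is None:
                value = ""                    # h5py cannot store None
            elif isinstance(value, (dict, list, tuple)):
                value = str(value)            # flatten structured values
            group.create_dataset(key, data=value)
        nx["/entry/entry_identifier"] = md["uid"]
```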

prjemian commented 1 week ago

Build Python code that creates a metadata file given a bluesky run uid. Issue https://github.com/aps-8id-dys/bluesky/issues/106#issuecomment-2457471193 has some examples. Here is a working set:

B003_004.zip

AZjk commented 1 week ago

please consider the following template as a minimal requirement.

/entry/instrument/bluesky/metadata/I0  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/I1  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/X_energy  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/absolute_cross_section_scale  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/acquire_period  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/acquire_time  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/bcx  Dataset {SCALAR}  -> beam_center_x
/entry/instrument/bluesky/metadata/bcy  Dataset {SCALAR}  -> beam_center_y
/entry/instrument/bluesky/metadata/beamline_id  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/ccdx  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/ccdx0  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/ccdy  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/ccdy0  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/concise  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/conda_prefix  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/cycle  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/dataDir  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/data_management  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/databroker_catalog  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/datetime  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/description  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/det_dist  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/detector_name  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/detectors  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/header  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/hints  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/iconfig  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/incident_beam_size_nm_xy  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/incident_energy_spread  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/index  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/instrument_name  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/login_id  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/metadatafile  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/num_capture  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/num_exposures  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/num_images  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/num_intervals  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/num_points  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/num_triggers  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/owner  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/pid  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/pix_dim_x  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/pix_dim_y  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/plan_args  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/plan_name  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/plan_type  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/proposal_id  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/qmap_file  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/run_start_uid  Dataset, same as /entry/entry_identifier
/entry/instrument/bluesky/metadata/safe_title  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/t0  Dataset {SCALAR}  (remove: duplicate of acquire_time)
/entry/instrument/bluesky/metadata/t1  Dataset {SCALAR}  (remove: duplicate of acquire_period)
/entry/instrument/bluesky/metadata/title  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/versions  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/workflow  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/xdim  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/xpcs_header  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/xpcs_index  Dataset {SCALAR}
/entry/instrument/bluesky/metadata/ydim  Dataset {SCALAR}
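A hedged sketch of how that template could be checked before the file is written: compare the run's metadata dictionary against the required keys and report any that are missing. The names REQUIRED_KEYS and missing_metadata_keys are illustrative, and the list is truncated here (the full set is the listing above):

```python
# Required metadata keys from the template above (truncated for brevity).
REQUIRED_KEYS = [
    "I0", "I1", "X_energy", "acquire_period", "acquire_time",
    "det_dist", "detector_name", "num_images", "proposal_id", "title",
    # ... remaining keys from the listing above ...
]


def missing_metadata_keys(md):
    """Return the required keys that are absent from the metadata dict."""
    return [key for key in REQUIRED_KEYS if key not in md]
```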

prjemian commented 1 week ago

@AZjk advises that this run is more representative of the full set of metadata keys needed:

prjemian commented 1 week ago

@AZjk was asked whether any information is needed from this file at addresses that do not start with /entry/instrument/bluesky/metadata/.

Nope.

prjemian commented 1 week ago

That makes this easier, then, since all content will come from the bluesky run's metadata dictionary. It places the responsibility on the bluesky session to provide all the expected metadata keys (listed above).
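One way the session could provide those keys (illustrative only, not necessarily how this beamline configures it): seed RE.md with the static values and pass per-measurement values through the plan's md argument, which bluesky merges into the run's start document.

```python
from bluesky import RunEngine
from bluesky import plans as bp

RE = RunEngine({})

# static, session-wide metadata (example values)
RE.md.update(
    {
        "beamline_id": "8-ID-I",
        "instrument_name": "XPCS",
    }
)


def acquire(detectors, **md):
    # per-measurement values (det_dist, acquire_time, ...) arrive as keyword
    # arguments and are merged into the run's start document by bp.count
    yield from bp.count(detectors, md=md)
```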

prjemian commented 2 days ago

@qzhang234, @sureshnaps: Should I close both #107 and #108? It's not obvious that either of these satisfies your expectations.

qzhang234 commented 2 days ago

We can probably leave it open for now. I can still use the template, and I will write the NeXus file per your instructions in the next couple of days. In the meantime I'll just put in hard-coded numbers for any PVs whose Ophyd implementations are not yet available.

Thanks!

QZ



prjemian commented 1 day ago

The data assignments in b1b1c44 look OK; you're on the right track. Do we still need to provide data in the locations Miaoqi described above?

Here are some of the problems with that code.

  1. You pushed that to the main branch. You should be working in a separate branch. (Perhaps a new one that references this issue.)
  2. You are assigning attributes to HDF5 groups that do not exist yet. Here's the first error you'd get when you run that code: KeyError: "Unable to synchronously open object (object 'entry' doesn't exist)". Make the NX_class assignments after you create all the datasets (and groups); see the sketch after this list.
  3. The NeXus base classes have specific definitions. For example, use a separate NXpositioner group for each positioner. Your code puts three positioners in one group.
  4. The NXmonitor group needs a single data field. Your code provides both I0 and I1. NXmonitor might not be the best choice for this data.
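A small h5py sketch of points 2 and 3 (file name and positioner names are illustrative, not the repo's actual code): create the groups and datasets first, then assign the NX_class attributes, with one NXpositioner group per positioner.

```python
import h5py

positions = {"sample_x": 1.0, "sample_y": 2.0, "sample_z": 3.0}

with h5py.File("example.h5", "w") as nx:
    entry = nx.create_group("entry")
    instrument = entry.create_group("instrument")

    # one NXpositioner group per positioner (point 3)
    for name, value in positions.items():
        pos = instrument.create_group(name)
        pos.create_dataset("value", data=value)
        pos.attrs["NX_class"] = "NXpositioner"

    # attributes assigned only after the groups exist (point 2)
    nx.attrs["default"] = "entry"
    entry.attrs["NX_class"] = "NXentry"
    instrument.attrs["NX_class"] = "NXinstrument"
```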