pcdshub / hutch-python

Launcher and config reader for LCLS interactive IPython sessions
https://pcdshub.github.io/hutch-python/

Load Additional "Beamlines" into hutch python session #334

Open slactjohnson opened 2 years ago

slactjohnson commented 2 years ago

Expected Behavior

It would be nice to be able to load devices from Happi with a bit more granularity than we currently have.

Current Behavior

Currently, hutches pour every device they would ever want to use into the happi database under the same "beamline" location. This is sufficient for now, but I can see issues cropping up once hutches have more than one interaction point (e.g. IP1 and IP2).

Possible Solution

Add an "endstation" flag or similar to happi and/or the hutch python CLI. This way, hutches could configure beamline devices shared among endstations as "beamline" devices, and devices specific to endstations as "endstation" devices.

There could be some utility to accepting more than one of these "endstation" or "area" flags (see context below).

I'm open to any other ideas/solutions that would achieve something similar.
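To make the idea concrete, here is a minimal sketch of the proposed "endstation" filtering, assuming device records carried an optional endstation tag alongside the existing beamline field. Plain dicts stand in for happi database entries, and the field name, device names, and `select_devices` helper are all hypothetical, not existing happi or hutch-python API:

```python
# Hypothetical happi entries: shared beamline devices have no endstation
# tag; endstation-specific devices carry one. Both the "endstation" field
# and these records are illustrative assumptions.
DEVICES = [
    {"name": "bt_mirror", "beamline": "LAS", "endstation": None},
    {"name": "mods1_stage", "beamline": "LAS", "endstation": "IP1_MODS"},
    {"name": "mods2_stage", "beamline": "LAS", "endstation": "CRIX_MODS"},
]


def select_devices(devices, beamline, endstations=()):
    """Pick shared beamline devices plus any requested endstations.

    Accepts multiple endstation flags, per the "more than one of these
    flags" suggestion above.
    """
    selected = []
    for dev in devices:
        if dev["beamline"] != beamline:
            continue
        if dev["endstation"] is None or dev["endstation"] in endstations:
            selected.append(dev["name"])
    return selected


# Loading laspython for IP1 would pull in the shared transport devices
# plus only the IP1 MODS devices:
select_devices(DEVICES, "LAS", ("IP1_MODS",))  # ['bt_mirror', 'mods1_stage']
```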

Context

I'm trying to provide a way to load the MODS devices into laspython. Currently, each MODS is set up as a separate "beamline" in happi to separate the devices in LUCID screens, e.g. beamline=IP1_MODS, beamline=CRIX_MODS. However, there are also some devices in the las "beamline" that users would want in their python sessions, such as beam-transport-related devices. In the future, I can also see wanting to add devices associated with whichever laser source is allocated to the hutch. It's trivial to create a laspython session that loads a specific MODS environment, but it would be nice to have a clean way to do this and also load the common las devices, which I do not see a way to do currently.

Your Environment

laspython

ZLLentz commented 2 years ago

This is a very good idea.

For hutch-python in isolation it'd probably be enough to add config options for loading multiple beamlines. We may even want to take it a step further and allow full control of the happi designations. Would we need to have different variants for one instrument? Using rix as an example, do we need a section in the config that looks like:

happi_variants:
  ip1:
    queries:
      - beamline=IP1_MODS
      - beamline=RIX
    use_lightpath: False
  crix:
    queries:
      - beamline=CRIX_MODS
      - beamline=RIX
    use_lightpath: False

And then we do some magic to allow rix3 ip1 and rix3 crix to access the different variants? Or is it enough to do something like

happi_queries:
  - beamline=RIX
  - beamline=IP1_MODS
  - beamline=CRIX_MODS

One neat benefit to these is that beamlines can opt out of loading upstream devices in their session. You could even have a bare variant that skips all happi loading.
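A rough sketch of what the flat happi_queries option could mean internally: run each key=value query as a separate search and union the results, deduplicating by device name. Plain dicts stand in for the happi database, and `run_queries` is an illustrative helper, not existing hutch-python behavior:

```python
# Hypothetical stand-in for a happi database; these records and the
# run_queries helper are assumptions for illustration only.
DB = [
    {"name": "rix_mirror", "beamline": "RIX"},
    {"name": "ip1_stage", "beamline": "IP1_MODS"},
    {"name": "crix_stage", "beamline": "CRIX_MODS"},
]


def run_queries(db, queries):
    """Union the results of several key=value queries, deduped by name.

    An empty query list naturally gives the "bare variant" that skips
    all happi loading.
    """
    seen = {}
    for query in queries:
        key, _, value = query.partition("=")
        for dev in db:
            if dev.get(key) == value and dev["name"] not in seen:
                seen[dev["name"]] = dev
    return list(seen)


run_queries(DB, ["beamline=RIX", "beamline=IP1_MODS"])  # ['rix_mirror', 'ip1_stage']
```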

For apps in general I think a happi "endstation" selection in the schema is appropriate. You could also use this key when setting up your hutch-python happi queries, or we could use it to make lucid more accommodating.
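The "some magic" for the variant option could be as simple as resolving the variant name from the command line into a list of happi-style search keywords. A minimal sketch, assuming the happi_variants config above has been parsed into a dict; `resolve_variant` and the config layout are hypothetical, not real hutch-python config handling:

```python
# Parsed form of the hypothetical happi_variants config section above.
HAPPI_VARIANTS = {
    "ip1": {"queries": ["beamline=IP1_MODS", "beamline=RIX"],
            "use_lightpath": False},
    "crix": {"queries": ["beamline=CRIX_MODS", "beamline=RIX"],
             "use_lightpath": False},
}


def resolve_variant(variants, name):
    """Turn a variant's query strings into search-kwarg dicts.

    Something like `rix3 ip1` would call this with name="ip1" and hand
    each returned dict to a happi search.
    """
    cfg = variants[name]
    searches = []
    for query in cfg["queries"]:
        key, _, value = query.partition("=")
        searches.append({key: value})
    return searches, cfg.get("use_lightpath", True)


resolve_variant(HAPPI_VARIANTS, "ip1")
# ([{'beamline': 'IP1_MODS'}, {'beamline': 'RIX'}], False)
```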

slactjohnson commented 2 years ago

I like the above setup a lot. I can imagine different variants for one instrument or even one endstation. Examples: IP1 LAMP vs IP1 CVMI vs IP1 MRCO, various (non-)standard configs that wander into and out of hutches at a reasonable rate, etc.