prjemian closed this issue 2 years ago.
With the synApps docker IOC and prefix `sky:`, create a happi database (JSON text file) that describes 16 motors and one scaler.
```python
#!/usr/bin/env python
"""
create_happi_db.py: create the JSON file for happi

==========  ==============================
item        description
==========  ==============================
IOC prefix  sky:
motors      16: sky:m1 .. sky:m16
scaler      1: sky:scaler1
==========  ==============================
"""

import json
import logging
import re

from apstools.utils import dictionary_table

logging.basicConfig(level=logging.WARNING)
logger = logging.getLogger(__name__)

HAPPI_FILE = "sky_db.json"
IOC_PREFIX = "sky:"
MASTER_PREFIX = "zz_"  # Why "zz_"? Sorts last, makes happi happy!


def prefix_to_mnemonic(prefix):
    """
    Make an IOC prefix safe to use with the happi ``name`` attribute.

    ioc:                --> ioc_
    AD42:               --> ad42_
    13ID:Sim1:          --> zz_13id_sim1_
    XF:11IDB-BI{Cam:09} --> xf_11idb_bi_cam_09_
    """
    # required regexp from happi.item
    compliant_pattern = r'^[a-z][a-z\_0-9]*'
    replacement = '_'
    safer = prefix.lower()
    match = re.match(compliant_pattern, safer)
    if match is None:  # fix for IOC prefixes that start with other characters
        prefix = MASTER_PREFIX + prefix
        safer = prefix.lower()
        match = re.match(compliant_pattern, safer)
    last = match.span()[-1]
    while last < len(prefix):
        safer = match.group(0) + replacement + safer[last + 1:]
        match = re.match(compliant_pattern, safer)
        last = match.span()[-1]
    return safer


def motors(prefix="ioc:", num=8):
    """Create happi entries for the ophyd motors provided by the IOC."""
    db = dict()
    for i in range(num):
        mne = prefix_to_mnemonic(prefix).rstrip("_") + f"_m{i+1}"
        entry = dict(
            documentation=f"{prefix} motor {i+1}",
            prefix=f"{prefix}m{i+1}",
            _id=mne,
            name=mne,
            active=True,
            args=['{{prefix}}'],
            kwargs=dict(
                name='{{name}}',
                labels=["motors"],
            ),
            type='OphydItem',
            device_class="ophyd.EpicsMotor",
        )
        db[mne] = entry
    return db


def scaler(prefix="ioc:", num=1):
    """Create a happi entry for ophyd scaler ``num`` provided by the IOC."""
    db = dict()
    mne = prefix_to_mnemonic(prefix).rstrip("_") + f"_scaler{num}"
    entry = dict(
        documentation=f"{prefix} scaler {num}",
        prefix=f"{prefix}scaler{num}",
        _id=mne,
        name=mne,
        active=True,
        args=['{{prefix}}'],
        kwargs=dict(
            name='{{name}}',
            labels=["detectors"],
        ),
        type='OphydItem',
        device_class="ophyd.scaler.ScalerCH",
    )
    db[mne] = entry
    return db


def main():
    ioc = {}
    ioc.update(motors(IOC_PREFIX, 16))
    ioc.update(scaler(IOC_PREFIX, 1))
    print(dictionary_table(ioc))
    with open(HAPPI_FILE, "w") as fp:
        json.dump(ioc, fp, indent=2)


if __name__ == "__main__":
    main()
```
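For reference, here is the shape of one record the script writes to `sky_db.json`. This is a hand-built copy of the first motor's entry, mirroring what `motors("sky:", 16)` produces:

```python
import json

# hand-built copy of the first motor's record, as written to sky_db.json
entry = {
    "documentation": "sky: motor 1",
    "prefix": "sky:m1",
    "_id": "sky_m1",
    "name": "sky_m1",
    "active": True,
    "args": ["{{prefix}}"],  # happi fills these templates at load time
    "kwargs": {"name": "{{name}}", "labels": ["motors"]},
    "type": "OphydItem",
    "device_class": "ophyd.EpicsMotor",
}
print(json.dumps({"sky_m1": entry}, indent=2))
```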
Read the happi database (JSON text file), load the created ophyd objects into the global namespace, then demonstrate them:
```python
#!/usr/bin/env python

import logging

import happi
import happi.loader
import ophyd
from apstools.utils import device_read2table

logging.basicConfig(level=logging.WARNING)
logger = logging.getLogger(__name__)

HAPPI_FILE = "sky_db.json"


def get_happi_as_globals(happi_config_file):
    """Load every item in the happi database as an ophyd object."""
    hclient = happi.Client(path=happi_config_file)
    devs = {
        v.name: v
        for v in [
            happi.loader.from_container(_)
            for _ in hclient.all_items
        ]
    }
    return devs


devs = get_happi_as_globals(HAPPI_FILE)
print(f"{len(devs)} devices loaded.")
globals().update(devs)

device_read2table(devs["sky_m1"])

sky_scaler1.channels.chan01.s.name = "clock"
sky_scaler1.channels.chan02.s.name = "counter"
sky_scaler1.channels.chan03.s.name = "monitor"
sky_scaler1.select_channels(None)

sky_scaler1.preset_time.put(2.5)
status = sky_scaler1.trigger()
ophyd.wait(status)
sky_scaler1.preset_time.put(1)

device_read2table(sky_scaler1)
```
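The loading trick above boils down to: build a `name -> object` dict, then push it into `globals()`. A minimal sketch with stand-in objects (no EPICS or happi needed; `SimpleNamespace` is a hypothetical stand-in for the devices `happi.loader.from_container()` would return):

```python
from types import SimpleNamespace


def as_globals(devices):
    """Key each loaded object by its .name, ready for globals().update()."""
    return {obj.name: obj for obj in devices}


# stand-ins for what happi.loader.from_container() would return
fake_devices = [SimpleNamespace(name=f"sky_m{i + 1}") for i in range(3)]
devs = as_globals(fake_devices)
globals().update(devs)  # now sky_m1, sky_m2, sky_m3 are top-level names
```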
FWIW: so far, this does not demonstrate why such a database is worthwhile, since it is a lot more code just to create the objects. The intrinsic value of this move will become apparent later.
OK, I cheated a bit here and used this to load the devices: `globals().update(get_happi_as_globals(HAPPI_FILE))`, so `devs` is not defined in the global namespace.
In [21]: device_read2table(sky_m1)
==================== ===== ==========================
name value timestamp
==================== ===== ==========================
sky_m1 0.0 2020-08-26 15:54:47.675602
sky_m1_user_setpoint 0.0 2020-08-26 15:54:47.675602
==================== ===== ==========================
Out[21]: <pyRestTable.rest_table.Table at 0x7fb2d96fb100>
In [22]: sky_scaler1.channels.chan02.chname.put("clock")
...: sky_scaler1.channels.chan02.chname.put("counter")
...: sky_scaler1.channels.chan03.chname.put("monitor")
...: sky_scaler1.select_channels(None)
...:
...: sky_scaler1.preset_time.put(2.5)
...: status = sky_scaler1.trigger()
...: ophyd.wait(status)
...: sky_scaler1.preset_time.put(1)
...:
...: device_read2table(sky_scaler1)
================ ========== ==========================
name value timestamp
================ ========== ==========================
26000000.0 2020-08-26 16:16:29.246043
counter 10.0 2020-08-26 16:16:29.246043
monitor 13.0 2020-08-26 16:16:29.246043
sky_scaler1_time 2.6 2020-08-26 16:16:29.246043
================ ========== ==========================
Out[22]: <pyRestTable.rest_table.Table at 0x7fb2dabc8c10>
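A sanity check on those numbers, assuming the simulated synApps scaler counts its clock channel at the usual 10 MHz: 26,000,000 clock counts over the reported 2.6 s elapsed time is consistent.

```python
clock_counts = 26_000_000  # first row of the table above
elapsed = 2.6              # sky_scaler1_time from the same reading
freq_hz = clock_counts / elapsed
# consistent with a 10 MHz clock channel on the simulated scaler
assert abs(freq_hz - 10e6) < 1.0
```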
Try to scan scaler1 vs m1: `RE(bp.scan([sky_scaler1], sky_m1, -1, 0, 5))` and a famously long error results. The scaler has problems and won't `ct`.
As simple as this?
In [28]: sky_scaler1.channels.chan01.chname.get()
Out[28]: ''
In [29]: sky_scaler1.channels.chan01.chname.put("clock")
In [30]: ct
Nope, same problem.

Perhaps: `sky_scaler1.channels.chan01.s.name = "clock"`?

Yes.
Now it counts:
In [44]: sky_scaler1.channels.chan01.s.name = "clock"
In [45]: ct
[This data will not be saved. Use the RunEngine to collect data.]
clock 11000000.0
counter 5.0
sky_scaler1_time 1.1
and scans:
In [46]: RE(bp.scan([sky_scaler1], sky_m1, -1, 0, 5))
Transient Scan ID: 3844 Time: 2020-08-26 16:30:39
Persistent Unique Scan ID: 'e4180f3d-4213-4223-93ff-649eb9a870eb'
New stream: 'primary'
+-----------+------------+------------+------------+------------+
| seq_num | time | sky_m1 | clock | counter |
+-----------+------------+------------+------------+------------+
| 1 | 16:30:40.9 | -1.00000 | 11000000 | 5 |
| 2 | 16:30:42.5 | -0.75000 | 11000000 | 3 |
| 3 | 16:30:44.1 | -0.50000 | 11000000 | 6 |
| 4 | 16:30:45.7 | -0.25000 | 11000000 | 5 |
| 5 | 16:30:47.3 | 0.00000 | 11000000 | 5 |
+-----------+------------+------------+------------+------------+
generator scan ['e4180f3d'] (scan num: 3844)
Out[46]: ('e4180f3d-4213-4223-93ff-649eb9a870eb',)
The scalers have either too much or too little magic on the name / chname syncing...
Updated the example code above.
The revised way is more intuitive but (as you saw) not what I expected. A deep dive showed `.s.name` is a property. Wahoo! Love those. I'm learning just how useful the `dir(obj)` command is for debugging.
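For illustration, the distinction matters because assigning to a Python property runs a setter in-process, while `chname.put(...)` writes to an EPICS PV. A toy sketch of the property mechanics (not ophyd's actual implementation):

```python
class ToyChannel:
    """Toy stand-in for a scaler channel; not ophyd's real signal class."""

    def __init__(self):
        self._name = ""

    @property
    def name(self):
        return self._name

    @name.setter
    def name(self, value):
        # plain attribute assignment triggers this setter, pure Python
        self._name = value


chan = ToyChannel()
chan.name = "clock"  # looks like attribute assignment, runs the setter
# dir() reveals the property, which is how such things turn up in debugging
assert "name" in dir(chan)
```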
Kicking this down the road. Not enough interest to solve this now.
Use `happi.loader` to define ophyd objects.