NSLS-II / Bug-Reports

Unified issue-tracker for bugs in the data acquisition, management, and analysis software at NSLS-II

CSX: issue with data key collision #190

Closed. cmazzoli closed this issue 6 years ago.

cmazzoli commented 6 years ago
In [155]: %run -i startup.py

In [156]: tardis.position
Out[156]: TardisPseudoPos(h=0.029868826994677654, k=-0.09422982436624192, l=0.9954125133565489)

In [157]: fccd.hints
Out[157]: {'fields': ['fccd_stats3_total', 'fccd_stats4_total']}

In [158]: #RE(asc(dets,pgm.energy,910,960,51))

In [159]: RE(mv(fccd.exposure,(1,0,1)))
Out[159]: ()

In [160]: RE(mv(fccd.exposure,(2,0,1)))
Out[160]: ()

In [161]: RE(asc(dets,pgm.energy,910,960,51))
Transient Scan ID: 99056     Time: 2018/03/28 13:13:04
Persistent Unique Scan ID: 'c8ed7f53-8aac-436b-8beb-80f21d6700cd'
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
~/Beamline/ScienceComm/2018_03_ZP/startup.py in <module>()
----> 1 RE(asc(dets,pgm.energy,910,960,51))

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/run_engine.py in __call__(self, *args, **metadata_kw)
    667                     # it (unless it is a canceled error)
    668                     if exc is not None:
--> 669                         raise exc
    670 
    671             if self._interrupted:

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/run_engine.py in _run(self)
   1115             self.log.error("Run aborted")
   1116             self.log.error("%r", err)
-> 1117             raise err
   1118         finally:
   1119             # Some done_callbacks may still be alive in other threads.

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/run_engine.py in _run(self)
   1014                         resp = self._response_stack.pop()
   1015                         try:
-> 1016                             msg = self._plan_stack[-1].send(resp)
   1017                         # We have exhausted the top generator
   1018                         except StopIteration:

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/preprocessors.py in __call__(self, plan)
   1236         plan = monitor_during_wrapper(plan, self.monitors)
   1237         plan = baseline_wrapper(plan, self.baseline)
-> 1238         return (yield from plan)

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/preprocessors.py in baseline_wrapper(plan, devices, name)
   1092         return (yield from plan)
   1093     else:
-> 1094         return (yield from plan_mutator(plan, insert_baseline))
   1095 
   1096 

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/preprocessors.py in plan_mutator(plan, msg_proc)
    136                     continue
    137                 else:
--> 138                     raise ex
    139         # if inserting / mutating, put new generator on the stack
    140         # and replace the current msg with the first element from the

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/preprocessors.py in plan_mutator(plan, msg_proc)
     89             ret = result_stack.pop()
     90             try:
---> 91                 msg = plan_stack[-1].send(ret)
     92             except StopIteration as e:
     93                 # discard the exhausted generator

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/preprocessors.py in monitor_during_wrapper(plan, signals)
    722     plan1 = plan_mutator(plan, insert_after_open)
    723     plan2 = plan_mutator(plan1, insert_before_close)
--> 724     return (yield from plan2)
    725 
    726 

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/preprocessors.py in plan_mutator(plan, msg_proc)
    136                     continue
    137                 else:
--> 138                     raise ex
    139         # if inserting / mutating, put new generator on the stack
    140         # and replace the current msg with the first element from the

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/preprocessors.py in plan_mutator(plan, msg_proc)
     89             ret = result_stack.pop()
     90             try:
---> 91                 msg = plan_stack[-1].send(ret)
     92             except StopIteration as e:
     93                 # discard the exhausted generator

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/preprocessors.py in plan_mutator(plan, msg_proc)
    136                     continue
    137                 else:
--> 138                     raise ex
    139         # if inserting / mutating, put new generator on the stack
    140         # and replace the current msg with the first element from the

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/preprocessors.py in plan_mutator(plan, msg_proc)
     89             ret = result_stack.pop()
     90             try:
---> 91                 msg = plan_stack[-1].send(ret)
     92             except StopIteration as e:
     93                 # discard the exhausted generator

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/preprocessors.py in fly_during_wrapper(plan, flyers)
    780     plan1 = plan_mutator(plan, insert_after_open)
    781     plan2 = plan_mutator(plan1, insert_before_close)
--> 782     return (yield from plan2)
    783 
    784 

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/preprocessors.py in plan_mutator(plan, msg_proc)
    136                     continue
    137                 else:
--> 138                     raise ex
    139         # if inserting / mutating, put new generator on the stack
    140         # and replace the current msg with the first element from the

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/preprocessors.py in plan_mutator(plan, msg_proc)
     89             ret = result_stack.pop()
     90             try:
---> 91                 msg = plan_stack[-1].send(ret)
     92             except StopIteration as e:
     93                 # discard the exhausted generator

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/preprocessors.py in plan_mutator(plan, msg_proc)
    136                     continue
    137                 else:
--> 138                     raise ex
    139         # if inserting / mutating, put new generator on the stack
    140         # and replace the current msg with the first element from the

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/preprocessors.py in plan_mutator(plan, msg_proc)
     89             ret = result_stack.pop()
     90             try:
---> 91                 msg = plan_stack[-1].send(ret)
     92             except StopIteration as e:
     93                 # discard the exhausted generator

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/plans.py in scan(detectors, motor, start, stop, num, per_step, md)
    261             yield from per_step(detectors, motor, step)
    262 
--> 263     return (yield from inner_scan())
    264 
    265 

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/utils.py in dec_inner(*inner_args, **inner_kwargs)
    962                 plan = gen_func(*inner_args, **inner_kwargs)
    963                 plan = wrapper(plan, *args, **kwargs)
--> 964                 return (yield from plan)
    965             return dec_inner
    966         return dec

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/preprocessors.py in stage_wrapper(plan, devices)
    871         return (yield from plan)
    872 
--> 873     return (yield from finalize_wrapper(inner(), unstage_devices()))
    874 
    875 

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/preprocessors.py in finalize_wrapper(plan, final_plan, pause_for_debug)
    432     cleanup = True
    433     try:
--> 434         ret = yield from plan
    435     except GeneratorExit:
    436         cleanup = False

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/preprocessors.py in inner()
    869     def inner():
    870         yield from stage_devices()
--> 871         return (yield from plan)
    872 
    873     return (yield from finalize_wrapper(inner(), unstage_devices()))

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/utils.py in dec_inner(*inner_args, **inner_kwargs)
    962                 plan = gen_func(*inner_args, **inner_kwargs)
    963                 plan = wrapper(plan, *args, **kwargs)
--> 964                 return (yield from plan)
    965             return dec_inner
    966         return dec

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/preprocessors.py in run_wrapper(plan, md)
    284         metadata to be passed into the 'open_run' message
    285     """
--> 286     rs_uid = yield from open_run(md)
    287 
    288     def except_plan(e):

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/plan_stubs.py in open_run(md)
    666     :func:`bluesky.plans.close_run`
    667     """
--> 668     return (yield Msg('open_run', **(md or {})))
    669 
    670 

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/preprocessors.py in plan_mutator(plan, msg_proc)
     73             # if we have a stashed exception, pass it along
     74             try:
---> 75                 msg = plan_stack[-1].throw(exception)
     76             except Exception as e:
     77                 # if we catch an exception,

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/utils.py in single_gen(msg)
    137         the input message
    138     '''
--> 139     return (yield msg)
    140 
    141 

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/preprocessors.py in plan_mutator(plan, msg_proc)
    161         try:
    162             # yield out the 'current message' and collect the return
--> 163             inner_ret = yield msg
    164         except GeneratorExit:
    165             # special case GeneratorExit.  We must clean up all of our plans

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/preprocessors.py in plan_mutator(plan, msg_proc)
    161         try:
    162             # yield out the 'current message' and collect the return
--> 163             inner_ret = yield msg
    164         except GeneratorExit:
    165             # special case GeneratorExit.  We must clean up all of our plans

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/preprocessors.py in plan_mutator(plan, msg_proc)
     73             # if we have a stashed exception, pass it along
     74             try:
---> 75                 msg = plan_stack[-1].throw(exception)
     76             except Exception as e:
     77                 # if we catch an exception,

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/utils.py in single_gen(msg)
    137         the input message
    138     '''
--> 139     return (yield msg)
    140 
    141 

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/preprocessors.py in plan_mutator(plan, msg_proc)
    161         try:
    162             # yield out the 'current message' and collect the return
--> 163             inner_ret = yield msg
    164         except GeneratorExit:
    165             # special case GeneratorExit.  We must clean up all of our plans

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/preprocessors.py in plan_mutator(plan, msg_proc)
    161         try:
    162             # yield out the 'current message' and collect the return
--> 163             inner_ret = yield msg
    164         except GeneratorExit:
    165             # special case GeneratorExit.  We must clean up all of our plans

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/preprocessors.py in plan_mutator(plan, msg_proc)
     73             # if we have a stashed exception, pass it along
     74             try:
---> 75                 msg = plan_stack[-1].throw(exception)
     76             except Exception as e:
     77                 # if we catch an exception,

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/plan_stubs.py in trigger_and_read(devices, name)
    755     from .preprocessors import rewindable_wrapper
    756     return (yield from rewindable_wrapper(inner_trigger_and_read(),
--> 757                                           rewindable))
    758 
    759 

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/preprocessors.py in rewindable_wrapper(plan, rewindable)
    612                                             restore_rewindable()))
    613     else:
--> 614         return (yield from plan)
    615 
    616 

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/plan_stubs.py in inner_trigger_and_read()
    748         ret = {}  # collect and return readings to give plan access to them
    749         for obj in devices:
--> 750             reading = (yield from read(obj))
    751             if reading is not None:
    752                 ret.update(reading)

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/plan_stubs.py in read(obj)
     85         Msg('read', obj)
     86     """
---> 87     return (yield Msg('read', obj))
     88 
     89 

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/preprocessors.py in plan_mutator(plan, msg_proc)
    161         try:
    162             # yield out the 'current message' and collect the return
--> 163             inner_ret = yield msg
    164         except GeneratorExit:
    165             # special case GeneratorExit.  We must clean up all of our plans

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/run_engine.py in _run(self)
   1065                         # exceptions (coming in via throw) can be
   1066                         # raised
-> 1067                         response = yield from coro(msg)
   1068                     # special case `CancelledError` and let the outer
   1069                     # exception block deal with it.

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/asyncio/coroutines.py in coro(*args, **kw)
    208         @functools.wraps(func)
    209         def coro(*args, **kw):
--> 210             res = func(*args, **kw)
    211             if (base_futures.isfuture(res) or inspect.isgenerator(res) or
    212                 isinstance(res, CoroWrapper)):

/opt/conda_envs/collection-2018-1.0.1/lib/python3.6/site-packages/bluesky/run_engine.py in _read(self, msg)
   1393                     raise ValueError("Data keys (field names) from {0!r} "
   1394                                      "collide with those from {1!r}"
-> 1395                                      "".format(obj, read_obj))
   1396 
   1397             # add this object to the cache of things we have read

ValueError: Data keys (field names) from NanoBundle(prefix='XF:23ID1-ES{Dif:Nano-Ax:', name='nanop', read_attrs=['tx', 'ty', 'tz', 'bx', 'by', 'bz'], configuration_attrs=['tx', 'ty', 'tz', 'bx', 'by', 'bz']) collide with those from NanoBundle(prefix='XF:23ID1-ES{Dif:Nano-Ax:', name='nanop', read_attrs=['tx', 'ty', 'tz', 'bx', 'by', 'bz'], configuration_attrs=['tx', 'ty', 'tz', 'bx', 'by', 'bz'])
mrakitin commented 6 years ago

That's possibly somehow related to the second bsui being started after the first one was stopped with Ctrl+Z. Just a guess...

mrakitin commented 6 years ago

@tacaswell, thoughts?

jrmlhermitte commented 6 years ago

Not sure, but I would guess you have two separate instances of NanoBundle. The RunEngine caches readings in a dictionary indexed by the object (it could also just be the name string, I cannot recall right now, but I really think it is the actual object itself). My guess is that when it sees a new object that is not the same instance, it tries to cache a new entry in the dictionary and, in doing so, detects colliding keys. I would look into that.
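
A minimal sketch of that collision check, not the actual bluesky source; it assumes a per-run cache keyed by the device object and field names taken from describe():

# Illustrative sketch only (not the real run_engine._read); assumes each
# device's describe() returns a dict whose keys are its field names.
_read_cache = {}  # device object -> set of field names already claimed

def check_read(obj):
    keys = set(obj.describe())
    for read_obj, known_keys in _read_cache.items():
        if obj is not read_obj and keys & known_keys:
            raise ValueError("Data keys (field names) from {0!r} "
                             "collide with those from {1!r}".format(obj, read_obj))
    _read_cache[obj] = keys

Two distinct NanoBundle instances that were both created with name='nanop' report identical field names, so the second read trips a check like this even though the objects themselves are different.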

I am wondering if it is something deeper which might involve this disconnect logic here: https://github.com/NSLS-II-CSX/xf23id1_profiles/commit/756d53b7492c9a609651a5ef752cf68311c7c29f#diff-b8b6535fb5ab4adb58462bc8b158c90aR87

@tacaswell ?

tacaswell commented 6 years ago

You almost certainly have two instances of nanop in dets; raising this error is the expected and correct behavior.

This is not related to having more than one bsui running at the same time: they are different processes, so they cannot talk to each other. I am very doubtful that this is related to ripping the signals off.

@cmazzoli, the fix is to make sure you do not have more than one instance of nanop in dets.
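
One way to enforce that, sketched under the assumption that dets is a plain Python list of ophyd devices (the helper below is illustrative, not an existing bluesky utility):

# Keep only the first device with each name so nanop cannot appear twice.
def dedupe_by_name(devices):
    unique = []
    for d in devices:
        if all(d.name != u.name for u in unique):
            unique.append(d)
    return unique

dets = dedupe_by_name(dets)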

jrmlhermitte commented 6 years ago

After discussion with @cmazzoli, the problem is re-running this file: https://github.com/NSLS-II-CSX/xf23id1_profiles/blob/master/profile_collection/startup/02-nanops.py#L122

They won't re-run the file.

cmazzoli commented 6 years ago

Dear all, thanks for the help and the valuable comments. We have a couple of questions, then, and a procedure to work out. First of all, we edit files and re-run them all the time (DAMA people coming to the beamline do the same). This is done WITHOUT restarting BS, as we are not always in a position to do so. Now, can somebody tell me why 02-nanops.py is apparently the only piece of code producing problems? Maybe it is related to sd.baseline += [nanop]. If so, Julien and I have "protected" it by adding:

# check if nanop already there and remove it
try:
    sd.baseline.remove(nanop)
except NameError:
    pass

Would this be OK with everybody?

In general, as was the case for us with the nanopositioners being commissioned and showing problems, beamlines have a number of pieces of equipment that may or may not be configured depending on the experiment, that might have to be removed on the fly because they cause problems, or that have to be tweaked during an experiment because they show limitations or need improvements and tuning. Are we saying that every time we perform these operations we need to restart BS?

On the same topic, we have a problem with imported libraries and relative paths in config (.. profiles.. /startup ) files. What about the option of importlib.reload?

mrakitin commented 6 years ago

Yes, @cmazzoli, I think that's the place where you populate the list with nanop multiple times. We need to be careful with how we delete it: there are multiple options, .remove() removes just the first occurrence (which may matter if you already have multiple instances of nanop there), and NameError may not catch the failure -- you may need a ValueError. Is the order of the motors important for sd.baseline? Maybe you can search for its index in the list (idx = sd.baseline.index(nanop)) and then replace just that element (sd.baseline[idx] = nanop). Other thoughts?
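
A sketch of both options, assuming this runs while the name nanop still points at the instance currently sitting in the baseline; note that list.remove() and list.index() raise ValueError, not NameError, when the element is absent:

# Option 1: remove any existing entry, then append the fresh instance.
try:
    sd.baseline.remove(nanop)
except ValueError:
    pass
sd.baseline.append(nanop)

# Option 2: replace the entry in place to preserve the baseline ordering.
try:
    sd.baseline[sd.baseline.index(nanop)] = nanop
except ValueError:
    sd.baseline.append(nanop)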

I think it's always safer to restart bsui rather than rerunning particular modules (done with %run -i <name>.py). I am not sure that reloading imports helps in this case; %run does it for you, but you already have the variables for the lists, etc., defined in your namespace. You may want to clear those variables when the module is loaded.
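
If re-running the startup module is unavoidable, one possible re-run-safe pattern (illustrative only) is to filter the baseline by device name rather than by identity, so stale instances left over from a previous %run are also caught:

# At the top of the startup module, before nanop is (re)created:
sd.baseline = [d for d in sd.baseline if getattr(d, 'name', '') != 'nanop']
# ... (re)create nanop here ...
sd.baseline.append(nanop)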