project8 / dripline

Slow controls for medium-scale physics experiments, based on centralized AMQP messaging
http://www.project8.org/dripline

send_request should make use of the async backend #135

Closed: laroque closed this issue 7 years ago

laroque commented 9 years ago

WARNING!... this isn't actually a starter project, it is kind of advanced.

Currently, sending a request involves using threading and a new connection to send a message and get a reply. This feels super dirty. I'd like to do something like:

import datetime
import uuid

timeout_time = datetime.datetime.now() + datetime.timedelta(seconds=timeout_in_seconds)
correlation_id = uuid.uuid4()
self.send_message(<your request message stuff>, correlation_id=correlation_id)
# poll the local reply store until the reply shows up or the timeout expires
while self._results.get(correlation_id) is None and datetime.datetime.now() < timeout_time:
    # some yielding block that lets the service process any new messages

... but I haven't got a solution for the yielding code block
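One possibility for that yielding block, assuming the service holds a pika BlockingConnection as self._connection (the attribute name here is a guess), would be to briefly drive pika's I/O loop so the reply consumer callback gets a chance to run and fill in self._results:

# sketch only: let pika service the socket and dispatch any pending
# deliveries (including our reply) for up to 0.1 s per loop iteration
self._connection.process_data_events(time_limit=0.1)

(This assumes at least pika 0.10, where process_data_events accepts a time_limit argument.)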

The method should just send a request and then watch a local dictionary of responses for one with a matching correlation_id. Two issues still need solutions:
1) a timeout for getting the reply still has to be supported
2) the data structures must not grow without bound, whether from replies that accumulate but never get delivered anywhere, or from records of requests still awaiting a reply
A sketch of one way to handle both follows this list.
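Below is a minimal sketch covering both points, using a plain dictionary of pending requests with per-request deadlines; the class and method names are illustrative, not dripline's actual API:

import datetime
import uuid


class ReplyStore(object):
    """Illustrative store for replies keyed by correlation_id."""

    def __init__(self):
        self._pending = {}  # correlation_id -> (deadline, reply or None)

    def register(self, timeout_seconds):
        # record a new outstanding request and hand back its correlation_id
        correlation_id = str(uuid.uuid4())
        deadline = datetime.datetime.now() + datetime.timedelta(seconds=timeout_seconds)
        self._pending[correlation_id] = (deadline, None)
        return correlation_id

    def store_reply(self, correlation_id, reply):
        # attach a reply only if the request is still known; otherwise drop it
        if correlation_id in self._pending:
            deadline, _ = self._pending[correlation_id]
            self._pending[correlation_id] = (deadline, reply)

    def pop_reply(self, correlation_id):
        # return the reply if it has arrived, else None; clean up on success
        deadline, reply = self._pending.get(correlation_id, (None, None))
        if reply is not None:
            del self._pending[correlation_id]
        return reply

    def prune(self):
        # discard entries past their deadline so neither unclaimed replies
        # nor records of requests still awaiting a reply grow without bound
        now = datetime.datetime.now()
        for cid, (deadline, _) in list(self._pending.items()):
            if now > deadline:
                del self._pending[cid]

Calling prune() periodically, for example at the top of the message handler, keeps the dictionary bounded even when replies are never collected or never arrive.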

laroque commented 9 years ago

ugh... why do I keep doing this to myself. The send_request method of the service is blocking, which means it cannot then handle_reply because it is still busy in send_request. Python 3's yield from syntax may offer a nice solution to this; I'll play with pika 0.10 and python 3.4 at some point, but for now I should give up.
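For the record, here is a rough sketch of how the python 3 coroutine style could look with asyncio. It is not tied to pika's API, and the class, publish, and handler names are placeholders rather than dripline code:

import asyncio
import uuid


class AsyncRequester(object):
    """Illustrative coroutine-style requester; names are placeholders."""

    def __init__(self, loop=None):
        self._loop = loop or asyncio.get_event_loop()
        self._futures = {}  # correlation_id -> Future waiting on a reply

    @asyncio.coroutine
    def send_request(self, request, timeout=10):
        correlation_id = str(uuid.uuid4())
        future = asyncio.Future(loop=self._loop)
        self._futures[correlation_id] = future
        self._publish(request, correlation_id)  # placeholder for the AMQP publish
        try:
            # yield from suspends this coroutine; the event loop keeps
            # running, so other messages can still be handled meanwhile
            reply = yield from asyncio.wait_for(future, timeout)
        finally:
            # drop the bookkeeping entry whether we got a reply or timed out
            self._futures.pop(correlation_id, None)
        return reply

    def handle_reply(self, correlation_id, reply):
        # called by the consumer when a reply arrives; wakes the waiting coroutine
        future = self._futures.get(correlation_id)
        if future is not None and not future.done():
            future.set_result(reply)

    def _publish(self, request, correlation_id):
        # placeholder: publish the request over AMQP with this correlation_id
        pass

The key point is that yield from asyncio.wait_for(...) suspends send_request instead of blocking the thread, so the same service can keep consuming messages and call handle_reply when the response arrives; the finally block also removes the bookkeeping entry on timeout, so the futures dict stays bounded.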

guiguem commented 7 years ago

This issue was moved to project8/dripline-python#3