Closed maartenl945 closed 5 years ago
I'm facing the same issue. The most useful tests would be asynchronous in my case too.
I tried to work around this using robotframework-async, but either I'm using it incorrectly or it simply doesn't work: the second thread, in which the MQTT communication is supposed to take place, doesn't seem to start, produces nothing on stdout/stderr, and causes Robot to hang indefinitely waiting for that thread to finish.
Has anybody found a workaround for this yet?
I ended up using Robot Framework's Start Process keyword to wrap the MQTT CLI tools (e.g. mosquitto_sub/mosquitto_pub) and abandoned this MQTT library.
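A generic sketch of that approach in Python (the library's own language): start a CLI subscriber in the background, let the test act in between, then stop it and collect its output. The `mosquitto_sub` invocation in the comment is a placeholder; any long-running subscriber command fits, and the demo below substitutes a short Python child process so it runs without a broker.

```python
import subprocess
import sys
import time

def start_subscriber(cmd):
    """Start a CLI subscriber (e.g. ["mosquitto_sub", "-t", "my/topic"])
    as a background process, capturing its output."""
    return subprocess.Popen(cmd, stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE, text=True)

def stop_and_collect(proc, timeout=5):
    """Terminate the subscriber and return everything it printed so far."""
    proc.terminate()
    try:
        out, _ = proc.communicate(timeout=timeout)
    except subprocess.TimeoutExpired:
        proc.kill()
        out, _ = proc.communicate()
    return out

if __name__ == "__main__":
    # Placeholder for: mosquitto_sub -h broker -t my/topic
    # (here a child that prints one "message" and then idles, unbuffered).
    sub = start_subscriber([sys.executable, "-u", "-c",
                            "print('message on my/topic'); "
                            "import time; time.sleep(30)"])
    time.sleep(0.5)  # step in between: trigger the device under test here
    print(stop_and_collect(sub).strip())
```

This mirrors what Start Process / Terminate Process do in Robot, which is why the wrap works: the subscription stays alive across intermediate test steps.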
Thanks for that suggestion. Will look into it.
@randomsync We're now using the solution introduced in PR #9, which seems to work fine. The advantage over wrapping the mosquitto tools is that it is pure Python and therefore works on multiple platforms. I think some form of asynchronous subscription to messages is indispensable for any serious use of the library, so it would be nice if this solution were made part of the official library.
@maartenl945 thanks for tagging me, as I didn't receive any notifications for this thread. It's been a while since I worked on this library, so please bear with me while I review this PR.
This has been merged and released in v0.6.0. Thanks @janvanoverwalle
Great, thanks @randomsync and @janvanoverwalle !!
Hi,
I've just started using your library to test whether our product publishes an MQTT message at the right moment. However, I noticed that the library currently only allows subscribing to a topic and waiting for a message within a single call. In most test cases, I would think, the subscription and the wait need to be separate steps, e.g.:
1. Subscribe to a topic
2. Trigger the product to publish a message
3. Wait for the message on that topic
Since steps 1 and 3 are only possible within a single library call (Subscribe), this cannot be done: all steps in Robot Framework are executed sequentially, so there is no room for step 2 in between.
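The subscribe/act/wait split being asked for is essentially a background listener feeding a queue. A minimal Python sketch of that pattern, with the real MQTT client replaced by a plain queue (in an actual setup, a client's message callback would call `deliver`; the class and method names here are illustrative, not the library's API):

```python
import queue
import threading

class AsyncSubscriber:
    """Collects messages in the background so the test can act in between
    subscribing (step 1) and waiting (step 3)."""

    def __init__(self):
        self._messages = queue.Queue()

    def deliver(self, payload):
        # In a real client this would be wired to the on-message callback
        # registered at subscription time (step 1).
        self._messages.put(payload)

    def wait_for_message(self, timeout=5.0):
        # Step 3: block until a message arrives, failing cleanly on timeout.
        try:
            return self._messages.get(timeout=timeout)
        except queue.Empty:
            raise AssertionError("no message received within %.1fs" % timeout)

if __name__ == "__main__":
    sub = AsyncSubscriber()                   # step 1: subscription is active
    # step 2: trigger the device under test; simulated here with a timer
    threading.Timer(0.2, sub.deliver, args=("hello",)).start()
    print(sub.wait_for_message(timeout=2.0))  # step 3 -> prints "hello"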
Do you have any suggestion on how to do this with the library, or does the library need to be extended for that?
Regards, Maarten