**dBeau** opened this issue 1 year ago (status: Open)
@dBeau It seems to me that using one websocket client for several different tasks is not quite right. Maybe it makes sense to use one websocket client for the Stream Deck application and another for your own tasks?

For these purposes there is a decorator, `in_separate_thread`, which can be imported like this: `from streamdeck_sdk import in_separate_thread`. It allows you to run your websocket client (synchronous or asynchronous) in a separate thread. It can also be used for some kind of monitoring, such as collecting CPU load information.
Here is an example of what it might look like:
```python
import asyncio
import random

from streamdeck_sdk import in_separate_thread, StreamDeck, logger


async def async_printer(i: int):
    sleep_time = random.random() * 100
    await asyncio.sleep(sleep_time)
    logger.info(f"async_printer: {i=}, {sleep_time=}")


async def asynchronous():
    tasks = [asyncio.ensure_future(async_printer(i=i)) for i in range(100)]
    await asyncio.wait(tasks)


@in_separate_thread(daemon=True)
def run_async_printer():
    event_loop = asyncio.new_event_loop()
    event_loop.run_until_complete(asynchronous())


run_async_printer()

if __name__ == '__main__':
    import os
    from pathlib import Path

    PLUGIN_LOGS_DIR_PATH: Path = Path(os.environ["PLUGIN_LOGS_DIR_PATH"])
    PLUGIN_NAME: str = os.environ["PLUGIN_NAME"]
    LOG_FILE_PATH: Path = PLUGIN_LOGS_DIR_PATH / Path(f"{PLUGIN_NAME}.log")

    StreamDeck(
        log_file=LOG_FILE_PATH,
    ).run()
```
This code will write messages to the log both from `streamdeck_sdk` (for example, when you press a key that has an action from the plugin) and from the `async_printer` function.
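For intuition, a minimal version of such a thread-launching decorator might look like this. This is an illustrative sketch only, not the library's actual implementation — `streamdeck_sdk`'s real `in_separate_thread` may differ in details:

```python
import functools
import threading


def in_separate_thread_sketch(daemon: bool = False):
    """Sketch of a decorator that runs the wrapped function in its own thread."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            thread = threading.Thread(
                target=func, args=args, kwargs=kwargs, daemon=daemon
            )
            thread.start()
            return thread  # the caller can join() if it needs to wait
        return wrapper
    return decorator


results = []


@in_separate_thread_sketch(daemon=True)
def background_task(value):
    # runs in its own thread each time background_task() is called
    results.append(value)


thread = background_task(42)
thread.join()
print(results)  # [42]
```

The key design point is that the decorated call returns immediately in the main thread, which is why the monitoring function does not block `StreamDeck.run()`.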
The `@in_separate_thread` decorator can also work with class methods. For example, if you need to change the key's state, change the icon, or do something else depending on the monitoring data, you can put the monitoring functions in your Action class so they have access to the class attributes and methods. Then wrap the main monitoring entry point with `@in_separate_thread(daemon=True)` and call it after creating an instance of your Action. Here is an example:
```python
from streamdeck_sdk import in_separate_thread, StreamDeck, Action


class MyAction(Action):
    UUID = "myaction"

    @in_separate_thread(daemon=True)
    def run_monitoring(self):
        ...


my_action = MyAction()
my_action.run_monitoring()

if __name__ == '__main__':
    StreamDeck(
        actions=[
            my_action,
        ]
    ).run()
```
Maybe this will help you more in solving your problem?
@gri-gus, thanks much for your well-thought-out answer. You are correct that multiple threads can be used to solve my problem. However, I like to avoid threads as much as possible so as to not worry about interactions between them. ...suddenly, this turns into a discussion of design philosophy.
To be clearer though (perhaps?), I wasn't suggesting using a single websocket client for different tasks; I'm not sure what that would even mean. Multiple websocket clients in a single program would, however, make perfectly good sense to me. In my case, though, there is just one websocket client, and that is the one used by streamdeck-sdk to talk to the Stream Deck application. The other sockets I was referring to were good ole TCP streams.
My goal is to have a single event loop service all of the I/O for the plug-in. The rel library is one way of doing this. The asyncio library, currently very popular, also has facilities for doing this; I might go so far as to suggest that this is why asyncio exists. In effect, it provides an alternative to using threads in I/O-heavy applications. To succeed in this goal it's necessary for libraries like streamdeck-sdk to either work directly with asyncio or at the very least expose their I/O in a generic way for use by other async libraries (curio, trio, AnyIO, pyevent, rel, even twisted are all examples). The websocket library does this by allowing the user to provide a 'dispatcher' to its `run_forever()` method. In effect, it's happy to not worry about checking its TCP sockets for reads/writes and hands that responsibility off to the dispatcher.
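The single-event-loop approach described above can be sketched with plain asyncio. The two tasks below are hypothetical stand-ins (nothing here is real streamdeck-sdk code) for the Stream Deck websocket traffic and a plain TCP stream, sharing one loop with no threads involved:

```python
import asyncio

log = []


async def fake_streamdeck_client():
    # stand-in for the websocket traffic to the Stream Deck application
    for _ in range(3):
        await asyncio.sleep(0.01)
        log.append("streamdeck event")


async def fake_tcp_service():
    # stand-in for a long-running TCP connection to another service
    for _ in range(3):
        await asyncio.sleep(0.015)
        log.append("tcp data")


async def main():
    # one event loop services all of the plug-in's I/O concurrently
    await asyncio.gather(fake_streamdeck_client(), fake_tcp_service())


asyncio.run(main())
print(len(log))  # 6
```

Both "connections" make progress interleaved on a single thread, which is exactly the property the dispatcher hand-off is meant to preserve for real sockets.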
Again, the suggestion you provided will work. Thanks much for that and for streamdeck-sdk too.
*Edit:* Actually, I can't seem to demonstrate that the monitoring thread is actually running. To test this I attempted to open and write to a hardcoded path from within `run_monitoring`, but the file is never written.
I'm having trouble with logging when performing asyncio as described. I thought this might be because logging was not yet set up during the init of StreamDeck, so I first constructed the StreamDeck instance:
```python
my_action = MyAction()

# Initializing StreamDeck here so that logging is configured
# before calling run_monitoring
stream_deck = StreamDeck(
    actions=[
        my_action,
    ],
    log_file=settings.LOG_FILE_PATH,
    log_level=settings.LOG_LEVEL,
    log_backup_count=1,
)

my_action.run_monitoring()

if __name__ == '__main__':
    stream_deck.run()
```
with `run_monitoring` defined as:

```python
@in_separate_thread(daemon=True)
async def run_monitoring(self):
    logger = logging.getLogger(__name__)
    logger.debug('run_monitoring logging test')
```
My logging from elsewhere within `MyAction` works. Any ideas?
https://github.com/gri-gus/streamdeck-python-sdk/issues/2#issuecomment-2344429712 I'm not sure an asynchronous function operates correctly inside the `in_separate_thread` decorator. In general, everything so far has been written for synchronous code; an asynchronous version will come later.
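The failure mode above can be reproduced with a minimal hypothetical thread decorator (a stand-in, not the library's code): calling an `async def` function only builds a coroutine object, so a thread whose target is the coroutine function never executes the body — which would explain why neither the log line nor the hardcoded file ever appears:

```python
import threading

ran = []


def in_thread(func):
    # hypothetical minimal stand-in for a thread-launching decorator
    def wrapper(*args, **kwargs):
        t = threading.Thread(target=func, args=args, kwargs=kwargs, daemon=True)
        t.start()
        return t
    return wrapper


@in_thread
async def run_monitoring():
    ran.append("monitoring ran")


t = run_monitoring()
t.join()

# The thread called run_monitoring(), which merely created a coroutine
# object and discarded it; Python typically emits
# "RuntimeWarning: coroutine 'run_monitoring' was never awaited".
print(ran)  # [] -- the coroutine body never executed
```

A decorator that supports `async def` would instead need to detect a coroutine function and drive it to completion inside the thread, e.g. with `asyncio.run()`.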
The plug-in I am writing makes long-running socket connections to other services. These connections need to be read from and written to in order to maintain the state needed by the plug-in. The existing `StreamDeck.run()` method requires the use of the websocket `run_forever()` method and does not provide access to its event loop or a way to select over other socket connections. It would be helpful if either the event loop was exposed or if the StreamDeck class could use an externally provided event loop.
A quick fix can be employed by the StreamDeck user with a simple modification. The idea is to import the `rel` module and pass its dispatcher to the `StreamDeck.run()` method; the websocket examples show how this works. On the plug-in side, this amounts to calling `run()` with the dispatcher once the modifications are in place.

Then in `sdk.py` two changes are needed: the first is for the `run()` method to accept a `dispatcher` parameter, and the second is for it to make use of it:
```python
self.ws.run_forever(dispatcher=dispatcher)
```
This is a very simple change based on some older technology that has largely been replaced by asyncio. An asyncio solution could be "better", but just two lines and no breakage is hard to beat.
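For what it's worth, the "dispatcher" idea — one loop watching many sockets and invoking callbacks on readiness — can be demonstrated with nothing but the stdlib `selectors` module. This is an illustrative sketch only; `rel` and asyncio are built on the same readiness mechanism:

```python
import selectors
import socket

sel = selectors.DefaultSelector()
received = []


def on_readable(sock):
    # callback invoked by the loop when the registered socket has data
    received.append(sock.recv(1024))


# a connected socket pair stands in for a real TCP connection
writer, reader = socket.socketpair()
sel.register(reader, selectors.EVENT_READ, data=on_readable)

writer.sendall(b"hello from another service")

# one pass of the event loop: wait for readiness, then dispatch callbacks
for key, _events in sel.select(timeout=1.0):
    key.data(key.fileobj)

print(received)  # [b'hello from another service']

sel.close()
writer.close()
reader.close()
```

Any number of sockets (the Stream Deck websocket included) could be registered with the same selector, which is exactly the responsibility `run_forever()` hands off when given a dispatcher.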