raphaelauv opened this issue 1 year ago
Thank you for the suggestion @raphaelauv, I'll bring this up!
Hi again @raphaelauv! Thank you for the suggestion, it is something we'd like to look into in the future, and if you have a PR then we'd love to take a look at that!
Dear @daveleek, I think async support is essential, but it's not backward compatible, and if we implement it in a backward-compatible way, it won't really benefit from being asynchronous. I think the better idea is to develop a dedicated module, either inside the repo or externally, that reuses some of the existing functionality but provides async interfaces and processing. There are many similar examples in Python libraries, such as database integration libs, third-party API clients, etc. What do you think?
@daveleek @snosratiershad -- can you help me understand which APIs provided by `UnleashClient` might block the async event loop? For example, we make heavy use of `is_enabled` in our app. Will that ever block the main thread, or does it always operate on a local cache, with state refreshed on another thread?
Dear @philipbjorge, there is only one thread in the SDK (even if we add async support in the future, it will still be a single thread with a single-threaded event loop). Are you trying to use unleash-client-python from different processes in a single application, or am I missing something?
@snosratiershad -- we're trying to use this client in a FastAPI application. I want to make sure that calling `is_enabled(...)` isn't going to block our main thread with things like blocking network calls.
I'm also interested in async support. In our synchronous apps we currently make use of a custom `event_callback` that publishes messages for analytics. It'd be awesome if we could use async event callbacks!

I think the httpx project uses a good strategy for organizing both sync and async clients: the sync client and the async client both subclass a common base client. I think there would be enough shared logic that such an approach could be fruitful here.

@snosratiershad, would such an approach be welcomed in this project?
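For illustration, a rough sketch of that structure applied to this SDK (the class and method names are invented for the example, not existing API):

```python
class BaseUnleashClient:
    """Hypothetical shared base: holds the locally cached toggle state
    and the pure, IO-free evaluation logic."""

    def __init__(self) -> None:
        self._toggles: dict[str, bool] = {}

    def _evaluate(self, feature_name: str) -> bool:
        # Pure in-memory lookup; identical for sync and async clients.
        return self._toggles.get(feature_name, False)


class SyncUnleashClient(BaseUnleashClient):
    def fetch_toggles(self) -> None:
        # Would use a blocking HTTP client (e.g. requests) here.
        ...

    def is_enabled(self, feature_name: str) -> bool:
        return self._evaluate(feature_name)


class AsyncUnleashClient(BaseUnleashClient):
    async def fetch_toggles(self) -> None:
        # Would use a non-blocking HTTP client (e.g. httpx.AsyncClient) here.
        ...

    async def is_enabled(self, feature_name: str) -> bool:
        return self._evaluate(feature_name)
```

The evaluation logic stays synchronous in both clients; only the transport differs.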
After spending a little more time getting acquainted with the code, one challenge I'm not sure how to resolve is doing non-blocking IO given the dependency on fcache.

Unless I'm missing something, I imagine the primary use case for an async client would be to call `is_enabled` and `get_variant` asynchronously. One could probably get away with synchronous client initialization, and I think that would drastically simplify an async-ish implementation. I wonder if any other projects do anything like that.

@snosratiershad, would such an approach be welcomed in this project?
@yjabri I'm not a maintainer of the project, but I think aiohttp and httpx both have pros and cons; I'm okay with httpx (and prefer aiohttp in this case). However, I'm afraid I have to disagree with switching the current request client to an httpx sync client. I'm looking for async support that is more separated and single-responsibility. It's better to discuss this in a PR; I'll create a draft version and submit it. ETA: 10 Oct.
> ...one challenge I'm not sure how to resolve is doing non-blocking IO given the dependency on fcache.
I'm thinking about https://github.com/aio-libs/aiocache, but it's not file-based. Do you think the cache needs to be file-based? I'll also look into https://github.com/grantjenks/python-diskcache and report back here (it seems to work fine alongside an async event loop).
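For illustration, a rough sketch of keeping a synchronous, file-based cache like diskcache off the event loop (the cache directory and key are placeholders):

```python
import asyncio

import diskcache

cache = diskcache.Cache("/tmp/unleash-cache")  # placeholder directory

async def save_toggles(payload: dict) -> None:
    # Cache.set does blocking file IO, so run it in a worker thread
    # instead of on the event loop.
    await asyncio.to_thread(cache.set, "toggles", payload)

async def load_toggles() -> dict:
    return await asyncio.to_thread(cache.get, "toggles", {})
```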
@sighphyre I'm going to take this issue and work on it. Do you agree with a separate async module like `UnleashAsyncClient` inside the repo?
For what it's worth, I have projects that rely on the file cache when using Unleash in a multiprocessing setting.
I am also using `UnleashClient` in my FastAPI project.

Am I correct that, since this project uses APScheduler with BackgroundExecutor, the refresh method is run in a separate thread? And I assume, without being familiar with the codebase, that `is_enabled` is not going to block.

In that case, except for `initialize_client`, which I could run in a separate thread, I can use `UnleashClient` without it blocking my event loop. Is this right?
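For illustration, a minimal sketch of that setup in a FastAPI app (the URL, app name, and flag name are placeholders):

```python
import asyncio

from fastapi import FastAPI
from UnleashClient import UnleashClient

app = FastAPI()

# Placeholder URL and app name.
unleash = UnleashClient(url="https://unleash.example.com/api", app_name="my-fastapi-app")

@app.on_event("startup")
async def start_unleash() -> None:
    # initialize_client does blocking setup work (initial fetch, scheduler start),
    # so push it onto a worker thread instead of the event loop.
    await asyncio.to_thread(unleash.initialize_client)

@app.get("/demo")
async def demo() -> dict:
    # Reads the locally cached toggle state; refreshes happen on the
    # background scheduler thread.
    return {"enabled": unleash.is_enabled("demo-flag")}
```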
> @sighphyre I'm going to take this issue and work on it. Do you agree with a separate async module like `UnleashAsyncClient` inside the repo?
Hey @snosratiershad, that sounds sane to me, but I think I'd have to see the proposed structure before I can promise anything.
I think the core of the problem is breaking apart the scheduler code from the code that evaluates the state of toggles. The async stuff should just deal with the scheduler and metrics code. If we can get that separated without majorly impacting the current API, I'd be very open to it.
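For illustration, a very rough sketch of that separation (all names here are invented, and the fetch loop simply stands in for the scheduler and metrics code):

```python
import asyncio

import httpx

class ToggleCache:
    """Plain, synchronous toggle evaluation over in-memory state."""

    def __init__(self) -> None:
        self._toggles: dict[str, bool] = {}

    def update(self, toggles: dict[str, bool]) -> None:
        self._toggles = toggles

    def is_enabled(self, name: str) -> bool:
        return self._toggles.get(name, False)


async def refresh_loop(cache: ToggleCache, url: str, interval: float = 15.0) -> None:
    """Async stand-in for the fetching side only; evaluation stays sync."""
    async with httpx.AsyncClient() as client:
        while True:
            response = await client.get(url)
            # The payload shape is simplified for the sketch.
            cache.update(response.json())
            await asyncio.sleep(interval)
```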
@GilbertYoder @philipbjorge
Hey folks, just to answer the question around blocking: this should never block. Polling the Unleash server is done in a background thread.
> Unless I'm missing something, I imagine the primary use case for an async client would be to call `is_enabled` and `get_variant` asynchronously. One could probably get away with synchronous client initialization, and I think that would drastically simplify an async-ish implementation.
I'm really trying to avoid this; it opens the door to asynchronous custom strategies, and I think that'd be a major anti-pattern here.

The primary use case I can see for async is simply not blocking the current thread while toggles are fetched from the upstream Unleash server, and not having to deal with multiple threads across process boundaries.
@sighphyre To clarify, are you saying you'd like to avoid an async `is_enabled` and an async `get_variant`? I'm a little confused how that would work if you wanted to perform non-blocking IO in `event_callback`, for example to publish a message.
FWIW, Mongo's async package, Motor, works by wrapping the sync PyMongo driver with `run_in_executor`.
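For illustration, a rough sketch of that Motor-style wrapping applied to this client (the `AsyncUnleashWrapper` class is invented for the example, not part of the SDK):

```python
import asyncio
from concurrent.futures import ThreadPoolExecutor

from UnleashClient import UnleashClient

class AsyncUnleashWrapper:
    """Hypothetical Motor-style wrapper: every call delegates to the existing
    synchronous client on a thread pool, keeping the event loop free."""

    def __init__(self, client: UnleashClient) -> None:
        self._client = client
        self._executor = ThreadPoolExecutor(max_workers=1)

    async def initialize(self) -> None:
        loop = asyncio.get_running_loop()
        await loop.run_in_executor(self._executor, self._client.initialize_client)

    async def is_enabled(self, feature_name: str) -> bool:
        loop = asyncio.get_running_loop()
        return await loop.run_in_executor(
            self._executor, self._client.is_enabled, feature_name
        )
```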
Async support based on httpx or aiohttp would be great
thanks all