Closed fosskers closed 2 years ago
Alright, the `r2d2` approach appears to work when the caller passes a `Pool<M>` where `M: ManageConnection<Connection = Alpm>`. I'm able to use `rayon`'s `map_with` to auto-clone the `Pool` handle as it passes to each thread, then within each thread grab a pre-opened `Alpm` from the pool and work as I intended.
That's probably sufficient for my use-case for the moment, but do you still have any thoughts about the native `Send`ing from above?
> There are a number of read-only operations I'd like to perform
Since alpm lazy loads, nothing is really read-only. A call to `.install_size()` or anything like it will trigger interior mutability.
`Alpm` isn't `Send` because the callbacks don't have a `Send` bound. That could be added, but then you'd be unable to use `Rc` in callbacks.
In that case I'll move forward with the Connection Pool approach. Do you foresee any problems with having multiple handles to ALPM open at the same time?
Should be fine. But I also don't really see the need for concurrency.
Cool, then this can be closed.
Related to #6. We've talked in the past about sharing a single `Alpm` handle across threads, and now I've come to a point in my coding where it's finally (seemingly) necessary. There are a number of read-only operations I'd like to perform, namely native package lookups, and I'd like to do so concurrently. In my particular case I'm using `rayon`, and as a rule I'm keeping all usage of `async` out of this code base:

Here `alpm` is passed in by the caller as `&'a Alpm`. References are normally fine to send across threads without any other `Arc`ing or `Mutex`ing, and I intend nothing `mut`able. Unfortunately this doesn't seem possible at the moment, with the compiler telling me:

Wrapping the handle in `Arc`s, etc., doesn't solve it. So: I suppose I could open a handle per thread, but that's really not ideal. I could also use the `r2d2` pooling strategy that I figured out in the original Rust PoC, although maybe that would fail now for the same reason? I will check.

Anyway, let me know your thoughts. Thank you kindly.