epwalsh / batched-fn

🦀 Rust server plugin for deploying deep learning models with batched prediction
https://crates.io/crates/batched-fn
Apache License 2.0
19 stars 2 forks

How to use in an actix server? #16

Closed: failable closed this 2 years ago

failable commented 2 years ago

Hi, thanks for this library!

I'm trying to use the library to write a demo using rust-bert and actix. When I tried to put the model in batched_fn!, I got an error like:


error[E0277]: `*mut torch_sys::C_tensor` cannot be shared between threads safely
   --> src/routes.rs:28:25
    |
28  |       let batch_predict = batched_fn! {
    |  _________________________^
29  | |         handler = |batch: Vec<(Tensor, Tensor, Tensor, Tensor, Tensor)>, model: &PredictModel| -> Vec<String> {
30  | |             let output = model.predict(batch.clone());
31  | |             println!("Processed batch {:?} -> {:?}", batch, output);
...   |
40  | |         };
41  | |     };
    | |_____^ `*mut torch_sys::C_tensor` cannot be shared between threads safely
    |
    = help: within `(tch::Tensor, tch::Tensor, tch::Tensor, tch::Tensor, tch::Tensor)`, the trait `Sync` is not implemented for `*mut torch_sys::C_tensor`
    = note: required because it appears within the type `tch::Tensor`
    = note: required because it appears within the type `(tch::Tensor, tch::Tensor, tch::Tensor, tch::Tensor, tch::Tensor)`
note: required by a bound in `BatchedFn`
   --> /Users/user/.asdf/installs/rust/1.59.0/registry/src/github.com-1ecc6299db9ec823/batched-fn-0.2.2/src/lib.rs:227:25
    |
227 |     T: 'static + Send + Sync + std::fmt::Debug,
    |                         ^^^^ required by this bound in `BatchedFn`
    = note: this error originates in the macro `$crate::__batched_fn_internal` (in Nightly builds, run with -Z macro-backtrace for more info)

Does the model need to be Sync and Send? I know there's the rust-dl-webserver project, but I don't quite understand the differences between actix and warp, as I'm pretty new to Rust. Could you provide a simple actix example, or help me understand the usage of batched_fn!? E.g. how do context, config, and handler work? Are they all required? Is the code inside context run only once for initialization (like loading the model)? Should one put other fields besides model in context?

Many thanks.

epwalsh commented 2 years ago

Hi @liebkne! See https://docs.rs/batched-fn/latest/batched_fn/#implementation-details, in particular:

When the batched_fn macro is invoked it spawns a new thread where the handler will be run. Within that thread, every object specified in the context is initialized and then passed by reference to the handler each time it is run.

All three arguments (handler, config, and context) are required, and yes, the code inside context is only run once for initialization.
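To make that concrete, here is a rough, untested sketch adapted from the example in the docs linked above. Model, Input, and Output are placeholders; with rust-bert you'd want the batch item type to be plain data (e.g. String or Vec<f32>) rather than tch::Tensor, since the item type must be Send + Sync (which is exactly the bound your error message points at), and build the tensors inside the handler:

use batched_fn::batched_fn;

// Placeholder types: `Model` stands in for whatever wraps the rust-bert
// pipeline, and `Input`/`Output` are plain `Send + Sync` data.
struct Model;
type Input = String;
type Output = String;

impl Model {
    fn load() -> Self {
        // Runs once, on the handler thread (see `context` below).
        Model
    }

    fn predict(&self, batch: Vec<Input>) -> Vec<Output> {
        // Build tensors from the batch here, run the model, and map the
        // results back to plain values.
        batch
    }
}

async fn predict_for_request(input: Input) -> Output {
    let batch_predict = batched_fn! {
        // `handler` receives each accumulated batch plus a reference to
        // every object declared in `context`.
        handler = |batch: Vec<Input>, model: &Model| -> Vec<Output> {
            model.predict(batch)
        };
        // Batching behaviour; see the crate docs for the available options.
        config = {
            max_batch_size: 16,
            max_delay: 50,
        };
        // Initialized exactly once, on the handler thread.
        context = {
            model: Model::load(),
        };
    };
    // Exact error handling may differ slightly between crate versions.
    batch_predict(input).await.expect("handler thread disconnected")
}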

I hope that answers your question! I haven't used actix recently, but if you do get an example working I'd be happy to share it in this repo.

failable commented 2 years ago

@epwalsh

That's a great help! Thank you very much! Following your hints, I spent some time understanding the macro, and I think I've understood 90% of it. 😂

I will try to make a working example, and once I do I will make a PR.
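The call site I have in mind is roughly this (an untested sketch with actix-web 4, reusing the predict_for_request function from your example above):

use actix_web::{post, App, HttpServer, Responder};

#[post("/predict")]
async fn predict(body: String) -> impl Responder {
    // Each request simply awaits the shared batched function; the
    // batching itself happens on the handler thread.
    predict_for_request(body).await
}

#[actix_web::main]
async fn main() -> std::io::Result<()> {
    HttpServer::new(|| App::new().service(predict))
        .bind(("127.0.0.1", 8080))?
        .run()
        .await
}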

epwalsh commented 2 years ago

Great! Looking forward to it 🙂

Yevgnen commented 2 years ago

I'm also pretty new to Rust and async, and I found the tokio tutorial on channels. After reading the source code, I noticed the library uses a similar pattern (channels).

@epwalsh May I ask a question? Why is a Mutex needed in BatchedFn?

epwalsh commented 2 years ago

@Yevgnen, great question! I'm glad you asked because it turns out it's actually not needed since flume::Sender is Sync (https://docs.rs/flume/latest/flume/struct.Sender.html#impl-Sync). https://github.com/epwalsh/batched-fn/pull/22 will remove the Mutex.
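To make that concrete, here is a small standalone sketch (not code from this library, and it uses std::thread::scope, so Rust 1.63+) showing a flume::Sender being used from several threads through a shared reference, with no Mutex involved:

use std::thread;

fn main() {
    let (tx, rx) = flume::unbounded::<usize>();

    // Because `flume::Sender` is `Sync`, a shared reference to it can be
    // moved into each scoped thread; no `Mutex` is required.
    thread::scope(|s| {
        for i in 0..4 {
            let tx_ref = &tx;
            s.spawn(move || {
                tx_ref.send(i).unwrap();
            });
        }
    });

    // Drop the last sender so the receiving iterator terminates.
    drop(tx);

    let mut received: Vec<usize> = rx.iter().collect();
    received.sort();
    assert_eq!(received, vec![0, 1, 2, 3]);
}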