pytorch / multipy

torch::deploy (MultiPy for non-torch uses) is a system that lets you work around Python's GIL by running multiple Python interpreters in a single C++ process.

Question: is there any multi-threading inference example? #297

Closed: KexinFeng closed 1 year ago

KexinFeng commented 1 year ago

Hi, I have gone through the test examples but didn't find any multi-threading inference example. Maybe I missed it. Could you point me to such examples?

Or, is the experiment code for Fig. 5 in this paper shared anywhere? If so, could you point me to that as well?

Thanks

PaliC commented 1 year ago

Sorry for the late reply.

We have two examples of multi-threading. The first is in our test case: https://github.com/pytorch/multipy/blob/19617b95252c97547e17d598c0b2cb03d0f4e936/multipy/runtime/test_deploy.cpp#L163

The second is our benchmarking code (though it's a bit different from what's used in the paper): https://github.com/pytorch/multipy/blob/main/multipy/runtime/example/benchmark.cpp