pytorch / multipy

torch::deploy (multipy for non-torch use cases) is a system that works around the GIL by running multiple embedded Python interpreters in a single C++ process.
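
For context, the documented deployment flow is to build a `torch.package` archive on the Python side and then load it from the embedded interpreters on the C++ side via multipy's `InterpreterManager`. Below is a minimal packaging sketch; the module name `my_model`, the class `Net`, and the archive path are illustrative, not taken from this thread.

```python
import torch
from torch.package import PackageExporter

# Assumes Net is an ordinary nn.Module defined in an importable module
# named my_model (classes defined in __main__ don't package cleanly).
from my_model import Net

model = Net()

with PackageExporter("my_model_pkg.pt") as exporter:
    # Bundle our own code inside the archive; torch itself stays external so
    # the embedded interpreters use the libtorch provided by the C++ process.
    exporter.intern("my_model")
    exporter.extern(["torch", "torch.**"])
    exporter.save_pickle("model", "model.pkl", model)
```

The C++ process then points an `InterpreterManager` at the resulting archive and replicates the unpickled model across its interpreters.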

PyTorch 2.0 compile mode examples #311

Open · tanyokwok opened this issue 1 year ago

tanyokwok commented 1 year ago

Can this repo work with TorchInductor in PyTorch 2.0? Could someone provide examples of deploying with torch::multipy in PyTorch 2.0 compile mode? cc @PaliC

PaliC commented 1 year ago

Sorry for the late reply.

In theory, yes, though we need to make a few tweaks to get it to work, such as letting TorchInductor's compiler link to torch::deploy and making torch.package (if used) play nicely with dynamo.
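
For illustration, here is a hedged sketch of the PT2 side being discussed. It only shows stock `torch.compile` usage with the inductor backend; the model and shapes are made up, and nothing here is confirmed to work under torch::deploy today. The comments mark where the tweaks above would come in.

```python
import torch

class Net(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = torch.nn.Linear(16, 4)

    def forward(self, x):
        return torch.relu(self.fc(x))

model = Net()

# TorchInductor is the default PT2 backend; the first call triggers dynamo
# tracing and inductor codegen, which builds C++/Triton kernels at runtime.
compiled = torch.compile(model, backend="inductor")
out = compiled(torch.randn(2, 16))

# Open questions for torch::deploy, per the comment above:
# * the inductor-generated extension built at runtime would have to link
#   against the libtorch that the multipy C++ process provides, and
# * if the model ships as a torch.package archive, dynamo has to trace the
#   re-imported packaged module, so compiling *after* loading it inside the
#   embedded interpreter (rather than packaging the compiled wrapper) is the
#   more likely workflow.
```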

eval-dev commented 10 months ago

@PaliC Hi, what is the status of PT2 support today?