second-state / WasmEdge-WASINN-examples


Does docker support WASI-NN with PyTorch Backend? #20

Closed warpmatrix closed 1 year ago

warpmatrix commented 1 year ago

I notice that WASI-NN only runs on the host, as described in pytorch-mobilenet-image/README.md, with no instructions for Docker. I wonder whether Docker supports WASI-NN with the PyTorch backend. Do I need to install the "WASI-NN plugin with PyTorch backend" via a RUN command in a Dockerfile, or will there be a WasmEdge DockerSlim image for WASI-NN with the PyTorch backend?

juntao commented 1 year ago

Docker Desktop and runwasi will support running WASI-NN plugins in the next version (hopefully in a month).

Once it is available, you will need to package the Wasm file, PyTorch C library, and the model file in a SCRATCH container. The total image size is around 5MB + model size.
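
For illustration, a minimal sketch of such a scratch image might look like the following. The file names and layout here are placeholders, not a confirmed convention for what runwasi will expect once the feature ships:

```dockerfile
# Hypothetical packaging sketch; paths and layout are assumptions.
FROM scratch

# The compiled Wasm module (e.g. built from the pytorch-mobilenet-image example).
COPY wasmedge-wasinn-example-mobilenet-image.wasm /app.wasm

# The PyTorch C library (libtorch) shared objects the WASI-NN plugin links against.
COPY libtorch/lib /libtorch/lib

# The TorchScript model file consumed by the example.
COPY mobilenet.pt /mobilenet.pt

ENTRYPOINT [ "/app.wasm" ]
```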

warpmatrix commented 1 year ago

That's great! I'm looking forward to the next version and to your team's future work. It's a promising project, and thank you for your and your team's contributions.

By the way, you mentioned that the image needs to package the PyTorch C library, and I still have a question about that. The installation documentation says that the plugin depends on libtorch-cxx11-abi-shared-with-deps-1.8.2%2Bcpu. Can I use a GPU-enabled libtorch (which is what I care more about), or another libtorch version downloaded from pytorch.org?
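
For reference, the host-side setup that documentation describes looks roughly like this (a sketch based on the pytorch-mobilenet-image README; a GPU build would come from a different download URL):

```bash
# Download the libtorch build the WASI-NN PyTorch plugin is documented against (LTS 1.8.2, CPU).
export PYTORCH_VERSION="1.8.2"
curl -s -L -O https://download.pytorch.org/libtorch/lts/1.8/cpu/libtorch-cxx11-abi-shared-with-deps-${PYTORCH_VERSION}%2Bcpu.zip
unzip -q "libtorch-cxx11-abi-shared-with-deps-${PYTORCH_VERSION}%2Bcpu.zip"

# Make the shared libraries visible to the plugin at runtime.
export LD_LIBRARY_PATH=$(pwd)/libtorch/lib:${LD_LIBRARY_PATH}
export Torch_DIR=$(pwd)/libtorch
```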

juntao commented 1 year ago

It is in fact possible to have the PyTorch library installed on the host instead of in the image. But we feel that people would probably rather NOT install it on the host.

The way it works is that runwasi unpacks the files in the image, sets up the path, and then hands the Wasm file to WasmEdge. So, I believe the GPU version of PyTorch should work just fine.
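
Once the integration lands, running such an image could look roughly like this (a sketch only; the runtime and platform flags follow the Docker + Wasm beta convention, and the image name is made up for illustration):

```bash
# Hypothetical invocation; not confirmed syntax for the WASI-NN case.
docker run --rm \
  --runtime=io.containerd.wasmedge.v1 \
  --platform=wasi/wasm \
  myrepo/wasinn-mobilenet:latest
```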

warpmatrix commented 1 year ago

That sounds good to me! Thanks again for your patient replies and your team's great work. I'll close this issue.