warpmatrix closed this issue 1 year ago
Docker Desktop and runwasi will support running WASI NN plugins in the next version (hopefully in a month).
Once it is available, you will need to package the Wasm file, the PyTorch C library, and the model file in a `scratch` container. The total image size is around 5 MB plus the model size.
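As a rough illustration of what such an image might look like, here is a minimal sketch. The file names, directory layout, and entrypoint below are assumptions for illustration only, not the official packaging:

```dockerfile
# Hypothetical scratch-based image for a WASI-NN workload.
# File names and paths are illustrative assumptions, not an official layout.
FROM scratch

# The compiled Wasm application
COPY app.wasm /app.wasm

# The libtorch shared libraries the WASI-NN PyTorch backend links against
COPY libtorch/lib/ /lib/

# The TorchScript model file loaded at runtime
COPY mobilenet.pt /mobilenet.pt

ENTRYPOINT ["/app.wasm"]
```

Because the base is `scratch`, the image contains nothing beyond these files, which is where the "5 MB plus model size" figure comes from.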
That's great! I'm looking forward to the next version and to your team's future work. It's a promising project, and thanks to you and your team for your contributions.
Btw, since you mentioned that the image needs to package the PyTorch C library, I still have a question about that. The installation documentation says that the plugin depends on libtorch-cxx11-abi-shared-with-deps-1.8.2+cpu. Can I use a GPU-enabled libtorch (which is what I care about more), or another libtorch version downloadable from pytorch.org?
It is in fact possible to have the PyTorch library installed on the host rather than in the image, but we feel that people would probably rather NOT install it on the host.
The way it works is that runwasi unpacks the files in the image, sets up the library path, and then hands the Wasm file to WasmEdge. So I believe the GPU version of PyTorch should work just fine.
That sounds good to me! Thanks again for your patient reply and your team's great work. I'll close this issue.
I notice that wasi-nn only runs on the host, as described in pytorch-mobilenet-image/README.md, with no instructions for Docker. I wonder whether Docker supports wasi-nn with the PyTorch backend. Do I need to install the "WASI-NN plugin with PyTorch backend" via a RUN command in a Dockerfile, or will there be a WasmEdge DockerSlim image for wasi-nn with the PyTorch backend?