pytorch / serve

Serve, optimize and scale PyTorch models in production
https://pytorch.org/serve/
Apache License 2.0

fsspec for batch inference example #1891

Closed · msaroufim closed this issue 2 years ago

msaroufim commented 2 years ago

🚀 The feature

fsspec gives the ability to work with remote file systems like S3, GCS, or Azure Blob Storage through a single unified API
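A minimal sketch of that unified API. It uses fsspec's in-memory backend so it runs without cloud credentials; swapping the `memory://` scheme for `s3://`, `gs://`, or `abfs://` (with the matching backend installed) leaves the code otherwise unchanged:

```python
import fsspec

# One API across backends: the URL scheme selects the filesystem.
# The in-memory backend stands in for a real remote store here.
with fsspec.open("memory://demo/data.txt", "wb") as f:
    f.write(b"hello")

with fsspec.open("memory://demo/data.txt", "rb") as f:
    print(f.read())  # b'hello'
```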

See this recent example of how broadly useful it is: https://github.com/pytorch/data/pull/812

We can create an example that stores a large dataset in remote cloud storage, streams that data to a local client, and then sends requests to TorchServe for larger-scale batch inference with a large batch size
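One possible shape for such an example, sketched with fsspec's in-memory backend standing in for the remote store. The `batched_payloads` helper and the `/dataset` paths are hypothetical; a real version would use e.g. `fsspec.filesystem("s3")` and POST each batch to TorchServe's `/predictions/{model_name}` endpoint with an HTTP client:

```python
import fsspec

def batched_payloads(fs, prefix, batch_size):
    """Stream objects under prefix and yield their raw bytes in batches."""
    batch = []
    for path in sorted(fs.ls(prefix, detail=False)):
        with fs.open(path, "rb") as f:
            batch.append(f.read())
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # trailing partial batch
        yield batch

# Stand-in "remote" data; in practice this would already live in S3/GCS/Azure.
fs = fsspec.filesystem("memory")
for i in range(5):
    with fs.open(f"/dataset/sample_{i}.bin", "wb") as f:
        f.write(bytes([i]))

batches = list(batched_payloads(fs, "/dataset", batch_size=2))
print([len(b) for b in batches])  # [2, 2, 1]
```

Each yielded batch would then become one request body (or one request per item, letting TorchServe's own batching aggregate them server-side, which is the usual pattern).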

Motivation, pitch

See above

Alternatives

No response

Additional context

No response

kirkpa commented 2 years ago

Hi @msaroufim, can you add me as a collaborator/contributor to the pytorch/serve project? I would like to submit a PR for this issue. Thanks

msaroufim commented 2 years ago

Hi @kirkpa, there is no need for that. You can create your own fork of pytorch/serve and then open a PR against this repo