pytorch / serve

Serve, optimize and scale PyTorch models in production
https://pytorch.org/serve/
Apache License 2.0

Simultaneously serving models with conflicting dependencies #1011

Open urmeya opened 3 years ago

urmeya commented 3 years ago

I am new to using TorchServe for model deployment. I have tried using TorchServe to serve multiple models, e.g. model1.mar and model2.mar, simultaneously as follows:

torchserve --start --ncs --model-store model_store/ --models model1.mar model2.mar

I was able to serve multiple models simultaneously this way, provided the dependencies for all of them are satisfied in the environment from which TorchServe is running.
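
As a quick check (assuming the default management port 8081 and inference port 8080; sample_input.json is just a placeholder for whatever payload each handler expects):

# List the registered models via the management API
curl http://localhost:8081/models

# Send a test request to each model via the inference API
curl -X POST http://localhost:8080/predictions/model1 -T sample_input.json
curl -X POST http://localhost:8080/predictions/model2 -T sample_input.json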

I wish to know whether there is a way to simultaneously serve models with conflicting dependencies, e.g. if model1 and model2 require different versions of the same library, such as `transformers`. Ideally, can we package the environment itself in the MAR file, or is there another way to solve this problem?
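
For concreteness, a hypothetical illustration of the conflict (the version pins below are made up):

# model1/requirements.txt pins: transformers==3.5.1
# model2/requirements.txt pins: transformers==4.5.0
# pip cannot satisfy both pins in a single shared environment
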

dhanainme commented 3 years ago

TorchServe does not provide environment isolation (venv or similar) right now. This is a valid feature request that could be implemented in a future version.

csaroff commented 3 years ago

@dhanainme I saw that the model archiver allows you to include a requirements.txt file. How does that differ from the environment isolation that you're describing?
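
For reference, the flow I was thinking of looks roughly like this (a sketch; the model file, handler, and requirements file names are placeholders, and the bundled requirements are only installed when install_py_dep_per_model is enabled in config.properties):

# Bundle per-model Python dependencies into the MAR
torch-model-archiver --model-name model1 --version 1.0 \
    --serialized-file model1.pt --handler my_handler.py \
    --requirements-file model1_requirements.txt --export-path model_store/

# Tell TorchServe to install each model's bundled requirements
echo "install_py_dep_per_model=true" > config.properties

torchserve --start --ncs --model-store model_store/ \
    --models model1.mar --ts-config config.properties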

urmeya commented 1 year ago

As I understand it, this feature is still in the backlog. Is there any workaround we can use to allow different model pipelines to use different environments?

Does the requirements-file method mentioned by @csaroff install the packages into the same environment as TorchServe, or can it create separate environments for different MAR files?