microsoft / DeepSpeed-MII

MII makes low-latency and high-throughput inference possible, powered by DeepSpeed.
Apache License 2.0

Docker in VertexAI #167

Open TahaBinhuraib opened 1 year ago

TahaBinhuraib commented 1 year ago

I'm hosting a Flask application that calls mii.query inside a Docker container. Every time I send a request, the server inside the container seems to restart. Any ideas on probable causes?
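For context, a minimal sketch of the kind of setup being described, assuming the legacy `mii.deploy` / `mii_query_handle` API; the deployment name, route, and parameters are hypothetical, not taken from the issue:

```python
# app.py -- minimal sketch of the setup described above (names are hypothetical).
# Assumes a text-generation model was already deployed at container startup with
# mii.deploy(..., deployment_name="text_gen_deployment").
import mii
from flask import Flask, jsonify, request

app = Flask(__name__)
generator = mii.mii_query_handle("text_gen_deployment")

@app.route("/generate", methods=["POST"])
def generate():
    prompt = request.json["prompt"]
    # The query is issued through the handle returned by mii.mii_query_handle.
    result = generator.query({"query": [prompt]}, max_new_tokens=64)
    return jsonify({"response": str(result)})

if __name__ == "__main__":
    # Flask development server -- the configuration that works per the comments below.
    app.run(host="0.0.0.0", port=5000)
```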

TahaBinhuraib commented 1 year ago

OK, the problem is that running DeepSpeed under a Gunicorn server does not work, but running it on the Flask development server does.
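Not a confirmed diagnosis, but with CUDA- and GRPC-backed libraries a frequent source of trouble under Gunicorn is state created at import time and then shared across forked worker processes (for example when `preload_app` is enabled). A hedged sketch of one possible mitigation under that assumption: create the MII query handle lazily inside the worker that serves requests, rather than at module import. The deployment name is hypothetical.

```python
# app.py -- hedged sketch: defer mii_query_handle creation until the first
# request handled by the worker process (deployment name is hypothetical).
import mii
from flask import Flask, jsonify, request

app = Flask(__name__)
_generator = None

def get_generator():
    # Created lazily so the GRPC channel is opened in the Gunicorn worker
    # that actually serves requests, not in a parent process.
    global _generator
    if _generator is None:
        _generator = mii.mii_query_handle("text_gen_deployment")
    return _generator

@app.route("/generate", methods=["POST"])
def generate():
    prompt = request.json["prompt"]
    result = get_generator().query({"query": [prompt]}, max_new_tokens=64)
    return jsonify({"response": str(result)})
```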

TahaBinhuraib commented 1 year ago

I still don't know why, and it's not an ideal setup, but it will do for now.
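For anyone who wants to keep Gunicorn in the picture, a conservative configuration to try (a guess, not verified against this issue): a single sync worker, no preloading, and a longer timeout so the worker is not killed mid-generation. These are standard Gunicorn options in a `gunicorn.conf.py`:

```python
# gunicorn.conf.py -- conservative settings to try (unverified for this issue).
workers = 1          # one worker, so only one process talks to the MII backend
worker_class = "sync"
preload_app = False  # import the app inside the worker, not in the master process
timeout = 300        # allow long generation calls before the worker is killed
```

It would then be launched with something like `gunicorn -c gunicorn.conf.py app:app`, where `app:app` refers to the hypothetical module and Flask instance from the sketches above.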