microsoft / DeepSpeed-MII

MII makes low-latency and high-throughput inference possible, powered by DeepSpeed.
Apache License 2.0

Configure server log level #495

Open sedletsky-f5 opened 5 months ago

sedletsky-f5 commented 5 months ago

Please add one or more params to control logging from the RESTful API server, specifically in the mii.serve() function. As a reference, see the log config params in vLLM: https://docs.vllm.ai/en/latest/serving/openai_compatible_server.html#command-line-arguments-for-the-server

This is especially important at startup, when a huge amount of log output is emitted (from several processes).
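Until such a parameter exists, one possible workaround is to raise the log level via Python's standard logging module before calling mii.serve(). This is only a sketch under the assumption that MII and DeepSpeed emit logs through standard Python loggers; the logger names below are guesses, and the setting may not propagate to separately spawned server processes:

```python
import logging

def quiet_server_logs(level: int = logging.WARNING) -> None:
    """Raise the log level to suppress INFO-level startup noise.

    Assumption: the libraries log via Python's logging module, and
    "DeepSpeed" / "mii" are the relevant logger names (hypothetical).
    Child processes spawned by the server may not inherit this setting.
    """
    logging.getLogger().setLevel(level)  # root logger
    for name in ("DeepSpeed", "mii"):    # hypothetical logger names
        logging.getLogger(name).setLevel(level)

# Call before starting the server, e.g.:
# quiet_server_logs(logging.ERROR)
# mii.serve(...)
```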

loadams commented 4 months ago

Hi @sedletsky-f5 - we will try to prioritize this. In the meantime, if you are able or willing, would you consider making a PR? We can review it and get it merged.

sedletsky-f5 commented 4 months ago

Thanks @loadams, I will wait for your implementation (I haven't dug into the code enough to make this change myself).