codelion / optillm

Optimizing inference proxy for LLMs
Apache License 2.0
1.64k stars · 130 forks

Add cli log flag #48

Closed · jovanwongzixi closed this 1 month ago

jovanwongzixi commented 1 month ago

Added a `--log` flag to the CLI arguments to set the log level, addressing issue #44.
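A minimal sketch of how such a flag is commonly wired up with `argparse` and the standard `logging` module; the exact flag name, choices, and defaults in optillm's CLI are assumptions here, not the actual implementation:

```python
import argparse
import logging

# Hypothetical CLI parser; optillm's real argument names may differ.
parser = argparse.ArgumentParser(description="inference proxy")
parser.add_argument(
    "--log",
    default="info",
    choices=["debug", "info", "warning", "error", "critical"],
    help="set the logging level",
)

# Simulate an invocation like: python proxy.py --log debug
args = parser.parse_args(["--log", "debug"])

# Map the lowercase flag value to the corresponding logging constant
# (logging.DEBUG, logging.INFO, ...) and configure the root logger.
level = getattr(logging, args.log.upper())
logging.basicConfig(level=level)

print(logging.getLevelName(level))  # → DEBUG
```

Using `choices` lets `argparse` reject invalid levels with a usage error instead of failing later inside `getattr`.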

codelion commented 1 month ago

Thanks.