Closed: spew closed this issue 8 months ago
A hack around this until it's fixed: provide a random token, even if your backend ignores it. The message won't pop up.
I agree, definitely needs a fix!
This will be addressed by https://github.com/huggingface/llm-ls/pull/58
When supplying a custom backend for the `model` parameter along with no access token (such as a self-hosted TGI), the following message shows: "You are currently unauthenticated and will get rate limited. To reduce rate limiting, login with your API Token and consider subscribing to PRO: https://huggingface.co/pricing#pro"
This should not show when a custom backend is being used.
Relevant code: https://github.com/huggingface/llm-ls/blob/16606e5371a1b0582543f03fd8a2666f7bf2580a/crates/llm-ls/src/main.rs#L616
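One way the check around that line could be gated is to emit the warning only when the request actually targets the hosted Hugging Face Inference API. A minimal sketch (the function name, signature, and endpoint constant here are illustrative assumptions, not llm-ls's actual code):

```rust
// Illustrative sketch: only warn about rate limiting when the request
// goes to the hosted Hugging Face Inference API and no token is set.
// These names are hypothetical, not taken from llm-ls.

const HF_INFERENCE_API: &str = "https://api-inference.huggingface.co";

fn should_warn_unauthenticated(backend_url: &str, api_token: Option<&str>) -> bool {
    // A custom backend (e.g. a self-hosted TGI at localhost) never triggers
    // the warning, regardless of whether a token is configured.
    api_token.is_none() && backend_url.starts_with(HF_INFERENCE_API)
}

fn main() {
    // Self-hosted TGI without a token: no warning.
    assert!(!should_warn_unauthenticated("http://localhost:8080", None));
    // Hosted Inference API without a token: warning applies.
    assert!(should_warn_unauthenticated(HF_INFERENCE_API, None));
    // Hosted Inference API with a token: no warning.
    assert!(!should_warn_unauthenticated(HF_INFERENCE_API, Some("hf_xxx")));
    println!("ok");
}
```

With a guard like this, the random-token workaround above would become unnecessary for custom backends.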