huggingface / llm-ls

LSP server leveraging LLMs for code completion (and more?)
Apache License 2.0
602 stars · 49 forks

Unauthenticated warning should not show when a custom backend is used #68

Closed spew closed 8 months ago

spew commented 8 months ago

When supplying a custom backend for the model parameter with no access token (e.g. a self-hosted TGI instance), the following message is shown:

You are currently unauthenticated and will get rate limited. To reduce rate limiting, login with your API Token and consider subscribing to PRO: https://huggingface.co/pricing#pro

This should not show when a custom backend is being used.

Relevant code: https://github.com/huggingface/llm-ls/blob/16606e5371a1b0582543f03fd8a2666f7bf2580a/crates/llm-ls/src/main.rs#L616
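One way the fix could look (a hedged sketch, not the actual llm-ls code; the `should_warn_unauthenticated` helper and the hard-coded endpoint constant are assumptions made for illustration): only warn when there is no token *and* requests actually go to the hosted Hugging Face Inference API, where rate limiting applies.

```rust
// Hypothetical guard logic; the real check lives in crates/llm-ls/src/main.rs.
const HF_INFERENCE_API: &str = "https://api-inference.huggingface.co";

/// Warn only when unauthenticated requests target the hosted HF API.
/// A custom backend (e.g. self-hosted TGI) never triggers the warning.
fn should_warn_unauthenticated(api_token: Option<&str>, backend_url: &str) -> bool {
    api_token.is_none() && backend_url.starts_with(HF_INFERENCE_API)
}

fn main() {
    // Self-hosted TGI without a token: no warning.
    assert!(!should_warn_unauthenticated(None, "http://localhost:8080"));
    // Hosted Inference API without a token: warning is appropriate.
    assert!(should_warn_unauthenticated(None, HF_INFERENCE_API));
    // Any backend with a token: no warning.
    assert!(!should_warn_unauthenticated(Some("hf_xxx"), HF_INFERENCE_API));
}
```

Gating on the configured URL rather than only on the token would also make the dummy-token workaround below unnecessary.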

DanielAdari commented 8 months ago

A workaround until this is fixed: provide a dummy token, even if your backend ignores it. The message won't pop up.

I agree, definitely needs a fix!

HennerM commented 8 months ago

This will be addressed by https://github.com/huggingface/llm-ls/pull/58