huggingface / llm-vscode

LLM powered development for VSCode
Apache License 2.0

Just getting TabNine - nothing for StableCode #54

Closed · nickknyc closed 1 year ago

nickknyc commented 1 year ago

Windows 11. Installing from the .vsix, I get TabNine, and the command palette does not recognize Hugging Face.

yurqua commented 1 year ago

Looks like the same problem:

I'm trying to activate StableCode-Completion-Alpha-3B-4K in huggingface-vscode. I have generated an HF token with write permission and added it in the VS Code extension settings. The bigcode/starcoder model works fine.

On the HF model page I pressed the 🚀 Deploy button, selected the Inference API item, copied the https://api-inference.huggingface.co/models/stabilityai/stablecode-completion-alpha-3b-4k API endpoint URL, and pasted it into the Hugging Face Code: Model ID Or Endpoint settings field.

Unlike with the default bigcode/starcoder model, the extension doesn't seem to work. I tried restarting the extension host, and I also tried stabilityai/stablecode-completion-alpha-3b-4k as the value, with no luck.

When I type in my editor, I see the 🔄 status bar icon next to Hugging Face Code spin for a split second, and then nothing happens.

The OUTPUT panel shows something like:

INPUT to API: (with parameters {"max_new_tokens":60,"temperature":null,"do_sample":false,"top_p":0.95,"stop":["<|endoftext|>"]}) 
def main
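One way to narrow this down is to replay the extension's logged request against the Inference API directly, outside VS Code. Below is a minimal sketch under some assumptions: it reuses the endpoint URL from the Deploy dialog above and the parameters shown in the OUTPUT panel, with a placeholder token (hf_xxx) that you would replace with your own.

```python
import requests

# Endpoint URL copied from the model's 🚀 Deploy → Inference API dialog (same as above).
API_URL = "https://api-inference.huggingface.co/models/stabilityai/stablecode-completion-alpha-3b-4k"

# Placeholder: substitute your own HF access token.
HF_TOKEN = "hf_xxx"

# Payload mirroring the request the extension logs in the OUTPUT panel.
payload = {
    "inputs": "def main",
    "parameters": {
        "max_new_tokens": 60,
        "temperature": None,
        "do_sample": False,
        "top_p": 0.95,
        "stop": ["<|endoftext|>"],
    },
}

resp = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {HF_TOKEN}"},
    json=payload,
)
print(resp.status_code)
print(resp.text)
```

If this returns generated text, the endpoint and token are fine and the problem is likely on the extension side; an error response (for example a model still loading, or a parameter the endpoint rejects) would point at the endpoint or the payload instead.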

What am I doing wrong?

heagandev commented 1 year ago

@yurqua I'm also trying to use this with StableCode. Let us know if you make any progress

xznhj8129 commented 1 year ago

Same here. I installed it from the .vsix since I'm using Codium, and I get the TabNine extension, which I never asked for.

McPatate commented 1 year ago

We're moving away from the tabnine fork in the upcoming release.

Feel free to open another issue if you're still facing difficulties.