huggingface / llm-vscode

LLM powered development for VSCode
Apache License 2.0
1.23k stars 133 forks

Empty response with custom api #105

Open thanhnew2001 opened 1 year ago

thanhnew2001 commented 1 year ago

I built and installed a custom API at https://7b80-103-253-89-37.ngrok-free.app/api/generate

Everything works fine.

(screenshot attached)

But when I change the endpoint and config template in llm-vscode to point to this endpoint, every response comes back empty.

There is no clue to help me debug this. Please help.
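One way to narrow this down is to check whether the custom endpoint actually returns a response in the shape the extension expects. The sketch below is a hypothetical helper, not part of llm-vscode: it builds a TGI-style payload (field names `inputs` / `parameters` are assumed from the text-generation-inference `/generate` API) and validates that a response carries a `generated_text` field, which is one way an endpoint can look fine in curl yet show nothing in the editor.

```python
def build_payload(prompt, max_new_tokens=60):
    """Minimal TGI-style request body (field names assumed; adjust
    for your server's actual API)."""
    return {
        "inputs": prompt,
        "parameters": {"max_new_tokens": max_new_tokens},
    }

def extract_generated_text(response_json):
    """Pull the completion out of a TGI-style response.

    TGI can return either {"generated_text": "..."} or a list like
    [{"generated_text": "..."}]. Anything else yields None, meaning
    the client has nothing to display.
    """
    if isinstance(response_json, dict):
        response_json = [response_json]
    if (isinstance(response_json, list) and response_json
            and isinstance(response_json[0], dict)):
        return response_json[0].get("generated_text")
    return None
```

Posting `build_payload(...)` to the endpoint with curl and running the raw JSON through `extract_generated_text` quickly shows whether the problem is the server's response shape or the editor side.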

github-actions[bot] commented 11 months ago

This issue is stale because it has been open for 30 days with no activity.

r5r3 commented 10 months ago

I have a similar problem. I'm running Phind/Phind-CodeLlama-34B-v2 locally with TGI. When I trigger a suggestion, I can see that a request is sent to TGI and that the response contains the expected [{'generated_text': '..... However, in almost all cases, the suggestion is not shown.

How is the decision made whether a suggestion is shown or not? Is it possible to influence that decision?
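A common reason a non-empty `generated_text` still produces no visible suggestion is client-side post-processing: a prompt echo is stripped, stop tokens are cut off, and whatever remains may be empty. The sketch below is purely illustrative of that kind of filter under assumed stop tokens; it is not llm-vscode's actual code.

```python
def postprocess_suggestion(generated_text, prompt="", stop_tokens=("</s>", "<EOT>")):
    """Illustrative post-processing filter (assumed behavior, not the
    extension's real implementation):
    1. strip the echoed prompt, if the model repeats it;
    2. truncate at the first stop token;
    3. drop (return None for) suggestions that end up whitespace-only.
    """
    text = generated_text
    if prompt and text.startswith(prompt):
        text = text[len(prompt):]
    for tok in stop_tokens:
        idx = text.find(tok)
        if idx != -1:
            text = text[:idx]
    return text if text.strip() else None
```

If a response like `[{'generated_text': '</s>'}]` or a pure prompt echo comes back, a filter of this shape returns None and the editor shows nothing, which would match the behavior described above.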

github-actions[bot] commented 9 months ago

This issue is stale because it has been open for 30 days with no activity.