jeffdapaz / VisualChatGPTStudio

Add ChatGPT functionality directly in Visual Studio
https://marketplace.visualstudio.com/items?itemName=jefferson-pires.VisualChatGPTStudio
MIT License

Add local LLM support #59

Closed 0Hate closed 6 months ago

0Hate commented 7 months ago

Hi, I am using this open-source tool to run a local LLM, and it has a built-in API that uses the same format as the OpenAI API.

Will this work out of the box with the extension?

https://github.com/oobabooga/text-generation-webui/wiki/12-%E2%80%90-OpenAI-API

If not, I would like to implement this in the extension. Is that OK? (A quick check of the local endpoint is sketched below.)
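
For reference, a minimal way to verify that the local OpenAI-compatible endpoint is reachable. This is only a sketch in Python; the host, port, and model name are assumptions, since text-generation-webui's defaults depend on how it was launched:

```python
import requests

# Assumed local endpoint; text-generation-webui exposes an
# OpenAI-compatible API (the host/port below are just an example).
BASE_ADDRESS = "http://127.0.0.1:5000/v1"

payload = {
    "model": "local-model",  # placeholder; whether the name matters depends on the server
    "messages": [{"role": "user", "content": "Say hello"}],
}

# POST to the OpenAI-style chat completions route.
response = requests.post(f"{BASE_ADDRESS}/chat/completions", json=payload, timeout=60)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```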

jeffdapaz commented 7 months ago

It might work out of the box.

Give it a try. In the extension options you'll find the Base Address parameter.

If you set it to your local LLM address, it may work.

Let me know the result later.
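
To illustrate what the Base Address setting amounts to: an OpenAI-style client can be pointed at a local server just by overriding the base URL, so if the extension only swaps the base address, the same requests should reach the local LLM. A sketch using the openai Python package, where the address and model name are assumptions:

```python
from openai import OpenAI

# Same idea as the extension's Base Address option: keep the OpenAI
# request format, but send it to the local server instead.
client = OpenAI(
    base_url="http://127.0.0.1:5000/v1",  # assumed local address
    api_key="not-needed",                 # local servers usually ignore the key, but the client requires one
)

completion = client.chat.completions.create(
    model="local-model",  # placeholder model name
    messages=[{"role": "user", "content": "Write a C# hello world"}],
)
print(completion.choices[0].message.content)
```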

jeffdapaz commented 6 months ago

Hi @0Hate, I will close this issue, but if necessary you can reopen.

redthista commented 5 months ago

Looks like it's calling /v1/v1 when requesting rather than just /v1:

[2024-02-03 13:15:37.244] [ERROR] Unexpected endpoint or method. (POST /v1/v1/chat/completions). Returning 200 anyway

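
If the cause is the configured Base Address already containing /v1 while the extension appends /v1/chat/completions itself, a workaround is to configure the Base Address without the trailing /v1, or to normalize it before building the URL. A rough sketch of that normalization; I have not checked the extension's internals, so this is only a guess at the cause:

```python
def build_chat_url(base_address: str) -> str:
    """Join the configured base address with the chat completions route,
    avoiding a duplicated /v1 segment (e.g. /v1/v1/chat/completions)."""
    base = base_address.rstrip("/")
    # If the user already included the /v1 suffix, drop it so the
    # appended path does not repeat it.
    if base.endswith("/v1"):
        base = base[: -len("/v1")]
    return f"{base}/v1/chat/completions"

# Both configurations end up at the same, correct route.
assert build_chat_url("http://127.0.0.1:5000") == "http://127.0.0.1:5000/v1/chat/completions"
assert build_chat_url("http://127.0.0.1:5000/v1/") == "http://127.0.0.1:5000/v1/chat/completions"
```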