drewbaumann / AskGPT

GNU General Public License v3.0
69 stars 20 forks

Feature request #7

Closed SmokeShine closed 3 months ago

SmokeShine commented 1 year ago

Is it possible to use llama.cpp's OpenAI compatibility? https://github.com/ggerganov/llama.cpp/discussions/795

What changes would be required?

Topping1 commented 6 months ago

You can use my fork, which targets the KoboldCpp API.

My fork: https://github.com/Topping1/AskKobold/tree/main
KoboldCpp: https://github.com/LostRuins/koboldcpp

drewbaumann commented 6 months ago

> Is it possible to use llama.cpp's OpenAI compatibility? ggerganov/llama.cpp#795
>
> What changes would be required?

Using @Topping1's fork may be easiest, but forking and pointing at a server that mimics OpenAI's spec isn't too challenging either. I could imagine a preference option where one could pick their model or supply their own OpenAI-compatible API endpoint for local LLMs.

eMUQI commented 4 months ago

> Using @Topping1's fork may be easiest, but forking and pointing at a server that mimics OpenAI's spec isn't too challenging either. I could imagine a preference option where one could pick their model or supply their own OpenAI-compatible API endpoint for local LLMs.

I believe that making both the API base and the model configurable options would meet customization needs and make this project more flexible.

drewbaumann commented 4 months ago

@eMUQI Yes, this would be great, and not too challenging. Usually all that's needed is changing the base URL to point at an OpenAI API clone.
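To illustrate the point about the base URL, here is a minimal sketch (not taken from the plugin's source; the local server address and helper name are assumptions) showing that an OpenAI-compatible backend only changes the URL the request is sent to, while the request shape stays the same:

```python
import json

DEFAULT_API_BASE = "https://api.openai.com/v1"

def build_chat_request(messages, api_key, api_base=DEFAULT_API_BASE, model="gpt-3.5-turbo"):
    """Return (url, headers, body) for an OpenAI-style chat completion call."""
    url = api_base.rstrip("/") + "/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"model": model, "messages": messages})
    return url, headers, body

# Pointing at a local llama.cpp server only swaps the base URL:
url, headers, body = build_chat_request(
    [{"role": "user", "content": "Hello"}],
    api_key="sk-...",
    api_base="http://localhost:8080/v1",  # assumed local server address
)
# url is now "http://localhost:8080/v1/chat/completions"
```

Everything else (the bearer-token header, the JSON payload with `model` and `messages`) is unchanged, which is why OpenAI-compatible servers like llama.cpp's or KoboldCpp's work with so little modification.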

If I added this as an option, would you rather see it in the config file (like the API key) or as some form of on-device configuration?

eMUQI commented 4 months ago

> @eMUQI Yes, this would be great, and not too challenging. Usually all that's needed is changing the base URL to point at an OpenAI API clone.
>
> If I added this as an option, would you rather see it in the config file (like the API key) or as some form of on-device configuration?

@drewbaumann Thank you for considering adding this feature. Personally, being able to configure it through a configuration file is sufficient for me. I just successfully used a third-party OpenAI proxy service with simple code modifications, and everything works well. This is a fantastic plugin; thank you! However, for those who aren't very familiar with coding, having the option to configure both the API base and the model alongside the API key would be great.
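The configuration shape being suggested could look something like the sketch below (the field names `api_base` and `model` are assumptions, not the plugin's actual config keys): the new options sit alongside the API key and fall back to the official defaults when omitted, so existing configs keep working.

```python
# Assumed defaults matching the plugin's current behavior (OpenAI's API).
DEFAULTS = {
    "api_base": "https://api.openai.com/v1",
    "model": "gpt-3.5-turbo",
}

def load_settings(user_config):
    """Merge a user-supplied config dict over the defaults.

    Keys the user leaves out (or sets to None) keep their default value,
    so a config containing only an api_key still works unchanged.
    """
    settings = dict(DEFAULTS)
    settings.update({k: v for k, v in user_config.items() if v is not None})
    return settings

# A user pointing at a local OpenAI-compatible server overrides only api_base:
settings = load_settings({"api_key": "sk-...", "api_base": "http://localhost:8080/v1"})
```

With this merge-over-defaults approach, users who aren't comfortable editing code only ever touch the config file, which is the behavior being requested.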

drewbaumann commented 3 months ago

@eMUQI I just created this PR which should address your needs: #12