I have very limited experience with Go, but I really want to use yai with local LLMs, so I tried to implement support for a custom API host and seemingly succeeded.
This is possible because Ollama (an open-source program for running LLMs locally) recently added OpenAI API compatibility, so any program written for OpenAI's API that allows editing the API host can now run against local LLMs.
Because of my limited experience with Go and my lack of familiarity with this project's codebase, I am not sure I implemented everything correctly. I strongly advise reviewing this PR carefully and adding anything I may have forgotten before merging.
I don't feel my experience is sufficient to implement custom system prompts without actually learning the language. But people who run local LLMs would appreciate it if you implemented custom system prompts, because some system prompts that work well for ChatGPT are confusing for smaller 7B models. With custom system prompts, people could fine-tune their prompts for the model they use.