henomis / lingoose

🪿 LinGoose is a Go framework for building awesome AI/LLM applications.
https://lingoose.io
MIT License

Possible to use with a local LLM? #152

Closed chrisbward closed 7 months ago

chrisbward commented 9 months ago

As titled, I would prefer to use a local LLM instead of OpenAI's GPT. I arrived here via this tutorial/introduction to RAG:

https://simonevellei.com/blog/posts/leveraging-go-and-redis-for-efficient-retrieval-augmented-generation/

henomis commented 9 months ago

I suggest using LocalAI with a custom LLM, then connecting LinGoose to LocalAI through a custom OpenAI client (`WithClient()`) pointed at the local endpoint.

airtonix commented 9 months ago

@henomis Can you comment on why localAI and not Ollama?

nvm, I see — it means you don't have to do any work.

A shame, because Ollama presents much nicer development ergonomics, specifically its similarity to Docker.

henomis commented 9 months ago

@airtonix I will check this project and the possibility of integrating it into Lingoose. Thanks for the suggestion.

henomis commented 7 months ago

Ollama will be supported in the next lingoose version.