vanna-ai / vanna

🤖 Chat with your SQL database 📊. Accurate Text-to-SQL Generation via LLMs using RAG 🔄.
https://vanna.ai/docs/
MIT License
9.39k stars 691 forks

How to access a local LLM without Ollama? #425

Closed: mobguang closed this issue 2 months ago

mobguang commented 2 months ago

Describe the bug
I cannot find any documentation explaining how to access a local LLM without Ollama.

To Reproduce
I have downloaded CodeLlama-7b-Instruct-hf on my Mac, but the Vanna.ai documentation has no example describing how to use this kind of LLM.

Expected behavior
Is there any way to access this kind of LLM? Thanks in advance.
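A minimal sketch of the pattern Vanna's built-in LLM integrations follow: subclass the base class and implement the chat-message and prompt-submission methods, then route `submit_prompt` to your own local inference call. In real use the class would subclass `VannaBase` (from `vanna.base`) and be combined with a vector store such as `ChromaDB_VectorStore`; here the base class is omitted so the sketch stands alone, and `generate_locally` is a hypothetical placeholder for whatever call your local CodeLlama setup exposes (a `transformers` pipeline, llama-cpp-python, etc.), not a Vanna API.

```python
def generate_locally(text: str) -> str:
    # Hypothetical placeholder for a local CodeLlama inference call,
    # e.g. a transformers pipeline or llama-cpp-python invocation.
    # Here it just returns a fixed string so the sketch is runnable.
    return "SELECT 1;"


class LocalCodeLlama:  # in real use: class LocalCodeLlama(VannaBase)
    # The three message helpers build chat messages in the role/content
    # shape most chat LLM APIs expect.
    def system_message(self, message: str) -> dict:
        return {"role": "system", "content": message}

    def user_message(self, message: str) -> dict:
        return {"role": "user", "content": message}

    def assistant_message(self, message: str) -> dict:
        return {"role": "assistant", "content": message}

    def submit_prompt(self, prompt, **kwargs) -> str:
        # Flatten the list of chat messages into one text prompt and
        # hand it to the local model.
        text = "\n".join(f"{m['role']}: {m['content']}" for m in prompt)
        return generate_locally(text)
```

With the real base classes installed, the usual composition is something like `class MyVanna(ChromaDB_VectorStore, LocalCodeLlama)` with both `__init__` methods called, after which `MyVanna` can be trained and queried like any other Vanna instance.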
