squaredtechnologies / thread

AI-powered Jupyter Notebook — use local AI to generate and edit code cells, automatically fix errors, and chat with your data
https://www.thread.dev
GNU Affero General Public License v3.0

Allow using a local LLM #1

Closed Fastidious closed 2 weeks ago

Fastidious commented 3 weeks ago

Will it be possible to add the ability to use a locally running solution, like Ollama?

samisahn commented 2 weeks ago

Yes! You will be able to use it with a locally running Ollama model very soon - it's in the works and should be pushed shortly. We'll update here when it's ready!

samisahn commented 2 weeks ago

Ollama support is officially in beta with this commit, and is available in v0.1.9! Here's a demo video:

https://github.com/squaredtechnologies/thread/assets/26368245/e324ce26-195a-4231-832d-98a59f5bb7cf

The new model selector lets you choose which model to use - either OpenAI or a local Ollama model. By pointing Thread at the Ollama URL and entering the name of the running model, you can use AI fully locally!
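For anyone trying this out: the values below are illustrative, not Thread-specific settings. The URL is Ollama's documented default serve address, and `llama3` is a placeholder for whatever model you have pulled locally. A minimal sketch of what "pointing to the Ollama URL" amounts to:

```python
import json

# Hypothetical example values for the model selector — the URL is Ollama's
# default local endpoint, and the model name must match a model you have
# already pulled (e.g. via `ollama pull llama3`).
ollama_config = {
    "base_url": "http://localhost:11434",  # Ollama's default serve address
    "model": "llama3",                     # placeholder model name
}

# Under the hood, a chat request to Ollama's API is roughly this payload
# POSTed to {base_url}/api/chat (sketch only — nothing is sent here):
payload = {
    "model": ollama_config["model"],
    "messages": [{"role": "user", "content": "Summarize this DataFrame"}],
    "stream": False,
}

# The full endpoint a client would call:
endpoint = ollama_config["base_url"] + "/api/chat"
print(endpoint)
print(json.dumps(payload, indent=2))
```

You can confirm the model name to enter by running `ollama list` (or hitting `GET /api/tags`) and copying a name from the output.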

Side note: local models work well for chat, but I had some trouble getting them to respect function calls for the code generation / edit tasks. We'll be hard at work getting those up and running. 🤝

Closing this issue out since local LLMs are working - please feel free to give it a try and raise any additional issues!