squaredtechnologies / thread

AI-powered Jupyter Notebook — use local AI to generate and edit code cells, automatically fix errors, and chat with your data
https://www.thread.dev
GNU Affero General Public License v3.0

AI-powered Jupyter Notebook

Thread is a Jupyter alternative that integrates an AI copilot into your Jupyter Notebook editing experience.

Best of all, Thread runs locally and can be used for free with Ollama or your own API key. To get started, install it:

pip install thread-dev

Then, to launch Thread, run:

thread

Key features

1. Familiar Jupyter Notebook editing experience


2. Natural language code edits


3. Generate cells to answer natural language questions


4. Ask questions in a context aware chat sidebar


5. Automatically explain or fix errors


Demo

https://github.com/squaredtechnologies/thread/assets/18422723/b0ef0d7d-bae5-48ad-b293-217b940385fb


Feature Roadmap

These are some of the features we hope to launch in the next few months. If you have a suggestion or would like to see a feature added, please don't hesitate to open an issue or reach out to us via email or Discord.

Thread.dev Cloud

Eventually we hope to integrate Thread into a cloud platform that supports collaboration features as well as hosting notebooks as web applications. We are looking for enterprise design partners to customize the solution for; if this sounds interesting to you, please reach out to us via email or join our waitlist.

Development instructions

To run the repo in development mode, you need two terminal sessions: one runs the Jupyter Server, the other runs the Next.js front end.

To begin, run:

yarn install

Then in one terminal, run:

sh ./run_dev.sh

And in another, run:

yarn dev

Navigate to localhost:3000/thread and you should see your local version of Thread running.
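If juggling two terminals gets tedious, the steps above can be driven from a single wrapper script. This is a hypothetical convenience script (not part of the repo), assuming a POSIX shell and that you run it from the repo root:

```shell
# Hypothetical helper (not in the repo): start both dev processes
# together and shut them down with a single Ctrl-C.
cat > run_all_dev.sh <<'EOF'
#!/usr/bin/env bash
set -euo pipefail
sh ./run_dev.sh &        # Jupyter Server
JUPYTER_PID=$!
yarn dev &               # Next.js front end
NEXT_PID=$!
# Forward Ctrl-C / termination to both children, then wait on them.
trap 'kill "$JUPYTER_PID" "$NEXT_PID" 2>/dev/null' INT TERM
wait
EOF
chmod +x run_all_dev.sh
```

Running `./run_all_dev.sh` then streams both processes' logs into one terminal.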

If you would like to work on the AI features, navigate to the proxy folder and run:

yarn install

Then:

yarn dev --port 5001

Using Thread with Ollama

You can use Ollama for a fully offline AI experience. To begin, install and run Thread using the commands above.

Once Thread is running, select the Settings icon in the bottom left.


Next, select Model Settings.



Navigate to Ollama and enter your model details.


Use Ctrl / Cmd + K and try running a query to see how it looks!
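If the query fails, it is worth confirming that the local Ollama server is actually reachable before debugging Thread itself. A minimal health-check sketch (Ollama listens on port 11434 by default; the model name `llama3` is only an example, substitute whatever you pulled):

```shell
# Hypothetical health-check script for a local Ollama install.
cat > check_ollama.sh <<'EOF'
#!/usr/bin/env bash
set -euo pipefail
# 1. Is the server listening? /api/tags lists locally pulled models.
curl -sf http://localhost:11434/api/tags > /dev/null \
  || { echo "Ollama is not running; start it with 'ollama serve'"; exit 1; }
echo "Ollama server is up"
# 2. One-shot generation against an example model (swap in your own).
curl -sf http://localhost:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Reply with one word.", "stream": false}'
EOF
chmod +x check_ollama.sh
```

If both steps succeed, Thread's Ollama settings only need the same host, port, and model name.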

Why we built Thread

We initially got the idea while building Vizly, a tool that lets non-technical users ask questions of their data. While Vizly is powerful at performing data transformations, as engineers we often felt that natural language didn't give us enough freedom to edit the generated code or to explore the data further ourselves. That is what inspired us to start Thread.