squaredtechnologies / thread

AI-powered Jupyter Notebook — use local AI to generate and edit code cells, automatically fix errors, and chat with your data
https://www.thread.dev
GNU Affero General Public License v3.0

Toggling proxy use #28

Closed BardiaKh closed 1 day ago

BardiaKh commented 2 days ago

I tried to unify the model settings and server settings modals. I also added a switch for toggling proxy use on and off.

image

As always, any feedback is appreciated.

vercel[bot] commented 2 days ago

The latest updates on your projects. Learn more about Vercel for Git ↗︎

| Name | Status | Preview | Comments | Updated (UTC) |
| --- | --- | --- | --- | --- |
| thread | ✅ Ready (Inspect) | Visit Preview | 💬 Add feedback | Jun 29, 2024 9:07pm |

alishobeiri commented 2 days ago

Thank you so much for the change! One small piece of feedback:

image

In the Ollama menu, there should not be a proxy option, since everything already runs locally. That means that if ollama is the model type, isLocal should be true by default.

image

I would also rename the Use proxy toggle to Run API calls locally. Ideally we move away from the proxy-server terminology, since some folks found it confusing.
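
For illustration, a minimal sketch of that defaulting behavior; the ModelSettings shape and modelType field here are assumptions made up for this example, not Thread's actual types (only isLocal comes from this thread):

```typescript
// Hypothetical types for illustration only; not Thread's actual code.
type ModelType = "ollama" | "openai";

interface ModelSettings {
  modelType: ModelType;
  isLocal: boolean;
}

// Ollama always runs locally, so force isLocal to true for that model
// type; the UI can then hide the "Run API calls locally" toggle entirely.
function withLocalDefault(settings: ModelSettings): ModelSettings {
  if (settings.modelType === "ollama") {
    return { ...settings, isLocal: true };
  }
  return settings;
}
```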

BardiaKh commented 1 day ago

@alishobeiri I think this is now ready for your review.

alishobeiri commented 1 day ago

Thanks a lot for the mention, will take a look now

alishobeiri commented 1 day ago

Just added some small changes as a PR to your PR 😄 https://github.com/BardiaKh/thread/pull/2

alishobeiri commented 1 day ago

image

image
BardiaKh commented 1 day ago

Great! Merged!

Will start adding Anthropic support.

alishobeiri commented 1 day ago

Thanks a lot! Really appreciate it. I will make some changes to the model selection modal to allow users to select from more than just Ollama and OpenAI.