Closed — BardiaKh closed this 1 day ago
The latest updates on your projects. Learn more about Vercel for Git ↗︎

| Name | Status | Preview | Comments | Updated (UTC) |
| --- | --- | --- | --- | --- |
| thread | ✅ Ready (Inspect) | Visit Preview | 💬 Add feedback | Jun 29, 2024 9:07pm |
Thank you so much for the change! One small piece of feedback: in the Ollama menu there should not be a proxy option, since everything already runs locally; if `ollama` is the model type, `isLocal` should default to true. I would also rename the "Use proxy" toggle to "Run API calls locally". Ideally we can move away from the "proxy server" terminology, as some folks found it confusing.
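The suggested default could look something like this; a minimal sketch, assuming hypothetical names (`ModelSettings`, `withDefaults`) that are not from the actual PR:

```typescript
// Hypothetical types -- the real settings shape in the repo may differ.
type ModelType = "ollama" | "openai" | "anthropic";

interface ModelSettings {
  modelType: ModelType;
  isLocal: boolean;
}

// Force local execution whenever Ollama is the selected model type,
// so no proxy option is ever relevant for it.
function withDefaults(partial: {
  modelType: ModelType;
  isLocal?: boolean;
}): ModelSettings {
  const isLocal =
    partial.modelType === "ollama" ? true : partial.isLocal ?? false;
  return { modelType: partial.modelType, isLocal };
}
```

With this, the "Run API calls locally" toggle would only need to be rendered for non-Ollama model types.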
@alishobeiri I think this is now ready for your review.
Thanks a lot for the mention, will take a look now
Just added some small changes as PR to your PR 😄 https://github.com/BardiaKh/thread/pull/2
Great! Merged!
Will start adding the Anthropic support.
Thanks a lot! Really appreciate it. I will make some changes to the model selection modal so users can select from more than Ollama and OpenAI.
I tried to unify the model settings and server settings modals. Also added a switch for toggling proxy use on and off.
As always, any feedback is appreciated.