Closed: mateokladaric closed this issue 1 week ago
Hi! I could definitely look into this. Is this https://github.com/ollama/ollama the project you're talking about?
Yes, that's it.
Most of these projects run their own local server on localhost. LM Studio does too, and lets you pick the port (not sure whether you can with Ollama), but they both expose local API endpoints.
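For reference, here's a rough sketch of what talking to Ollama's local endpoint looks like (assuming the default port 11434; the model name `llama3` is just a placeholder for whatever you have pulled locally):

```python
import requests

# Ollama serves a local HTTP API on port 11434 by default;
# LM Studio's local server defaults to port 1234.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",  # placeholder: any locally pulled model
        "prompt": "Explain what a stack canary is.",
        "stream": False,    # ask for a single JSON response
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```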
Perhaps if I point the OpenAI proxy setting at that localhost endpoint, it could function without any alterations.
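Something like this is what I mean, if it helps: a minimal sketch assuming Ollama's OpenAI-compatible `/v1` endpoint, with the model name again a placeholder:

```python
from openai import OpenAI

# Point the existing OpenAI client at the local server instead of
# api.openai.com. Ollama ignores the API key, but the client library
# requires a non-empty string.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="unused")

completion = client.chat.completions.create(
    model="llama3",  # placeholder: whatever model is pulled locally
    messages=[{"role": "user", "content": "What does this function do?"}],
)
print(completion.choices[0].message.content)
```

If that works, the existing OpenAI code path might only need the base URL made configurable.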
There we are! Please let me know if this works for you!
Thank you!!!
Wanted to know if it's any hassle to add compatibility with Ollama so the model can be run locally.
Haven't tried, just wondering if anyone has.