JusticeRage / Gepetto

IDA plugin which uses language models to speed up reverse-engineering
GNU General Public License v3.0
2.81k stars 260 forks

Ollama for Local Models #36

Closed mateokladaric closed 1 week ago

mateokladaric commented 3 months ago

Wanted to know if it's any hassle to add compatibility with Ollama so the model can be run locally.

Haven't tried, just wondering if anyone has.

JusticeRage commented 3 months ago

Hi! I could definitely look into this. Is this the project you're talking about?

mateokladaric commented 3 months ago

Yes, that's it.


mateokladaric commented 3 months ago

Most of these projects run their own server on localhost; LM Studio does as well, and lets you pick the port (not sure whether Ollama does), but both expose local API endpoints.

Perhaps if I point the OpenAI proxy setting at the localhost endpoint, it could work without any alterations.
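For anyone trying this approach: Ollama serves an OpenAI-compatible API on localhost (default port 11434, under the `/v1` path), so an OpenAI-style client can often be redirected there just by swapping the base URL. A minimal sketch of building such a request with only the standard library; the helper function and the model name `llama3` are illustrative, not part of Gepetto, and you'd use whatever model you have pulled locally:

```python
import json

# Ollama's OpenAI-compatible endpoint on the default local port.
# Assumption: a stock Ollama install; adjust host/port if you changed them.
OLLAMA_BASE_URL = "http://localhost:11434/v1"

def build_chat_request(prompt, model="llama3"):
    """Build an OpenAI-style chat completion request targeting a local Ollama server.

    Returns the URL and the JSON payload; sending it (e.g. with urllib or an
    OpenAI client pointed at OLLAMA_BASE_URL) is left out so this runs offline.
    """
    return {
        "url": f"{OLLAMA_BASE_URL}/chat/completions",
        "payload": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }),
    }

req = build_chat_request("Explain what this decompiled function does.")
print(req["url"])
```

With the official OpenAI Python client, the equivalent is passing `base_url="http://localhost:11434/v1"` (and any placeholder API key) when constructing the client, which is why a configurable proxy/base-URL setting can make local models work without code changes.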

JusticeRage commented 1 week ago

There we are! Please let me know if this works for you!

mateokladaric commented 1 week ago

Thank you!!!