UdaraJay / Pile

Desktop app for digital journaling.
https://udara.io/pile
MIT License

Local LLM support via Ollama #53

snibo13 opened this issue 6 months ago (status: Open)

snibo13 commented 6 months ago

I set up a system using Llama 2 via Ollama and modified the settings page so you can switch between Ollama and OpenAI and pick the model you want to use when running locally. I'm running on a Windows PC, so I haven't been able to validate it on macOS, but I imagine everything should work.

One note: it isn't set up to work with LLMindex as it stands. Ollama does expose an API endpoint for generating embeddings, so it might be possible to use a different vector database system and compute the embeddings explicitly.
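For the embeddings route mentioned above, a minimal sketch of calling Ollama's `/api/embeddings` endpoint directly might look like this (the helper names and the default port are assumptions; the request/response shape follows Ollama's REST API docs):

```javascript
// Sketch: compute an embedding via a local Ollama server's REST API.
// Assumes Ollama is listening on its default port (11434) and the
// model has already been pulled. Helper names are hypothetical.

const OLLAMA_URL = 'http://localhost:11434';

// Build the JSON body Ollama's embeddings endpoint expects.
function buildEmbeddingRequest(model, prompt) {
  return { model, prompt };
}

// POST the prompt and return the embedding vector (array of floats).
async function getEmbedding(model, prompt) {
  const res = await fetch(`${OLLAMA_URL}/api/embeddings`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(buildEmbeddingRequest(model, prompt)),
  });
  if (!res.ok) throw new Error(`Ollama returned HTTP ${res.status}`);
  const { embedding } = await res.json();
  return embedding;
}
```

The resulting vectors could then be stored in whatever vector store replaces the current index.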



UdaraJay commented 6 months ago

this looks great @snibo13! I'll give it a test and let you know if it's ready.

are you running ollama via wsl on windows?

snibo13 commented 6 months ago

Yeah, running with WSL.

UdaraJay commented 6 months ago

Made a couple of changes; going to use this for a bit to see if there's anything else to address, but it's looking good!

https://github.com/UdaraJay/Pile/assets/1122227/36f63e40-30fe-4ef4-8adf-5028fb10661d

leodknuth commented 6 months ago

Cool! How do I set this up? Could you offer a tutorial for this? Thanks.

0xJeu commented 6 months ago

Any traction on this?

0xJeu commented 5 months ago

Ollama has finally released libraries for calling their API from within apps; I hope this is useful. By the way, something about dmg-license is causing errors.

The error message is "Cannot find module 'dmg-license'".

https://github.com/ollama/ollama-js
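Using that library, the chat call could be wired up roughly like this (a sketch only: the function names are hypothetical, and the `ollama.chat()` usage follows the ollama-js README; it assumes `npm install ollama` and a local Ollama server):

```javascript
// Sketch: generate a reply to a journal entry with the official
// ollama-js client. Helper names are hypothetical.

// Turn an entry's text into the payload ollama.chat() expects.
function buildChatRequest(model, entryText) {
  return {
    model,
    messages: [{ role: 'user', content: entryText }],
  };
}

// Requires `npm install ollama` and a running Ollama server.
async function generateReply(model, entryText) {
  // Loaded lazily so the app still starts if the package is absent.
  const { default: ollama } = await import('ollama');
  const response = await ollama.chat(buildChatRequest(model, entryText));
  return response.message.content;
}
```

This would let the app swap between the OpenAI client and ollama-js behind the same settings toggle discussed earlier in the thread.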

0xdhrv commented 1 week ago

A gentle nudge:

I know everyone might be busy with personal matters, but could you please check if there has been any progress on this?