mediar-ai / screenpipe


Restore ollama Windows and Linux support #529

Closed · Neptune650 closed this 1 month ago

Neptune650 commented 1 month ago

/claim #507

This correctly copies the ollama binary on Linux and adds the libs as resources.

vercel[bot] commented 1 month ago

The latest updates on your projects.

| Name | Status | Updated (UTC) |
| --- | --- | --- |
| screenpipe | ✅ Ready | Oct 18, 2024 9:52am |

louis030195 commented 1 month ago

@Neptune650 nice work!

maybe @varshith257 you can test this?

my guess to test it:

```bash
gh pr checkout 529   # install the github cli if not already

# in the root folder
cargo build --release
cd screenpipe-app-tauri
bun i
bun scripts/pre_build.js
bun tauri build   # or dev

# launch the app, click "start embedded ai", and it should show a joke in the toast
```

basically it should be able to start the llm just as on mac:

https://github.com/user-attachments/assets/67cbce0a-d2e7-415b-b21f-11be9abf4b43

and query it ...

and it uses NVIDIA gpu on linux and windows (not cpu)
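
one way to verify the gpu path, assuming the nvidia driver utilities are installed, is to watch nvidia-smi while a query runs:

```bash
# the ollama process should show up with non-zero gpu memory
# and utilization while the model is answering a query
nvidia-smi
```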

Neptune650 commented 1 month ago

> @Neptune650 nice work! maybe @varshith257 you can test this? [...]

This sounds good to me, @varshith257 please tell me the results of the tests when you're done!

varshith257 commented 1 month ago

The changes are working pretty well in the CI environment, but locally I'm running into issues with `bun scripts/pre_build.js`. Solving them.

louis030195 commented 1 month ago

> The changes are working pretty well in the CI environment, but locally I'm running into issues with `bun scripts/pre_build.js`. Solving them.

any news?

also wondering if there are any issues with OLLAMA_ORIGINS=*

basically ollama's CORS policy blocks requests from the tauri webview, so this variable needs to be set at runtime when spawning ollama serve/run
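
a minimal sketch of the idea from a shell (the app would set the same variable on the process it spawns):

```bash
# allow any origin so the tauri webview can reach ollama's HTTP API;
# OLLAMA_ORIGINS is ollama's CORS allow-list
OLLAMA_ORIGINS="*" ollama serve
```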

louis030195 commented 1 month ago

@Neptune650 can you confirm this properly makes the embedded ollama work on linux and windows with CUDA support, without pre-installing it? then i'll merge, and i want to build some new AI features / things on top of it

also out of curiosity, i wonder if it makes whisper work with cuda (e.g. whether our audio / stt code properly finds the CUDA libs and loads the model on the nvidia gpu if available); otherwise it's fine to handle that in another PR/issue

Neptune650 commented 1 month ago

> @Neptune650 can you confirm this properly makes the embedded ollama work on linux and windows with CUDA support, without pre-installing it? [...]

I can confirm it did work in my case - but I can't tell you anything about CUDA because I don't have an Nvidia GPU.

Neptune650 commented 1 month ago

@louis030195 So it's probably good to merge.

louis030195 commented 1 month ago

/approve

algora-pbc[bot] commented 1 month ago

@louis030195: The claim has been successfully added to reward-all. You can visit your dashboard to complete the payment.

louis030195 commented 1 month ago

@Neptune650 fyi this fails: https://github.com/mediar-ai/screenpipe/actions/runs/11450567158/job/31858256452

tauri cannot find the files:

```
failed to bundle project: glob pattern lib\* path not found or didn't match any files.
    Error [tauri_cli_node] failed to bundle project: glob pattern lib\* path not found or didn't match any files.
```

(ignore the ubuntu error, it's just out of storage)

louis030195 commented 1 month ago

also the cublas files etc. should be in the resources, not in the ollama folder directly

maybe lib\\ollama
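
for reference, a minimal sketch of how that could look under the bundle section of tauri.conf.json (illustrative paths, not this repo's actual config; glob patterns generally want forward slashes even on windows):

```json
{
  "tauri": {
    "bundle": {
      "externalBin": ["ollama"],
      "resources": ["lib/ollama/*"]
    }
  }
}
```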

Neptune650 commented 1 month ago

> also the cublas files etc. should be in the resources, not in the ollama folder directly
>
> maybe lib\\ollama

That's pretty odd, it worked locally for me... maybe something to do with NTFS changing glob recursion behavior? Sending a fix.