andrewnguonly / Lumos

A RAG LLM co-pilot for browsing the web, powered by local LLMs
MIT License

Simplify Installation #184

Closed stian-fs closed 1 month ago

stian-fs commented 1 month ago

Awesome work!

I found the installation (setting up Ollama, building the extension) easy to follow. I'm curious whether there's still room to improve the process so that it's friendlier to non-technical users.

One idea would be to use WebLLM instead of Ollama, but I find the WebLLM experience much worse.
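For context, a rough sketch of what the current manual flow involves, assuming a standard Node-based extension build (the model name and output directory below are illustrative, not taken from the repo's docs):

```shell
# 1. Install and start Ollama, then pull a local model (model name is an example).
ollama pull llama2

# 2. Build the extension from source.
npm install
npm run build

# 3. Load the unpacked build output (e.g. dist/) via chrome://extensions
#    with Developer mode enabled.
```

Each of these steps is a place where a non-technical user could get stuck, which is what this issue is asking to simplify.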

stian-fs commented 1 month ago

https://medium.com/@andrewnguonly/local-llm-in-the-browser-powered-by-ollama-236817f335da

The setup is explained well in the post above.