andrewnguonly / Lumos

A RAG LLM co-pilot for browsing the web, powered by local LLMs
MIT License

Lumos with WSL2 #126

Closed: Adaipix closed 6 months ago

Adaipix commented 6 months ago

Hi Andrew, I installed Ollama with the Phi-2 model under WSL2 on Windows (Ubuntu 22.04.3 LTS) and Lumos 1.0.10 under Chrome (latest). I start the server with the command `OLLAMA_ORIGINS=chrome-extension://* ollama serve`, then type something in Lumos, but nothing happens. (Four screenshots attached, 2024-03-10.)
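
For reference, the exact command, with the wildcard origin quoted so the shell passes it through literally:

```sh
# Allow requests from any Chrome extension origin. The * is matched by
# Ollama, not expanded by the shell; quoting it is just a precaution.
OLLAMA_ORIGINS="chrome-extension://*" ollama serve
```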

andrewnguonly commented 6 months ago

This might be the same issue: https://github.com/ollama/ollama/issues/2736

@Adaipix, can you follow this instruction to confirm? https://github.com/ollama/ollama/issues/2736#issuecomment-1962848427
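
In case that link goes stale: assuming the linked instruction is the usual CORS check, it amounts to sending a request with a `chrome-extension://` Origin header and looking at the status code (the extension ID below is a placeholder; use the ID shown on chrome://extensions):

```sh
# 200 with a JSON model list = origin allowed;
# 403 usually means OLLAMA_ORIGINS doesn't cover the extension origin.
curl -i http://localhost:11434/api/tags \
  -H "Origin: chrome-extension://aaaabbbbccccddddeeeeffffgggghhhh"
```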

Adaipix commented 6 months ago

> This might be the same issue: ollama/ollama#2736
>
> @Adaipix, can you follow this instruction to confirm? ollama/ollama#2736 (comment)

@andrewnguonly It works in PowerShell, but nothing happens in Lumos. (Screenshot attached, 2024-03-10.)

Adaipix commented 6 months ago

@andrewnguonly I found the solution: editing the "background.js" file and replacing the occurrences of "llama2" with "phi", my model. But it's extremely slow compared to the Ollama CLI!
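
Roughly, the workaround amounts to the following (illustrative only, not a recommended fix; the path to background.js depends on where the unpacked extension lives):

```sh
# Swap the hard-coded default model name in the built extension bundle.
sed -i 's/llama2/phi/g' background.js
```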

andrewnguonly commented 6 months ago

> @andrewnguonly I found the solution: editing the "background.js" file and replacing the occurrences of "llama2" with "phi", my model. But it's extremely slow compared to the Ollama CLI!

Ok. There might be a bug with retrieving the Lumos options; I need to investigate more. I assume you don't have the llama2 model downloaded at all, so I can start from there. If there's any other information you can share about your setup, that would be great!

I'll investigate the "slowness" with phi separately.

andrewnguonly commented 6 months ago

I found the bug and it's fixed in this PR: https://github.com/andrewnguonly/Lumos/pull/127

I'll merge this change ASAP and cut a new release (1.0.11).

Adaipix commented 6 months ago

@andrewnguonly Thank you !

PhPv commented 5 months ago

Hi. I ran into the same problem when I tried to walk through the quickstart from LlamaIndex. It crashes with an error saying the API endpoint URL http://localhost:11434/api/chat was not found (404), even though http://localhost:11434/ shows "Ollama is running". What am I doing wrong?

andrewnguonly commented 5 months ago

@PhPv, which version of Lumos do you have installed and which Ollama model do you have downloaded?

The 404 error occurs if a model is not downloaded.
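
A quick way to confirm, assuming a default local install (the model name here is just an example):

```sh
ollama list        # show which models are downloaded locally
ollama pull phi    # pull a model that is missing
curl http://localhost:11434/api/chat \
  -d '{"model": "phi", "messages": [{"role": "user", "content": "hi"}]}'
# A 404 here (instead of a streamed reply) usually means the requested model
# isn't downloaded, or the Ollama build predates the /api/chat endpoint.
```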