Closed Adaipix closed 6 months ago
This might be the same issue: https://github.com/ollama/ollama/issues/2736
@Adaipix, can you follow this instruction to confirm? https://github.com/ollama/ollama/issues/2736#issuecomment-1962848427
@andrewnguonly It works on PowerShell but nothing happens in Lumos.
@andrewnguonly I found a workaround by editing the "background.js" file and replacing the occurrences of "llama2" with "phi", my model. But it's extremely slow compared to the ollama CLI!
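For anyone else hitting this before a fix lands, the workaround above can be scripted. A minimal sketch — the first line creates a stand-in file whose content is hypothetical; point `sed` at the real `background.js` in your built extension instead (note: this uses GNU `sed -i`; BSD/macOS sed needs `-i ''`):

```shell
# Stand-in for the extension's background.js (hypothetical content, for demo only).
printf 'const MODEL = "llama2";\n' > background.js
# Swap every hardcoded "llama2" for "phi" in place.
sed -i 's/llama2/phi/g' background.js
cat background.js   # → const MODEL = "phi";
```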
Ok. There might be a bug with retrieving the Lumos options. I need to investigate more. I assume you don't have the llama2 model downloaded at all. I can start from here. If there's any other information you can share about your setup, that would be great!

I'll investigate the "slowness" with phi separately.
I found the bug and it's fixed in this PR: https://github.com/andrewnguonly/Lumos/pull/127

I'll merge this change ASAP and cut a new release (1.0.11).
@andrewnguonly Thank you!
Hi. I ran into the same problem when I tried to walk through the quickstart from LlamaIndex. It crashes with an error that the API endpoint URL http://localhost:11434/api/chat was not found (404). (At http://localhost:11434/ I see "Ollama is running".) What am I doing wrong?
@PhPv, which version of Lumos do you have installed and which Ollama model do you have downloaded?
The 404 error occurs if a model is not downloaded.
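A quick way to confirm is to list the models the server actually has. A sketch — the curl line assumes a live server on the default port, and the rest of the block checks a stand-in response (having only phi downloaded is just an example):

```shell
# Against a live server you would capture the real list:
#   resp=$(curl -s http://localhost:11434/api/tags)
# Stand-in response for illustration (only phi is downloaded):
resp='{"models":[{"name":"phi:latest"}]}'
case "$resp" in
  *'"name":"llama2'*) echo "llama2 is downloaded" ;;
  *) echo "llama2 missing - run: ollama pull llama2" ;;
esac
```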
Hi Andrew, I installed Ollama with the Phi-2 LLM model under WSL2 on Windows (Ubuntu 22.04.3 LTS). I installed Lumos 1.0.10 in Chrome (latest). I start the server with the command `OLLAMA_ORIGINS=chrome-extension://* ollama serve`. I type something in Lumos but nothing happens.
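One thing worth double-checking in a setup like that: the wildcard in `OLLAMA_ORIGINS` should be quoted so the shell doesn't try to glob-expand it. A sketch (the serve line is commented out because it blocks the terminal; assumes Ollama is installed inside the WSL2 distro):

```shell
# Quote the wildcard so the shell passes it through literally.
export OLLAMA_ORIGINS="chrome-extension://*"
echo "$OLLAMA_ORIGINS"   # → chrome-extension://*
# ollama serve           # then start the server in this same shell
```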