andrewnguonly / Lumos

A RAG LLM co-pilot for browsing the web, powered by local LLMs
MIT License

TypeError: Cannot read properties of undefined (reading 'includes') #141

Closed xding2 closed 5 months ago

xding2 commented 5 months ago

I would like some help with the following errors. :) Basically, how do I resolve them?

Received prompt (RAG enabled): hi

background.js:2 Received url: https://www.google.com/
background.js:2 Received chunk size: 500 and chunk overlap: 0
background.js:2 Uncaught (in promise) TypeError: Cannot read properties of undefined (reading 'includes')
    at background.js:2:289844
    at Array.some (<anonymous>)
    at background.js:2:289833
    at background.js:2:289858
    at Generator.next (<anonymous>)
    at s (background.js:2:286137)

background.js:2 Received prompt (RAG disabled): Can you help me?

background.js:2 Uncaught (in promise) Error: Ollama call failed with status code 404: model 'llama2' not found, try pulling it first
    at H (chrome-extension://deibpocbmpmjpmbdhlnamknhjimpjdid/js/background.js:2:180983)
    at async G (chrome-extension://deibpocbmpmjpmbdhlnamknhjimpjdid/js/background.js:2:181533)
    at async X._streamResponseChunks (chrome-extension://deibpocbmpmjpmbdhlnamknhjimpjdid/js/background.js:2:203132)
    at async X._call (chrome-extension://deibpocbmpmjpmbdhlnamknhjimpjdid/js/background.js:2:203559)
    at async Promise.all (index 0)
    at async X._generate (chrome-extension://deibpocbmpmjpmbdhlnamknhjimpjdid/js/background.js:2:197136)
    at async X._generateUncached (chrome-extension://deibpocbmpmjpmbdhlnamknhjimpjdid/js/background.js:2:194595)
    at async X.invoke (chrome-extension://deibpocbmpmjpmbdhlnamknhjimpjdid/js/background.js:2:192759)
xding2 commented 5 months ago

Can anyone help me? I followed the steps and ran: docker run -e OLLAMA_ORIGINS="chrome-extension://*" -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

However, when I right-click the extension icon and select "Options" to open the extension's Options page, I am unable to select any model, as shown in the following screenshot:

[Screenshot: Screen Shot 2024-03-19 at 1 22 10 PM]
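A quick way to check whether the container from the docker run command above is actually up and reachable is to query the Ollama HTTP API on the mapped port (curl is assumed to be installed; this check is not part of the original report):

```shell
# Query the Ollama server's model list on the port published by docker run
curl http://localhost:11434/api/tags
# Responds with JSON listing locally pulled models; an empty "models"
# array means no model has been pulled yet, which would also explain
# the empty model dropdown on the Options page.
```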

andrewnguonly commented 5 months ago

@xding2, which version of Lumos do you have installed? If you haven't installed the latest (1.0.11), please try that version.

background.js:2 Uncaught (in promise) Error: Ollama call failed with status code 404: model 'llama2' not found, try pulling it first

This error indicates that the llama2 model has not been pulled, or that no models are available at all (none have been pulled). You can verify that you've pulled at least one model by running the following command in the Docker container: ollama list.

To pull a model, run ollama pull <model_name>. Example: ollama pull llama2. Documentation: https://github.com/ollama/ollama?tab=readme-ov-file#pull-a-model
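Since Ollama is running inside Docker in this setup, those commands need to be executed inside the container. A sketch, using the container name ollama from the docker run command earlier in the thread:

```shell
# List the models already pulled inside the running "ollama" container
docker exec -it ollama ollama list

# Pull llama2 so the model referenced by the extension is available
docker exec -it ollama ollama pull llama2
```

After the pull completes, ollama list should show llama2, and the 404 from the extension should go away.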

I will update the README to be more explicit about this step.