FlowiseAI / Flowise

Drag & drop UI to build your customized LLM flow
https://flowiseai.com
Apache License 2.0

[BUG] Ollama Embeddings 500 #1940

Closed 54sww closed 1 day ago

54sww commented 4 months ago

Describe the bug

When using Ollama Embeddings in Flowise, the Ollama server returns a 500 error, but when I call Ollama directly through the API, it works fine.

Calling Ollama directly via the API succeeds:


root@sww-TR2:/opt/milvus# curl http://localhost:11434/api/embeddings -d '{
  "model": "nomic-embed-text",
  "prompt": "The sky is blue because of Rayleigh scattering"
}'
{"embedding":[0.25959235429763794,0.4932633340358734,-2.4133753776550293,-0.4772516191005707,0.6883206367492676,1.4333412647247314,0.06110568344593048,0.4096655547618866,0.1999664306640625,-1.0167186260223389,0.7896386981010437,0.856162965297699,0.7329905033111572,0.8819379210472107,0.33371976017951965,0.3713150918483734,0.1477658897638321,-0.43940073251724243,-0.16702617704868317,-0.2656150460243225,-1.8616790771484375,-0.3986990749835968,0.015317609533667564,-0.7913350462913513,0.9742981791496277,1.326677918434143,-0.5134976506233215,-0.04512714594602585,-0.3449520766735077,-0.3203319311141968,1.0677080154418945,-0.8633328676223755,-0.5432940125465393,-0.8206079602241516,0.6944653987884521,-0.7663739323616028,0.5191969275474548,-0.04510617256164551,-0

Flowise log:


docker-flowise-1  | 2024-03-13 03:37:19 [ERROR]: Request to Ollama server failed: 500 Internal Server Error
docker-flowise-1  | Error: Request to Ollama server failed: 500 Internal Server Error
docker-flowise-1  |     at OllamaEmbeddings._request (/usr/local/lib/node_modules/flowise/node_modules/@langchain/community/dist/embeddings/ollama.cjs:101:19)
docker-flowise-1  |     at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
docker-flowise-1  |     at async RetryOperation._fn (/usr/local/lib/node_modules/flowise/node_modules/p-retry/index.js:50:12)
docker-flowise-1  | 2024-03-13 03:37:19 [ERROR]: [server]: Error: Error: Request to Ollama server failed: 500 Internal Server Error
docker-flowise-1  | Error: Error: Request to Ollama server failed: 500 Internal Server Error
docker-flowise-1  |     at buildFlow (/usr/local/lib/node_modules/flowise/dist/utils/index.js:333:19)
docker-flowise-1  |     at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
docker-flowise-1  |     at async App.buildChatflow (/usr/local/lib/node_modules/flowise/dist/index.js:1708:40)
docker-flowise-1  |     at async /usr/local/lib/node_modules/flowise/dist/index.js:1152:13
docker-flowise-1  | 2024-03-13 04:17:35 [INFO]: ⬆️ POST /api/v1/internal-prediction/bfe14fb8-7341-4448-bed9-06fb5aab0024
docker-flowise-1  | 2024-03-13 04:18:58 [ERROR]: Request to Ollama server failed: 500 Internal Server Error
docker-flowise-1  | Error: Request to Ollama server failed: 500 Internal Server Error
docker-flowise-1  |     at OllamaEmbeddings._request (/usr/local/lib/node_modules/flowise/node_modules/@langchain/community/dist/embeddings/ollama.cjs:101:19)
docker-flowise-1  |     at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
docker-flowise-1  |     at async RetryOperation._fn (/usr/local/lib/node_modules/flowise/node_modules/p-retry/index.js:50:12)
docker-flowise-1  | 2024-03-13 04:18:58 [ERROR]: [server]: Error: Error: Request to Ollama server failed: 500 Internal Server Error
docker-flowise-1  | Error: Error: Request to Ollama server failed: 500 Internal Server Error
docker-flowise-1  |     at buildFlow (/usr/local/lib/node_modules/flowise/dist/utils/index.js:333:19)
docker-flowise-1  |     at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
docker-flowise-1  |     at async App.buildChatflow (/usr/local/lib/node_modules/flowise/dist/index.js:1708:40)
docker-flowise-1  |     at async /usr/local/lib/node_modules/flowise/dist/index.js:1152:13

xiaotianfotos commented 4 months ago

https://github.com/ollama/ollama/issues/2707

upgrade ollama to 0.1.29

automaton82 commented 4 months ago

Enable "Use MMap" usually fixes it.

See #1673

54sww commented 4 months ago

https://github.com/ollama/ollama/issues/2707

upgrade ollama to 0.1.29

I upgraded Ollama to 0.1.29, but it still reports 500 errors. For details, see:

https://github.com/ollama/ollama/issues/2707

BoyuanGao commented 3 months ago

Same issue here.

felipph commented 2 months ago

Looks like the Ollama API is expecting use_mmap as a boolean, but a string is provided:

{
    "prompt": "XXXXX",
    "model": "mxbai-embed-large",
    "keep_alive": "5m",
    "options": {
        "use_mmap": ""
    }
}

I tried to fix it, but I'm not a JS guy and failed miserably...
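A minimal sketch of the fix felipph describes, assuming Flowise passes UI option values as strings: coerce use_mmap to a real boolean (or omit it entirely when unset) before building the request body. The helper name `toOllamaOptions` is hypothetical, not Flowise's actual code; field names mirror the payload above.

```javascript
// Coerce a string-valued UI option into the boolean Ollama expects.
// An empty string (option left unset in the UI) is omitted rather than
// sent as "", which is what triggered the 500 in this issue.
function toOllamaOptions(rawUseMmap) {
  const options = {};
  if (rawUseMmap === true || rawUseMmap === 'true') options.use_mmap = true;
  else if (rawUseMmap === false || rawUseMmap === 'false') options.use_mmap = false;
  return options;
}

// Example request body matching the curl call earlier in this thread.
const body = {
  model: 'nomic-embed-text',
  prompt: 'The sky is blue because of Rayleigh scattering',
  options: toOllamaOptions('true'),
};
console.log(JSON.stringify(body.options)); // {"use_mmap":true}
```

With this coercion, `options` serializes as `{"use_mmap":true}` instead of `{"use_mmap":""}`, which Ollama rejects with a 500.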

HenryHengZJ commented 1 day ago

mmap is now on by default - https://github.com/FlowiseAI/Flowise/pull/2862