Open wenlong1234 opened 6 months ago
Specify the API base in the UI; it should be the same as GEN_AI_API_ENDPOINT, so try http://host.docker.internal:11434
Thanks! Setting GEN_AI_API_ENDPOINT / the API base to http://host.docker.internal:11434/ is right, that worked.
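For anyone else setting this up, a minimal sketch of what that looks like in the .env file used by the Danswer docker compose stack (the file location is an assumption on my side; GEN_AI_API_ENDPOINT and the default Ollama port 11434 are from this thread):

# .env for the Danswer docker compose stack (assumed location)
# Point Danswer at the Ollama server running on the Docker host
GEN_AI_API_ENDPOINT=http://host.docker.internal:11434

The same URL is what goes into the API base field in the UI.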
I'm having the same issue: that endpoint (http://host.docker.internal:11434/) does not reach Ollama on Windows 11 Home. In Danswer you will see a "404 page not found" error message. If instead you use http://docker.for.win.localhost:11434/ in the browser, you do get a response, and an entry appears in the Ollama server log:
[GIN] 2024/05/15 - 07:36:44 | 200 | 227.4µs | 127.0.0.1 | GET "/"
Unfortunately that does not yet work inside Danswer, and you get the same error: 'NoneType' object has no attribute 'request'. Hope that helps somehow.
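One way to narrow this down is to test reachability from inside one of the Danswer containers rather than from the Windows host. A rough sketch (the container name below is hypothetical, check docker ps for the real one, and this assumes curl is available in that image):

docker ps
docker exec -it danswer-api-server curl http://host.docker.internal:11434/

If the networking is fine you should get back "Ollama is running"; a connection error here means the container cannot see the host at all, independent of Danswer.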
As an additional test, I queried the llama2 model with curl and it worked:
C:\>curl -X POST -H "Content-type: application/json" --data "{\"model\": \"llama2\", \"prompt\": \"3*127?\"}" http://docker.for.win.localhost:11434/api/generate
{"model":"llama2","created_at":"2024-05-20T20:28:20.8647932Z","response":"\n","done":false}
{"model":"llama2","created_at":"2024-05-20T20:28:20.9152719Z","response":" multip","done":false}
{"model":"llama2","created_at":"2024-05-20T20:28:20.9647467Z","response":"lying","done":false}
{"model":"llama2","created_at":"2024-05-20T20:28:21.0157082Z","response":" ","done":false}
{"model":"llama2","created_at":"2024-05-20T20:28:21.0663006Z","response":"3","done":false}
{"model":"llama2","created_at":"2024-05-20T20:28:21.116693Z","response":" by","done":false}
{"model":"llama2","created_at":"2024-05-20T20:28:21.1688001Z","response":" ","done":false}
{"model":"llama2","created_at":"2024-05-20T20:28:21.2184965Z","response":"1","done":false}
{"model":"llama2","created_at":"2024-05-20T20:28:21.263098Z","response":"2","done":false}
{"model":"llama2","created_at":"2024-05-20T20:28:21.3061472Z","response":"7","done":false}
{"model":"llama2","created_at":"2024-05-20T20:28:21.3538813Z","response":" gives","done":false}
{"model":"llama2","created_at":"2024-05-20T20:28:21.4051607Z","response":" us","done":false}
{"model":"llama2","created_at":"2024-05-20T20:28:21.4486889Z","response":":","done":false}
{"model":"llama2","created_at":"2024-05-20T20:28:21.492479Z","response":"\n","done":false}
{"model":"llama2","created_at":"2024-05-20T20:28:21.5373874Z","response":"\n","done":false}
{"model":"llama2","created_at":"2024-05-20T20:28:21.582978Z","response":"3","done":false}
{"model":"llama2","created_at":"2024-05-20T20:28:21.6263964Z","response":" x","done":false}
{"model":"llama2","created_at":"2024-05-20T20:28:21.6715274Z","response":" ","done":false}
{"model":"llama2","created_at":"2024-05-20T20:28:21.7147195Z","response":"1","done":false}
{"model":"llama2","created_at":"2024-05-20T20:28:21.7593716Z","response":"2","done":false}
{"model":"llama2","created_at":"2024-05-20T20:28:21.802122Z","response":"7","done":false}
{"model":"llama2","created_at":"2024-05-20T20:28:21.8443953Z","response":" =","done":false}
{"model":"llama2","created_at":"2024-05-20T20:28:21.8869676Z","response":" ","done":false}
{"model":"llama2","created_at":"2024-05-20T20:28:21.9308388Z","response":"3","done":false}
{"model":"llama2","created_at":"2024-05-20T20:28:21.9740217Z","response":"8","done":false}
{"model":"llama2","created_at":"2024-05-20T20:28:22.0167204Z","response":"1","done":false}
{"model":"llama2","created_at":"2024-05-20T20:28:22.0601185Z","response":"","done":true,"done_reason":"stop","context":[518,25580,29962,3532,14816,29903,29958,5299,829,14816,29903,6778,13,13,29941,29930,29896,29906,29955,29973,518,29914,25580,29962,13,13,6674,5890,29871,29941,491,29871,29896,29906,29955,4076,502,29901,13,13,29941,921,29871,29896,29906,29955,353,29871,29941,29947,29896],"total_duration":1598713300,"load_duration":5054400,"prompt_eval_duration":398017000,"eval_count":27,"eval_duration":1194464000}
C:\>
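The streamed output above is one JSON chunk per token, which is hard to read. Ollama's /api/generate also accepts "stream": false, which returns the whole answer as a single JSON object, e.g.:

C:\>curl -X POST -H "Content-type: application/json" --data "{\"model\": \"llama2\", \"prompt\": \"3*127?\", \"stream\": false}" http://docker.for.win.localhost:11434/api/generate

That makes it easier to confirm the model is answering correctly before wiring it into Danswer.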
It seems a configuration change was needed in Docker to make it work. After that, using http://host.docker.internal:11434, I was able to connect.
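In case it helps others, this is only a guess at the kind of Docker configuration meant above. With plain docker run, the host.docker.internal name can be mapped explicitly to the host:

# map host.docker.internal to the host gateway (Docker 20.10+)
docker run --add-host=host.docker.internal:host-gateway ...

The docker compose equivalent is an extra_hosts entry of "host.docker.internal:host-gateway" on the relevant service. On Docker Desktop for Windows the name usually resolves out of the box, so this is mostly needed on Linux hosts.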
Hi, did you just click the checkmark and it worked? Or do I have to start the Danswer container with --net=host? If so, can you share the full command? I can't figure it out.
In my case, clicking that option was enough.
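For completeness, host networking would look roughly like:

docker run --network host ...

but that is a Linux feature and has historically not worked on Docker Desktop for Windows, so if the checkmark option plus host.docker.internal works for you, there should be no need for it.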
I installed llama2 and llama3 through Ollama on Windows, and Danswer is also installed on Windows, but I cannot use the local llama2 or llama3 models. Is something wrong with my environment?