ollama / ollama

Get up and running with Llama 3.1, Mistral, Gemma 2, and other large language models.
https://ollama.com
MIT License
89.29k stars 7k forks

why 404? #5294

Open windkwbs opened 2 months ago

windkwbs commented 2 months ago

(WeChat screenshot attached: 微信截图_20240626152830)

d-kleine commented 2 months ago

Have you tried http://localhost:11434, as shown here: https://ollama.com/blog/openai-compatibility
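For reference, the OpenAI-compatible endpoint that blog post describes takes plain POST requests at `/v1/chat/completions`. A minimal sketch of building such a request (the model name `llama3` is an assumption, and actually sending it requires a running Ollama server on the default port):

```python
import json
import urllib.request

# Assumed default: Ollama's standard local address and port.
BASE_URL = "http://localhost:11434"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) a POST to the OpenAI-compatible chat endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{BASE_URL}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",  # the API is called with POST, not OPTIONS
    )

req = build_chat_request("llama3", "Hello!")
print(req.get_method(), req.full_url)
# With Ollama running, send it via: urllib.request.urlopen(req)
```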

windkwbs commented 2 months ago

It still doesn't work. Does this mean Windows systems can only use POST and not support OPTIONS?

d-kleine commented 2 months ago

No, it should work on Windows too (HTTP methods are OS-agnostic), so it must be a different problem. Maybe your port is blocked by a firewall or antivirus software. You could try debugging with Postman (it has an OPTIONS method).
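To see what an OPTIONS preflight looks like outside Postman, here is a self-contained sketch: a throwaway local HTTP server (a stand-in for testing, not Ollama itself) that answers OPTIONS with CORS headers, plus a client sending the preflight. A server that does not handle OPTIONS for a path would return an error status instead, which is what a failed browser preflight looks like:

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class StubHandler(BaseHTTPRequestHandler):
    """Minimal stand-in for an API server that supports CORS preflight."""
    def do_OPTIONS(self):
        self.send_response(204)  # preflight responses carry no body
        self.send_header("Access-Control-Allow-Origin", "*")
        self.send_header("Access-Control-Allow-Methods", "POST, OPTIONS")
        self.send_header("Access-Control-Allow-Headers", "Content-Type")
        self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

# Port 0 lets the OS pick a free port; run the server in the background.
server = HTTPServer(("127.0.0.1", 0), StubHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Send the kind of OPTIONS request a browser preflight would issue.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("OPTIONS", "/api/chat")
resp = conn.getresponse()
print(resp.status, resp.getheader("Access-Control-Allow-Methods"))
server.shutdown()
```

Pointing the same OPTIONS request at the real Ollama port would show whether the 404 comes from the server itself or from something in between (firewall, proxy, antivirus).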

d-kleine commented 2 months ago

I just took a look at my console while doing inference with Ollama; it only sends POST requests to "api/chat".