tcsenpai / spacellama

Do What The F*ck You Want To Public License

403 error #1

Closed gauravagnihotristla closed 1 month ago

gauravagnihotristla commented 1 month ago

I am running an Ollama model locally on my Ubuntu 20.04 machine. I can verify the service is working correctly on localhost and 0.0.0.0:

curl http://0.0.0.0:11434/api/generate -d '{ "model": "llama3.2", "prompt": "What is water made of?", "stream" : false }'

{"model":"llama3.2","created_at":"2024-10-12T01:19:13.325620726Z","response":"Water is a compound made up of two elements: hydrogen and oxygen. It's composed of one molecule of each element, with the chemical formula H2O.\n\nIn more detail, each molecule of water consists of:\n\n* Two hydrogen atoms (H) bonded to\n* One oxygen atom (O)\n\nThis bond between the hydrogen and oxygen atoms is what gives water its unique properties, such as its ability to dissolve a wide range of substances and support life as we know it.","done":true,"done_reason":"stop","context":[128006,9125,128007,271,38766,1303,33025,2696,25,6790,220,2366,18,271,128009,128006,882,128007,271,3923,374,3090,1903,315,30,128009,128006,78191,128007,271,29353,374,264,24549,1903,709,315,1403,5540,25,35784,323,24463,13,1102,596,24306,315,832,43030,315,1855,2449,11,449,279,11742,15150,473,17,46,382,644,810,7872,11,1855,43030,315,3090,17610,315,1473,9,9220,35784,33299,320,39,8,70241,311,198,9,3861,24463,19670,320,46,696,2028,11049,1990,279,35784,323,24463,33299,374,1148,6835,3090,1202,5016,6012,11,1778,439,1202,5845,311,73739,264,7029,2134,315,33155,323,1862,2324,439,584,1440,433,13],"total_duration":2155936003,"load_duration":1183158420,"prompt_eval_count":31,"prompt_eval_duration":22426000,"eval_count":96,"eval_duration":947470000}

However, the extension returns a 403 Forbidden error.

Upon further investigation, it appears to be a CORS issue. I followed this guide to set OLLAMA_ORIGINS=*.

I am aware this is unsafe, hence I am submitting this issue; maybe there is a better workaround that I am not aware of?
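For reference, on Ubuntu the variable usually has to be set in the systemd service environment rather than in a shell, since `ollama serve` typically runs as a service. A minimal sketch, assuming the stock `ollama.service` unit installed by the official installer:

```shell
# Sketch for Ubuntu, assuming Ollama runs under the stock systemd unit.
# Create a drop-in override so the service process sees OLLAMA_ORIGINS.
sudo systemctl edit ollama.service
# In the editor that opens, add:
#   [Service]
#   Environment="OLLAMA_ORIGINS=*"
# Then reload and restart so the change takes effect:
sudo systemctl daemon-reload
sudo systemctl restart ollama
```

Exporting the variable in your own shell has no effect on an already-running service, which is an easy way to conclude the setting "doesn't work".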

tcsenpai commented 1 month ago

Unfortunately, I had to do it as well. I think this happens only in certain situations, like browser extensions, as I am usually able to call Ollama locally (and even behind a reverse proxy) without problems.

Closing this because it is an Ollama-based issue. I would suggest setting OLLAMA_ORIGINS to your subnet if you only use it locally.
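As a sketch of that narrower approach (the values below are hypothetical and need adjusting to your own network and browser), OLLAMA_ORIGINS accepts a comma-separated list of origins and supports wildcards, so you can avoid the blanket `*`:

```shell
# Hypothetical, narrower alternatives to OLLAMA_ORIGINS="*":

# allow only browser-extension origins (Firefox / Chrome schemes)
export OLLAMA_ORIGINS="moz-extension://*,chrome-extension://*"

# or allow only hosts on your LAN subnet (adjust to your network)
export OLLAMA_ORIGINS="http://192.168.1.*"
```

This keeps arbitrary websites from calling your local Ollama instance while still letting the extension through.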

Migelo commented 1 month ago

Just wanted to add that the same happens on macOS.

tcsenpai commented 1 month ago

> Just wanted to add that the same happens on macOS.

On macOS it is easier. You just have to export OLLAMA_ORIGINS="*" and you should be good. You might want to add it to your .zshrc file.
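A quick sketch of both variants on macOS. Note the assumption here: `.zshrc` only affects `ollama serve` started from a terminal, while the menu-bar Ollama app needs the variable set via `launchctl`:

```shell
# Persist the variable for terminal sessions
# (picked up by "ollama serve" started from a shell):
echo 'export OLLAMA_ORIGINS="*"' >> ~/.zshrc

# The menu-bar Ollama.app does not read .zshrc; set it for GUI apps:
launchctl setenv OLLAMA_ORIGINS "*"
# ...then quit and restart the Ollama app so it picks the value up.
```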

Migelo commented 1 month ago

That's exactly what I did :)

Migelo commented 1 month ago

And to solve it permanently I did this.

Quoting here for convenience.

> @kellerkind84 is this working now? It's covered in the FAQ. As @dims mentioned, just:
>
> 1. stop the ollama application;
> 2. run `launchctl setenv OLLAMA_HOST "0.0.0.0"`
> 3. restart ollama
> 4. check in `~/.ollama/logs` to see if "Listening on [::]:11434" is in one of the log files

Make sure you're looking in the last log file for step 4.
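Steps 4 can be sketched as a one-liner, assuming the log directory above (the exact log filenames vary between Ollama versions, so `server.log` here is an assumption; listing the directory by modification time finds the newest file either way):

```shell
# List logs newest-first, then search the most recent one.
# server.log is an assumed filename; adjust to what ls shows.
ls -t ~/.ollama/logs
grep "Listening" ~/.ollama/logs/server.log
```

If the grep shows `Listening on [::]:11434`, the server is bound to all interfaces as intended.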