marcogreiveldinger / videos

This is a repository for my YouTube videos, where you can find code snippets, links, and other tutorials.
MIT License

I did all the steps carefully and got my OLLAMA_HOST link as well, but when I run the export command in my terminal with the link provided by Jupyter (Colab), it gives me an error message: #11

Open · varun-g91 opened this issue 3 months ago

varun-g91 commented 3 months ago

```
Error: something went wrong, please see the ollama server logs for details
```

marcogreiveldinger commented 3 months ago

Can you attach the Ollama logs? It seems to be some networking issue (first guess).

skt7 commented 3 months ago

@varun-g91 I had the same issue earlier; it probably has to do with ngrok tunneling with Ollama versions > 1.2.8.

Can you try replacing:

```python
run_process(['ngrok', 'http', '--log', 'stderr', '11434'])
```

with:

```python
run_process(['ngrok', 'http', '--log', 'stderr', '11434', '--host-header', 'localhost:11434'])
```

and see if it works?
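For context, here is a minimal sketch of what that change does. The `run_process` helper below is a hypothetical stand-in for the one in the Colab notebook (its real implementation is not shown in this thread); the two command lists are taken from the snippets above. The idea is that ngrok's `--host-header` flag rewrites the `Host` header of tunneled requests to `localhost:11434`, which, per the comment above, newer Ollama versions expect before accepting a request:

```python
import subprocess

def run_process(cmd):
    # Hypothetical stand-in for the notebook's run_process helper:
    # spawn the command in the background and return the process handle.
    return subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)

# Original tunnel command: ngrok forwards public traffic to port 11434,
# but leaves the Host header set to the public ngrok hostname.
old_cmd = ['ngrok', 'http', '--log', 'stderr', '11434']

# Suggested fix: --host-header rewrites the Host header to localhost:11434
# on every tunneled request, so Ollama treats it as a local request.
new_cmd = ['ngrok', 'http', '--log', 'stderr', '11434',
           '--host-header', 'localhost:11434']

# In the notebook you would then start the tunnel with: run_process(new_cmd)
```

The fix touches only the ngrok invocation; nothing else in the notebook needs to change.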

marcogreiveldinger commented 3 months ago

@skt7 I merged your PR! @varun-g91 check if this helps please

varun-g91 commented 3 months ago

> @varun-g91 had the same issue earlier, probably it has to do with ngrok tunneling with ollama version > 1.2.8
>
> can you try replacing: `run_process(['ngrok', 'http', '--log', 'stderr', '11434'])`
>
> with: `run_process(['ngrok', 'http', '--log', 'stderr', '11434', '--host-header', 'localhost:11434'])`
>
> and see if it works.

Okay, I'll check and let you know. Thanks.

varun-g91 commented 3 months ago

> @skt7 I merged your PR! @varun-g91 check if this helps please

Yes, sure.

varun-g91 commented 3 months ago

```python
run_process(['ngrok', 'http', '--log', 'stderr', '11434', '--host-header', 'localhost:11434'])
```

Thank you so much, it actually worked. Now I can run AI models on my laptop.