Open varun-g91 opened 3 months ago
Can you attach the ollama logs? Seems to be some networking issue (first guess)
@varun-g91 I had the same issue earlier; it probably has to do with ngrok tunneling on ollama versions > 1.2.8
can you try replacing:
run_process(['ngrok', 'http', '--log', 'stderr', '11434'])
with:
run_process(['ngrok', 'http', '--log', 'stderr', '11434', '--host-header', 'localhost:11434'])
and see if it works.
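As a sketch of why the extra flag helps: ngrok normally forwards requests with the public tunnel hostname in the `Host` header, which the ollama server rejects; `--host-header` tells ngrok to rewrite it to the local address. The helper below is hypothetical — the thread's `run_process` definition isn't shown, so this assumes it just spawns the command with `subprocess`:

```python
import subprocess

def ngrok_command(port: int) -> list[str]:
    # Build the ngrok invocation from the suggestion above. The key addition
    # is --host-header: ngrok rewrites the Host header on forwarded requests
    # to localhost:<port>, which ollama accepts, instead of the public
    # tunnel hostname, which it rejects.
    return ['ngrok', 'http', '--log', 'stderr', str(port),
            '--host-header', f'localhost:{port}']

def run_process(cmd: list[str]) -> subprocess.Popen:
    # Hypothetical stand-in for the helper used in the thread: start the
    # command and merge stderr into stdout for logging.
    return subprocess.Popen(cmd, stdout=subprocess.PIPE,
                            stderr=subprocess.STDOUT)

# Usage (requires ngrok on PATH and ollama listening on port 11434):
# proc = run_process(ngrok_command(11434))
```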
@skt7 I merged your PR! @varun-g91 check if this helps please
Okay, I'll check and let you know, thanks.
yes sure.
run_process(['ngrok', 'http', '--log', 'stderr', '11434', '--host-header', 'localhost:11434'])
Thank you so much, it actually worked. Now I can run AI models on my laptop.
Error: something went wrong, please see the ollama server logs for details