Open dctfor opened 5 days ago
It's a nice thought, but I run Ollama on another machine, so it'd be better to keep it the way it is.
Here are my recent comments on the fix for the model with Ollama. Now I'll double-check the faulty baseURL, which might be a non-issue if the only problem was the actual model selection:
https://github.com/coleam00/bolt.new-any-llm/issues/259#issuecomment-2481266958
It should now work with whatever IP address and model you want to use.
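As a minimal sketch of pointing the app at a remote machine, assuming the base URL is read from an `OLLAMA_API_BASE_URL` variable (check the repo's `.env.example` for the exact name):

```
# .env.local — variable name is an assumption, adjust to the repo's .env.example
OLLAMA_API_BASE_URL=http://192.168.1.50:11434
```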
Taking a look at this today
Works for me in combination with removing the `.env` and `.env.local` entries from the `.dockerignore` file. That's needed because otherwise Ollama doesn't get called correctly, I think. (Or you can define the variables directly in the docker-compose.yaml file, as sketched below.)
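For the docker-compose alternative, a minimal sketch could look like this; the service name, port, and `OLLAMA_API_BASE_URL` variable are assumptions to be matched against the repo's actual setup:

```yaml
services:
  bolt:
    build: .
    environment:
      # Assumed variable name; defining it here bypasses the .env file
      # that .dockerignore would otherwise exclude from the build context.
      - OLLAMA_API_BASE_URL=http://host.docker.internal:11434
    ports:
      - "5173:5173"
```

`host.docker.internal` lets the container reach an Ollama instance on the host under Docker Desktop; on Linux you may need `extra_hosts: ["host.docker.internal:host-gateway"]`, or use the machine's LAN IP for a remote instance.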
You identified that `*.local` in `.dockerignore` is likely causing some Docker-related issues. Please see #329 for details, and feel free to provide feedback there.
This fixes the issue of not being able to use local Ollama, but it's a hard-coded fix. There is still work to be done so the model can be chosen dynamically from the dropdown: the Claude default was swapped out, the model is currently set to "llama3.1:8b", and the Ollama endpoint is hardcoded to 127.0.0.1:11434.
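As a hedged sketch of what the dynamic version could look like, assuming the `ollama-ai-provider` package and an `OLLAMA_API_BASE_URL` environment variable (names are illustrative, not the repo's confirmed API):

```ts
import { createOllama } from 'ollama-ai-provider';

// Fall back to the local default only when nothing is configured.
const baseURL =
  process.env.OLLAMA_API_BASE_URL ?? 'http://127.0.0.1:11434';

// The provider's default base URL ends in /api, so append it here
// (assumption based on the package's documented default).
const ollama = createOllama({ baseURL: `${baseURL}/api` });

// Accept the model chosen in the dropdown instead of a fixed string.
export function getOllamaModel(model: string = 'llama3.1:8b') {
  return ollama(model);
}
```

With something like this, both the endpoint and the model come from configuration and the dropdown selection, instead of being baked into the source.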