Open nongrata081 opened 1 week ago
Came to ask the same thing!
Native Ollama support would be amazing.
Hey @nongrata081 and @sammcj!
This is something we're considering adding. I've added this to our internal tracker. We'll keep you posted!
@Nemikolh awesome! waiting for a new release!
tagging myself as well
I also need that!
also tagging this issue for updates.
I also need ollama support. Thank you!
This would be a great feature and use case. Please add it as soon as possible.
tagging me
Is your feature request related to a problem? Please describe:
It would be nice to be able to run bolt.new locally with local language models. Is that something the bolt.new team might consider implementing?
Describe the solution you'd like: Use local Language Models (e.g. via Ollama)
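For anyone experimenting in the meantime, here is a minimal sketch of how a client could talk to a locally running Ollama server, which listens on `http://localhost:11434` and exposes a `POST /api/chat` endpoint. This is not bolt.new code; the `buildChatRequest` helper and the model name are illustrative assumptions.

```typescript
// Sketch only: building a request body for Ollama's local /api/chat endpoint.
// `buildChatRequest` is a hypothetical helper, not part of bolt.new or Ollama.

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface OllamaChatRequest {
  model: string;
  messages: ChatMessage[];
  stream: boolean;
}

// Assemble the JSON payload Ollama's /api/chat endpoint expects.
function buildChatRequest(
  model: string,
  messages: ChatMessage[],
): OllamaChatRequest {
  return { model, messages, stream: false };
}

const body = buildChatRequest("codellama", [
  { role: "user", content: "Write a hello-world script in JavaScript." },
]);

// In a real client you would POST this to the local server, e.g.:
// await fetch("http://localhost:11434/api/chat", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(body),
// });
console.log(JSON.stringify(body));
```

Ollama also exposes an OpenAI-compatible endpoint (`/v1/chat/completions`), so apps built against the OpenAI client libraries can often be pointed at a local model just by overriding the base URL.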