epheterson opened 1 month ago
This seems to be a first-time-launch issue: once there is a cached list of models, Ollamac selects from the cache even while Ollama is closed. On subsequent tries the chat works fine after launching Ollama, with no manual step needed.
Great update by the way, this experience is much smoother! 🔥🦙
My first chat looks like this, with no model selected.
Correct flows show a model in the header.
Confirmed bug @epheterson 😂 but it can be worked around by clicking "Try Again" after opening Ollama.
Thanks! I did click Try Again, though. Maybe I clicked it before launching Ollama and it only appeared once? It's a small bug anyway; people can find their way out.
I just went through these steps and hit a bug:
There seems to be an edge case where, if Ollamac is launched while Ollama is closed, it doesn't self-heal without manually selecting a model.
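For what it's worth, the self-heal could look something like the sketch below: prefer a live fetch of the model list, fall back to the cache when the server is unreachable, and auto-pick the first available model instead of leaving the header empty. This is just a minimal illustration in Python, not Ollamac's actual code; `resolve_model` and `fetch_models` are hypothetical names, and a `None` return from `fetch_models` stands in for "Ollama is closed".

```python
from typing import Callable, List, Optional

def resolve_model(cached: List[str],
                  fetch_models: Callable[[], Optional[List[str]]],
                  selected: Optional[str] = None) -> Optional[str]:
    """Pick a usable model, preferring a live fetch over a stale cache.

    fetch_models() returns None when the Ollama server is unreachable.
    """
    live = fetch_models()
    models = live if live is not None else cached
    if selected is not None and selected in models:
        return selected
    # Self-heal: instead of keeping an empty selection, fall back to the
    # first available model (or None when there is truly nothing to pick).
    return models[0] if models else None

# The reported bug corresponds to the case where both the live fetch and
# the cache are empty on first launch:
print(resolve_model([], lambda: None))          # first launch, server down
print(resolve_model([], lambda: ["llama3"]))    # server up, cache empty
print(resolve_model(["llama3"], lambda: None))  # server down, cache warm
```

Re-running this resolution on every "Try Again" click (and on each send attempt) would make the first-launch case recover as soon as Ollama comes up, without requiring a manual model selection.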