I mean: instead of opening a terminal window to run an Ollama model, add a Copilot-like windowed UI so one can interact with the LLM models directly from the tray, like a messaging app.
BTW, I recall a similar approach in COSMIC. I mean something like that...