Closed: Agent-E11 closed this issue 1 year ago
So running with dev seems to work great. I think it has to do with how Rust runs the `ollama list` command and how it parses that data back to the selector. I have a fix that converts the native OS call to an Ollama API call; I will need to push it. This should fix your problem.
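For reference, a minimal sketch of what that kind of fix can look like: instead of shelling out and parsing `ollama list` text, the selector is populated from the JSON that Ollama's `/api/tags` endpoint returns. The response shape below is an assumption based on the public Ollama HTTP API, and `listModels` is a hypothetical helper, not a function from this codebase.

```typescript
// Hypothetical helper: derive model names for the selector from an
// Ollama /api/tags JSON payload instead of parsing CLI output.
interface TagsResponse {
  models: { name: string }[];
}

function listModels(json: string): string[] {
  const parsed: TagsResponse = JSON.parse(json);
  return parsed.models.map((m) => m.name);
}

// Example payload in the assumed /api/tags shape:
const sample = '{"models":[{"name":"llama2:latest"},{"name":"mistral:7b"}]}';
console.log(listModels(sample)); // ["llama2:latest", "mistral:7b"]
```

Parsing structured JSON like this also sidesteps the platform differences in how the CLI output is formatted, which is presumably why the native call behaved differently across OSes.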
By "work great", do you mean that the problems I mentioned are not problems for you? Also, what happens if you ask a question without selecting a model?
It sends an invalid request to the server (error handling needs to be added). And no, this is not a problem on Mac in its current state, but it does affect the binaries when built for release. Running dev "works great" on Mac and Linux, yes.
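As a sketch of the missing error handling (the names here are hypothetical, not from the actual codebase): validating the selection before building the request body means an invalid request never reaches the server in the first place.

```typescript
// Hypothetical guard: refuse to build a /api/generate request body
// when no model has been selected in the UI.
function buildGenerateRequest(model: string | null, prompt: string) {
  if (!model || model.trim() === "") {
    throw new Error("No model selected; please pick a model first.");
  }
  return { model, prompt, stream: false };
}
```

The thrown error can then be surfaced in the UI instead of silently sending a request the server will reject.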
My 2c re: "Current features" is that we don't really have any yet, it's very early in development and not ready for use yet.
As far as MS Windows support goes, while it's not out of the question, I suspect it will be best effort.
@sammcj That is what I thought (about Windows support). The latest commit that I pulled is coming up with a lot of errors that I don't know how to resolve 🤷
The software isn't a usable product yet; it's in early development and not ready for use.
I am on Windows, running Ollama on WSL. I am able to run the app, but there isn't very much that I can do in the app, and I would like to know how much of that is because I am on Windows, and how much of it is because the features haven't been implemented yet.
I run `pnpm tauri dev` in the repo (with the Ollama server running in the background). The app at `localhost:1420` says "Sending request to http://localhost:11434/api/generate" (good).

Are any of these actual problems? Or have they just not been implemented yet?