hoof-ai / hoof

"Just hoof it!" - A spotlight like interface to Ollama
https://hoof.ing
MIT License
56 stars · 6 forks

Current features #19

Closed by Agent-E11 1 year ago

Agent-E11 commented 1 year ago

I am on Windows, running Ollama on WSL. I am able to run the app, but there isn't much I can actually do in it, and I would like to know how much of that is because I am on Windows, and how much is because the features haven't been implemented yet.

Are any of these actual problems? Or have they just not been implemented yet?

Dax911 commented 1 year ago

So running with dev seems to work great. I think it has to do with how the Rust side runs the `ollama list` command and how it parses that data back to the selector. I have a fix that converts the native OS call into an Ollama API call; I will need to push it. This should fix your problem.
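The idea above can be sketched as follows: instead of shelling out to the `ollama list` CLI and scraping its table output, query Ollama's HTTP API (`GET /api/tags`, which returns the local models as JSON) and extract the model names for the selector. This is a minimal, dependency-free sketch, not hoof's actual code; the naive substring scan stands in for a real JSON parser, and only the `/api/tags` response shape is taken from Ollama's documented API.

```rust
// Hypothetical sketch: pull model names out of the JSON that Ollama's
// GET /api/tags endpoint returns, e.g.
//   {"models":[{"name":"llama2:latest", ...}, {"name":"mistral:7b", ...}]}
// In a real app you would fetch this over HTTP and use a JSON library;
// here we scan for every `"name":"..."` value by hand to stay std-only.
fn model_names(tags_json: &str) -> Vec<String> {
    let mut names = Vec::new();
    let mut rest = tags_json;
    while let Some(i) = rest.find("\"name\":\"") {
        let start = i + "\"name\":\"".len();
        match rest[start..].find('"') {
            Some(len) => {
                names.push(rest[start..start + len].to_string());
                rest = &rest[start + len..];
            }
            None => break, // malformed tail; stop rather than panic
        }
    }
    names
}

fn main() {
    let sample = r#"{"models":[{"name":"llama2:latest"},{"name":"mistral:7b"}]}"#;
    assert_eq!(model_names(sample), vec!["llama2:latest", "mistral:7b"]);
    println!("parsed {} models", model_names(sample).len());
}
```

Parsing a stable JSON API response is also what makes the behavior uniform across platforms, whereas scraping CLI output can differ between Windows/WSL and macOS/Linux.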

Agent-E11 commented 1 year ago

By "work great", do you mean that the problems I mentioned are not problems for you? Also, if you ask a question without selecting a model, what happens?

Dax911 commented 1 year ago

It sends an invalid request to the server (error handling needs to be added). And no, this is not a problem on Mac in its current state, but it does affect the binaries when built for release. Running dev "works great" on Mac and Linux, yes.
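The missing error handling described above could look something like this: validate that a model is actually selected before building the request body, and surface an error instead of sending an invalid request to the server. `build_generate_request` and its field names are assumptions for illustration, not hoof's real code; only the `model`/`prompt` fields of Ollama's `/api/generate` request body come from the documented API.

```rust
// Hypothetical sketch: guard against firing a request with no model selected.
// Returns the JSON body for Ollama's /api/generate on success, or an error
// message the UI can display instead of sending a bad request.
fn build_generate_request(model: Option<&str>, prompt: &str) -> Result<String, String> {
    let model = model.ok_or_else(|| "no model selected".to_string())?;
    if model.is_empty() {
        return Err("no model selected".to_string());
    }
    // JSON assembled by hand for the sketch; a real app would use a JSON library
    // so that quotes in `prompt` are escaped properly.
    Ok(format!(r#"{{"model":"{}","prompt":"{}"}}"#, model, prompt))
}

fn main() {
    // No selection: the caller gets an error instead of an invalid request.
    assert!(build_generate_request(None, "hello").is_err());
    // With a selection: a well-formed body is produced.
    let body = build_generate_request(Some("llama2"), "hello").unwrap();
    assert!(body.contains("llama2"));
    println!("{}", body);
}
```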

sammcj commented 1 year ago

My 2c re: "Current features" is that we don't really have any yet; it's very early in development and not ready for use.

As far as MS Windows support goes, while it's not out of the question, I suspect it will be best effort.

Agent-E11 commented 1 year ago

@sammcj That is what I thought (about Windows support). The latest commit that I pulled is coming up with a lot of errors that I don't know how to resolve 🤷

sammcj commented 1 year ago

The software isn't a usable product yet; it's in early development.