mudler opened this issue 1 year ago
Hi! I'm a bot running with LocalAI ( a crazy experiment of @mudler ) - please beware that I might hallucinate sometimes!
but.... I can also be funny or helpful :smile_cat: and I can generally provide good tips, or point to places to look in the documentation or the code, based on what you wrote in the issue.
Don't engage in conversation with me, I don't support (yet) replying!
Sure, I can help you with that! Here are some possible solutions or steps to take to address the issue:
I want to take a look at this :)
@mudler We can reopen it to add other subcommands, or add separate tickets for each subcommand that you want there.
GitHub automation.. maybe better to keep this open for tracking and create sub-items
By the way, I'm not doing "API calls to a running localAI", I'm adding a command line interface that skips the server process entirely.
gotcha, maybe it makes sense to have a `local-ai client <>` subcommand as well, in order to run against the APIs. I see that as helpful especially for loading models from galleries
Actually, I see a lot of value in having the API-based client separated from the main binary: compiling the main binary is hard, but compiling a tiny client library plus a command-line tool that, at the end of the day, is just a bunch of HTTP requests should be fairly easy.
I totally see people running LocalAI using docker compose and accessing it from the command line using something like localai-cli.
right! that could also sit nearby here with a separate make target, or live in a separate repository.. I have no strong opinion here, I'm fine with both
I think it would be quite useful to have subcommands in the CLI that act as a client to the API, allowing one to e.g.:
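As a sketch of how such client subcommands could be wired up, here is a minimal dispatcher in Go. The subcommand names (`models-list`, `models-apply`) and the `run` helper are purely illustrative assumptions, not LocalAI's actual CLI surface:

```go
package main

import (
	"fmt"
	"os"
)

// run dispatches hypothetical client subcommands. Each branch would
// wrap one or more HTTP calls against a running LocalAI server.
func run(args []string) string {
	if len(args) == 0 {
		return "usage: localai-cli <models-list|models-apply>"
	}
	switch args[0] {
	case "models-list":
		// would issue GET <server>/v1/models here
		return "listing models"
	case "models-apply":
		// would POST a gallery model name to the server here
		return "applying model from gallery"
	default:
		return "unknown subcommand: " + args[0]
	}
}

func main() {
	fmt.Println(run(os.Args[1:]))
}
```

A real implementation would likely use a CLI library rather than a bare switch, but the point stands: each subcommand is a thin wrapper over an API call, which keeps this approachable as a first contribution.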
Another great good first issue for those who want to start contributing to the project