tutacat opened this issue 1 month ago
Posted in that PR:

> It's important that `echo "prompt" | llm` continues to work, so I'm not going to accept this change. But... I do agree that having `llm` on its own hang is confusing, so I'll look into fixing this in a way that detects piped input instead.
The PR is just meant as a draft. The point is that it is more robust and simpler to change the default to run help than to implement detection. Detection is not difficult either, but it requires changing the flow of the CLI more.
I think one way to do this is to add a new "detect" function as the default, which checks for non-interactive stdin. Alternatively, you could allow interactive stdin with an info message, though that use case is better served by the `chat` TUI.
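A minimal sketch of what that "detect" default could look like, assuming a helper (the function name and structure here are hypothetical, not from the llm codebase) that branches on whether stdin is a TTY:

```python
import sys


def choose_default_action(stdin_is_tty: bool) -> str:
    """Hypothetical sketch: decide what a bare `llm` invocation should do.

    If stdin is an interactive terminal, nothing is being piped in, so
    show help instead of hanging. If stdin is not a TTY, input is being
    piped (e.g. `echo "prompt" | llm`), so treat it as a prompt.
    """
    if stdin_is_tty:
        return "help"
    return "prompt"


# At CLI startup this would be wired up roughly as:
# action = choose_default_action(sys.stdin.isatty())
```

This keeps `echo "prompt" | llm` working while a bare `llm` in a terminal falls through to help rather than blocking on input.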
Running inference by default is often not a great user interface for a CLI. It feels kind of like `rm` not running in interactive mode by default on most computers. `llm` feels much more like `git` or a package manager than a one-task tool such as an interpreter like python or a compiler.

This really comes to a head if you misspell a sub-command, because `llm` will run the typo as an LLM inference rather than show an error for the mistyped command. The same happens if you uninstall a plugin without remembering, since its keywords are only registered while it is installed, and if you redirect a pipe into `llm` with no arguments.
It feels much better to have to distinctly write `llm prompt` for it to run generative text, because `llm` is used for so many other tools.

The PR is here if you are interested (it would be a very simple change).