aud opened this issue 2 weeks ago
There's a workaround for this at the moment:
echo "hello world" | llm
llm chat -c
The -c means "continue most recent conversation in a chat", which in this case will do the right thing.
Oh sorry, no, I misunderstood. This looks like a bug:
% echo "hello world" | llm chat
Chatting with gpt-4o-mini
Type 'exit' or 'quit' to exit
Type '!multi' to enter multiple lines, then '!end' to finish
> Hello! How can I assist you today?
> Aborted!
I think what's happening there may be that the equivalent of a Ctrl+D (end of input on the piped, non-tty stdin) is being picked up?
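As a minimal illustration of that theory (this script is mine, not from the issue): click.prompt() wraps input(), so the first prompt consumes the single piped line and the next one hits end-of-file, which click converts into an Abort:

# repro.py - illustrative sketch only; run as: echo "hello world" | python repro.py
import click


@click.command()
def repro():
    # First prompt consumes the one piped line ("hello world").
    first = click.prompt("", prompt_suffix="> ")
    click.echo("got: {}".format(first))
    # Second prompt finds the exhausted, non-tty stdin at EOF; click raises
    # Abort, which the command entry point reports as "Aborted!".
    click.prompt("", prompt_suffix="> ")


if __name__ == "__main__":
    repro()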
Tried and failed to fix this like so:
diff --git a/llm/cli.py b/llm/cli.py
index ad7aeb4..e06d14c 100644
--- a/llm/cli.py
+++ b/llm/cli.py
@@ -511,14 +511,24 @@ def chat(
     if not should_stream:
         validated_options["stream"] = False
 
+    piped_input = get_piped_input()
+
     click.echo("Chatting with {}".format(model.model_id))
     click.echo("Type 'exit' or 'quit' to exit")
     click.echo("Type '!multi' to enter multiple lines, then '!end' to finish")
     in_multi = False
     accumulated = []
     end_token = "!end"
+    first = True
+    prompt = None
     while True:
-        prompt = click.prompt("", prompt_suffix="> " if not in_multi else "")
+        if first:
+            first = False
+            prompt = piped_input
+        else:
+            prompt = None
+        if not prompt:
+            prompt = click.prompt("", prompt_suffix="> " if not in_multi else "")
         if prompt.strip().startswith("!multi"):
             in_multi = True
             bits = prompt.strip().split()
@@ -550,6 +560,13 @@ def chat(
         print("")
 
 
+def get_piped_input():
+    """Read initial input pipe if it exists, otherwise return None"""
+    if not sys.stdin.isatty():
+        return sys.stdin.read().strip()
+    return None
+
+
 def load_conversation(conversation_id: Optional[str]) -> Optional[Conversation]:
     db = sqlite_utils.Database(logs_db_path())
     migrate(db)
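My guess at why that attempt still aborts (an assumption, not something verified in the thread): even after the piped text is consumed, the later click.prompt() calls keep reading from the same exhausted, non-tty stdin, so the second prompt immediately hits end-of-file again. One possible direction, sketched below and untested, assumes a POSIX system with a controlling terminal at /dev/tty and reattaches stdin to it after draining the pipe:

import sys


def get_piped_input():
    """Read piped stdin if present, then try to reattach stdin to the terminal."""
    if sys.stdin.isatty():
        return None
    piped = sys.stdin.read().strip()
    try:
        # POSIX-only assumption: reopen the controlling terminal so that the
        # interactive click.prompt() loop has something to read after the pipe.
        sys.stdin = open("/dev/tty")
    except OSError:
        # No terminal available (fully non-interactive); later prompts will
        # still hit EOF in that case.
        pass
    return piped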
Would you be open to supporting passing in an optional initial message to chat requests? For example, I would expect this to work, but it seems to crash after the first response:

echo "hello world" | llm chat

This is what I'd actually like to do:

files-to-prompt lol.txt --cxml | llm chat