simonw / llm

Access large language models from the command-line
https://llm.datasette.io
Apache License 2.0

`llm chat` errors on followup since 0.6 #601

Closed · yorickvP closed this 2 weeks ago

yorickvP commented 3 weeks ago
$ llm chat
> hello
how are you?
> not much, how are you?
Traceback (most recent call last):
  File "/nix/store/bhbq5lpdd2ncpn7b12w8d5sl6x9xm4gy-python3.12-llm-0.17/bin/.llm-wrapped", line 9, in <module>
    sys.exit(cli())
             ^^^^^
  File "/nix/store/c2d99rarg0lj5lacxghi4xwxfas6w8vf-python3-3.12.4-env/lib/python3.12/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nix/store/c2d99rarg0lj5lacxghi4xwxfas6w8vf-python3-3.12.4-env/lib/python3.12/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
         ^^^^^^^^^^^^^^^^
  File "/nix/store/c2d99rarg0lj5lacxghi4xwxfas6w8vf-python3-3.12.4-env/lib/python3.12/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nix/store/c2d99rarg0lj5lacxghi4xwxfas6w8vf-python3-3.12.4-env/lib/python3.12/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nix/store/c2d99rarg0lj5lacxghi4xwxfas6w8vf-python3-3.12.4-env/lib/python3.12/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nix/store/c2d99rarg0lj5lacxghi4xwxfas6w8vf-python3-3.12.4-env/lib/python3.12/site-packages/llm/cli.py", line 535, in chat
    for chunk in response:
  File "/nix/store/c2d99rarg0lj5lacxghi4xwxfas6w8vf-python3-3.12.4-env/lib/python3.12/site-packages/llm/models.py", line 169, in __iter__
    for chunk in self.model.execute(
  File "/nix/store/c2d99rarg0lj5lacxghi4xwxfas6w8vf-python3-3.12.4-env/lib/python3.12/site-packages/llm_claude_3.py", line 159, in execute
    "messages": self.build_messages(prompt, conversation),
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nix/store/c2d99rarg0lj5lacxghi4xwxfas6w8vf-python3-3.12.4-env/lib/python3.12/site-packages/llm_claude_3.py", line 110, in build_messages
    if response.attachments:
       ^^^^^^^^^^^^^^^^^^^^
AttributeError: 'Response' object has no attribute 'attachments'
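The crash can be reduced to a minimal sketch (hypothetical class and function names, not the actual llm or llm-claude-3 internals): on a followup turn the plugin walks the prior responses in the conversation and reads an `attachments` attribute that those `Response` objects never had set. A defensive `getattr` with a default would tolerate such responses:

```python
# Minimal sketch of the failure mode; names are illustrative only.
class Response:
    def __init__(self, text):
        self.text = text
        # Deliberately no self.attachments, mirroring the objects that
        # `llm chat` produced in 0.17.


def build_messages(conversation):
    """Serialize prior responses into a message list."""
    messages = []
    for response in conversation:
        # Accessing `response.attachments` directly raises AttributeError
        # when the attribute was never set; getattr with a default does not:
        if getattr(response, "attachments", None):
            messages.append({"role": "user", "content": "<attachment>"})
        messages.append({"role": "assistant", "content": response.text})
    return messages


history = [Response("how are you?")]
messages = build_messages(history)  # succeeds even without .attachments
```

The first turn never hits this path because the conversation history is empty, which is why the error only appears on the followup.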
simonw commented 2 weeks ago

Thanks - this is a bug in LLM core, moving it there.

simonw commented 2 weeks ago

This happens with Claude 3.5 Sonnet and GPT-4o, so I think it's a bug in LLM.

simonw commented 2 weeks ago

I'm not sure why this test didn't catch this:

https://github.com/simonw/llm/blob/122265a3d2131a7140911c2a3d99ad045d61c847/tests/test_chat.py#L20-L31

I've manually tested the fix and confirmed that it works. I'll need to rethink this when I add attachment support to llm chat directly.
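One plausible reason the linked test passed is that a bug like this only surfaces when the fake model inspects attributes on *prior* responses, which requires at least two turns in the same conversation. A hypothetical regression sketch (not the real tests/test_chat.py, and the names here are invented) would look like:

```python
# Hypothetical regression sketch: the fake model must read attachments
# on earlier responses, so the second turn is the interesting one.
class FakeResponse:
    def __init__(self, text):
        self.text = text  # deliberately no .attachments


def fake_execute(prompt, history):
    """Mimics a plugin that serializes prior turns, attachments included."""
    parts = []
    for prev in history:
        # Pre-fix plugin code did `prev.attachments` here and raised
        # AttributeError; the guarded form exercises the same path safely:
        if getattr(prev, "attachments", None):
            parts.append("<attachment>")
        parts.append(prev.text)
    parts.append(prompt)
    return FakeResponse(" | ".join(parts))


history = []
history.append(fake_execute("hello", history))     # first turn: no history, always fine
history.append(fake_execute("followup", history))  # second turn: the 0.17 crash case
assert history[1].text == "hello | followup"
```

A single-exchange test never populates `history`, so the attribute access is never reached.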

simonw commented 2 weeks ago

Manual testing:

% llm chat -m gpt-4o-mini
Chatting with gpt-4o-mini
Type 'exit' or 'quit' to exit
Type '!multi' to enter multiple lines, then '!end' to finish
> A short pun about pelicans
Why don't pelicans ever get invited to parties? Because they always bring their own bag!
> banana slugs
What do banana slugs wear to impress each other? Slug-phy attire! 🍌🍌

And llm logs -c now returns:

2024-11-01T21:18:36 conversation: 01jbmsg3f181dntajvanx0ep5g

Model: gpt-4o-mini

Prompt:

A short pun about pelicans

Response:

Why don't pelicans ever get invited to parties? Because they always bring their own bag!

2024-11-01T21:18:41

Prompt:

banana slugs

Response:

What do banana slugs wear to impress each other? Slug-phy attire! 🍌🍌

maxwelljoslyn commented 2 weeks ago

This seems to be a duplicate of #597 so I'll close that too -- glad to see it fixed, thanks Simon!