ishandhanani / forky

A git-style way of managing LLM chats

Messages get cut off (too small max_tokens?) #2

Open dpaleka opened 4 weeks ago

dpaleka commented 4 weeks ago

Here is an example of what happens:

$ python -m cli.main chat
Enter your message (type 'quit' to exit, '/status' for conversation state, '/fork' to create a fork, '/merge' to merge a fork, '/visualize' to see the conversation tree, '/history' to view full conversation history):
You: give me a long message
Claude: Here is a longer message for you:

The cosmos stretches out before us, an infinite expanse of stars, galax
ishandhanani commented 2 weeks ago

Hey @dpaleka - that is correct. I did not add a flag for controlling max_tokens. If you add one, I'm happy to merge it in.
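
A minimal sketch of what such a flag might look like, assuming the CLI parses arguments with argparse (the option name, default value, and helper function here are illustrative, not taken from the forky codebase):

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Hypothetical sketch: forky does not currently expose max_tokens;
    # this shows one way a --max-tokens flag could be wired in.
    parser = argparse.ArgumentParser(prog="cli.main")
    sub = parser.add_subparsers(dest="command")
    chat = sub.add_parser("chat", help="start an interactive chat")
    chat.add_argument(
        "--max-tokens",
        type=int,
        default=4096,  # assumed default; use whatever the model supports
        help="maximum number of tokens in each model response",
    )
    return parser

def make_request_kwargs(args: argparse.Namespace) -> dict:
    # The parsed value would then be forwarded to the API call, e.g.
    # client.messages.create(..., max_tokens=args.max_tokens) with the
    # Anthropic SDK; returned as plain kwargs to keep the sketch offline.
    return {"max_tokens": args.max_tokens}

if __name__ == "__main__":
    args = build_parser().parse_args(["chat", "--max-tokens", "8192"])
    print(make_request_kwargs(args))
```

With a change along these lines, `python -m cli.main chat --max-tokens 8192` would let users raise the response limit instead of hitting the hard-coded cutoff shown above.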