jawa0 / aish3

AI LLM Agent with memories
Apache License 2.0

Create user-facing way to get chat messages for full conversation out of an LLMChatContainer control #47

Open jawa0 opened 1 year ago

jawa0 commented 1 year ago

Let's say you've been having a chat and it's available in an LLMChatContainer. The user currently has no easy way to copy the full text of that chat somewhere else: they would have to copy the TextArea for each turn of the conversation one by one. Provide a way for users to get the full conversation text, in either or both of the following forms (see the sketch after this list):

  1. in a JSON object in OpenAI's "messages" format.
  2. as just plain text with separators between system, user, and agent messages.
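A minimal sketch of what such an accessor could look like, assuming LLMChatContainer keeps its turns as (role, TextArea) pairs. The attribute and method names below (chat_turns, get_text) are hypothetical and would need to be adapted to the real control:

```python
import json


class LLMChatContainer:
    # ... existing control code ...

    def get_messages(self) -> list[dict]:
        """Return the conversation as OpenAI-style "messages":
        a list of {"role": ..., "content": ...} dicts."""
        return [
            {"role": role, "content": text_area.get_text()}
            for role, text_area in self.chat_turns  # hypothetical storage of turns
        ]

    def get_messages_json(self) -> str:
        """Return the conversation serialized as a JSON string (option 1)."""
        return json.dumps(self.get_messages(), indent=2)

    def get_conversation_text(self, separator: str = "\n---\n") -> str:
        """Return the conversation as plain text (option 2), each turn prefixed
        by its role (system / user / assistant) and joined by a separator."""
        return separator.join(
            f"[{m['role']}]\n{m['content']}" for m in self.get_messages()
        )
```

Either method could then back a "Copy conversation" action in the UI, e.g. writing get_messages_json() or get_conversation_text() to the clipboard.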