derdide / telegram_multi_llm_bot

ChatBot to allow prompting of multiple LLMs in parallel, among other things
MIT License

Conversation history - retrieve past #5

Open derdide opened 1 month ago

derdide commented 1 month ago

As of now, the code is built in a way that it stores the conversation in a DB on the server. This is fine, but not useful at this stage, since nothing reads it back. The first thing to do would be to add the possibility to retrieve a past conversation and display it.
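A minimal sketch of what "retrieve and display" could look like as a /history command. This assumes the bot's database is SQLite and that save_to_database writes to a table named conversations with (user_id, timestamp, model, prompt, response) columns; the actual schema and DB path are hypothetical and would need to match the existing code:

```python
# Hypothetical /history command: read a user's recent interactions
# back from the DB and reply with a readable transcript.
import sqlite3

DB_PATH = "bot.db"  # hypothetical path; use the bot's actual DB file

def fetch_history(db_path, user_id, limit=5):
    """Return the user's most recent interactions, oldest first."""
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute(
            "SELECT timestamp, model, prompt, response "
            "FROM conversations WHERE user_id = ? "
            "ORDER BY timestamp DESC LIMIT ?",
            (user_id, limit),
        ).fetchall()
    return list(reversed(rows))  # chronological order for display

def format_history(rows):
    """Render DB rows as a plain-text transcript for a Telegram reply."""
    if not rows:
        return "No past conversations found."
    parts = []
    for ts, model, prompt, response in rows:
        parts.append(f"[{ts}] ({model})\nYou: {prompt}\nBot: {response}")
    return "\n\n".join(parts)

async def history_command(update, context):
    user_id = update.effective_user.id
    rows = fetch_history(DB_PATH, user_id)
    await update.message.reply_text(format_history(rows))

# Registration, alongside the existing handlers:
# application.add_handler(CommandHandler("history", history_command))
```

Telegram formats long transcripts poorly, so a next step could be paginating the output or truncating each response to a preview line.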

derdide commented 3 weeks ago

Some elements, starting with a comparison against the "official" chat systems:

  1. Conversation Context:

    • This Chat: Maintains full context within a single conversation.
    • Your Bot: Does not inherently maintain conversation context between messages. Each command (/gpt, /claude, /compare) is treated as a separate interaction.
  2. State Management:

    • This Chat: Automatically manages state and context for the entire conversation.
    • Your Bot: Uses context.user_data for some state management, primarily for storing the chat mode. However, it doesn't store conversation history.
  3. New Conversations:

    • This Chat: Allows starting a new chat for a fresh context.
    • Your Bot: Each command essentially starts a fresh context, similar to a new chat here.
  4. Persistence:

    • This Chat: Conversation history persists throughout the session.
    • Your Bot: Stores interactions in a database (save_to_database function) but doesn't use this for maintaining context in future interactions.
  5. User Identification:

    • This Chat: Manages separate conversations for each user automatically.
    • Your Bot: Identifies users (via update.effective_user.id) but doesn't use this for maintaining separate ongoing conversations.

To make your bot's conversation handling more similar to this chat system, you could consider the following enhancements:

from collections import defaultdict

from telegram import Update
from telegram.ext import CommandHandler, ContextTypes

# Global conversation store, keyed by user ID
conversations = defaultdict(list)

async def process_message(update: Update, context: ContextTypes.DEFAULT_TYPE, model_request, model_name, image_content=None):
    chat_id = update.effective_chat.id
    user_id = update.effective_user.id
    user_message = update.message.text
    mode = context.user_data.get('mode')

    # Retrieve conversation history
    conversation_history = conversations[user_id][-5:]  # Last 5 interactions

    # Prepare context for AI model
    full_context = "\n".join([f"Human: {msg['human']}\nAI: {msg['ai']}" for msg in conversation_history])
    full_context += f"\nHuman: {user_message}\nAI:"

    # Get response from AI model with full context
    response = await model_request(full_context, image_content, mode)

    # Update conversation history
    conversations[user_id].append({'human': user_message, 'ai': response})

    # Rest of the function remains the same...

async def clear_context_command(update: Update, context: ContextTypes.DEFAULT_TYPE):
    user_id = update.effective_user.id
    conversations[user_id].clear()
    await update.message.reply_text("Conversation context has been cleared. Starting fresh!")

# Add this to your command handlers
application.add_handler(CommandHandler("clear_context", clear_context_command))

Key changes and additions:

  1. Conversation Storage: A global conversations dictionary to store conversation history for each user.

  2. Context Retrieval: The process_message function now retrieves recent conversation history.

  3. Context Integration: The AI model receives the conversation history along with the new message, allowing for contextual responses.

  4. History Update: After each interaction, the conversation history is updated.

  5. Clear Context Command: A new /clear_context command allows users to start a fresh conversation, similar to starting a new chat in this system.

  6. Limited History: The example limits the context to the last 5 interactions to manage token usage, but you can adjust this as needed.
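On point 6, the fixed "last 5 interactions" window could instead be a rough token budget, so long exchanges do not blow past the model's context limit. The sketch below is a heuristic under an assumed ~4 characters per token; a real tokenizer (e.g. tiktoken) would be more accurate:

```python
def trim_history(history, max_tokens=1000):
    """Keep the most recent interactions whose combined size fits an
    approximate token budget, preserving chronological order.

    Each item in `history` is a dict like {'human': ..., 'ai': ...},
    matching the conversations store above. The 4-chars-per-token
    estimate is an assumption, not a real tokenizer.
    """
    kept = []
    used = 0
    for turn in reversed(history):  # walk newest-first
        cost = (len(turn['human']) + len(turn['ai'])) // 4 + 1
        if used + cost > max_tokens:
            break
        kept.append(turn)
        used += cost
    return list(reversed(kept))  # restore oldest-first order
```

In process_message, `conversations[user_id][-5:]` would then become `trim_history(conversations[user_id])`, with the budget tuned per model.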

Implementing these changes would make your bot's conversation handling much closer to this chat system. They would significantly improve the bot's ability to maintain context and provide coherent, ongoing conversations; however, they would also increase complexity and potentially API usage costs.