run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai

[Bug]: Encountered exception writing response to history: list index out of range #12890

Open Prem-Nitin opened 2 months ago

Prem-Nitin commented 2 months ago

Bug Description

I am using Gemini 1.5 Pro for a RAG application. When I use the context chat engine with stream_chat, it triggers the following error: Encountered exception writing response to history: list index out of range

Version

0.10.18

Steps to Reproduce

Use the context chat engine with stream_chat and Gemini 1.5 Pro, as in the sketch below.
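
A minimal setup along these lines reproduces the failing path (the data directory and model name are placeholders; any documents and embedding model should do):

from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.llms.gemini import Gemini

# Placeholder model name and data directory; embeddings fall back to the
# configured default unless Settings.embed_model is overridden.
llm = Gemini(model="models/gemini-1.5-pro-latest")
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

chat_engine = index.as_chat_engine(chat_mode="context", llm=llm)
streaming_response = chat_engine.stream_chat("Summarize the documents.")
for token in streaming_response.response_gen:
    print(token, end="")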

Relevant Logs/Tracebacks

No response

dosubot[bot] commented 2 months ago

The "list index out of range" error you're encountering is likely due to attempting to access an element in a list that doesn't exist, which in your case, happens when using the context chat engine with stream chat in Gemini 1.5 Pro. This error often occurs if the merged_messages list you're trying to unpack into history and next_msg is empty or contains fewer elements than expected.

To address this, ensure that the messages sequence contains at least one message before attempting to unpack it: the unpack needs one message to send, while the history may legitimately be empty on the first turn. Here's a quick fix you can apply to the chat method:

@llm_chat_callback()
def chat(self, messages: Sequence[ChatMessage], **kwargs: Any) -> ChatResponse:
    merged_messages = merge_neighboring_same_role_messages(messages)
    if not merged_messages:
        # The starred unpack below needs at least one element; an empty
        # history is fine, so only a fully empty list should fail here.
        raise ValueError("Not enough messages to start a chat session.")
    *history, next_msg = map(chat_message_to_gemini, merged_messages)
    chat = self._model.start_chat(history=history)
    response = chat.send_message(next_msg)
    return chat_from_gemini_response(response)

This modification checks whether merged_messages is empty and raises a clear ValueError in that case, instead of letting the unpack fail with the opaque "list index out of range" error.
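
Since your report is about stream_chat rather than chat, the same guard belongs there as well. A sketch, assuming the Gemini stream_chat follows the same merge-and-unpack pattern (the generator body here is illustrative, not the library's exact code):

@llm_chat_callback()
def stream_chat(
    self, messages: Sequence[ChatMessage], **kwargs: Any
) -> ChatResponseGen:
    merged_messages = merge_neighboring_same_role_messages(messages)
    if not merged_messages:
        # Same guard as in chat(): nothing to send to the model.
        raise ValueError("Not enough messages to start a chat session.")
    *history, next_msg = map(chat_message_to_gemini, merged_messages)
    chat = self._model.start_chat(history=history)
    response = chat.send_message(next_msg, stream=True)

    def gen() -> ChatResponseGen:
        content = ""
        for chunk in response:
            # Assumption: a streamed chunk may arrive with no candidates or
            # parts (e.g. blocked by a safety filter), which would also raise
            # "list index out of range" if indexed blindly.
            if not chunk.candidates or not chunk.candidates[0].content.parts:
                continue
            delta = chunk.candidates[0].content.parts[0].text
            content += delta
            yield ChatResponse(
                message=ChatMessage(role=MessageRole.ASSISTANT, content=content),
                delta=delta,
                raw=chunk,
            )

    return gen()

If the IndexError persists after the guard on merged_messages, the per-chunk indexing (candidates[0], parts[0]) is the other likely culprit: Gemini can return empty candidates when its safety filters block a streamed chunk.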

Regarding updates between version 0.10.18 and the latest version 0.10.29, I didn't find specific changes that directly address this issue. Ensure all prerequisites for using the context chat engine and stream chat together in Gemini 1.5 Pro are met, as outlined in the documentation.

If you continue to face issues or have further questions, feel free to ask for more assistance.


#### About Dosu

This response is meant to be useful and save you time. It is **not meant to be a precise solution**, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.