run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

[Bug]: Chat response returns "Empty Response" when no sources are present #16857

Open · kolaente opened 2 weeks ago

kolaente commented 2 weeks ago

Bug Description

Asking the model a question for which no matching content turns up in the document store returns simply "Empty Response" whenever the retrieval engine returns no document nodes. This is not a retrieval error: the index does have documents, I just want the raw model response when none of them match the question. The bug happens with both streaming and non-streaming responses.

The other issues around this topic seem to happen only with other LLMs; I'm using OpenAI.

Version

0.11.21

Steps to Reproduce

  1. Build an index
  2. Create a chat engine with that index
  3. Run chat_engine.chat() with a question whose answer is not in the index

I'll add a minimal code example later.
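In the meantime, here is a minimal sketch of the steps above. It assumes OpenAI defaults (OPENAI_API_KEY set, so OpenAI is the default LLM and embedding model); the similarity cutoff is a hypothetical device, not part of the original report, used to force the retrieval step to return zero nodes:

```python
import os

from llama_index.core import Document, VectorStoreIndex
from llama_index.core.postprocessor import SimilarityPostprocessor

# Assumes OPENAI_API_KEY is set; OpenAI is the default LLM and
# embedding model in llama-index 0.11.x.
assert os.environ.get("OPENAI_API_KEY")

# 1. Build an index over a small, unrelated document.
index = VectorStoreIndex.from_documents(
    [Document(text="LlamaIndex is a data framework for LLM applications.")]
)

# 2. Create a context chat engine over that index. The high similarity
#    cutoff is an assumption used here to force retrieval to return
#    zero nodes.
chat_engine = index.as_chat_engine(
    chat_mode="context",
    node_postprocessors=[SimilarityPostprocessor(similarity_cutoff=0.99)],
)

# 3. Ask a question the index cannot answer. Instead of a raw model
#    reply, the reported behavior is the literal string "Empty Response".
response = chat_engine.chat("What is the capital of France?")
print(response)
```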

Relevant Logs/Tracebacks

No response

logan-markewich commented 2 weeks ago

Yes, if the retrieval step returns zero documents, it will return "Empty Response".

That being said, this shouldn't be an issue for most chat engines, like the context chat engine, or for agents.
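For reference, one possible workaround (a sketch, not something suggested in this thread): check the returned response's source_nodes and fall back to a direct LLM call when retrieval came back empty. `Settings.llm` and `response.source_nodes` are standard llama-index attributes; the fallback logic itself is an assumption. This continues from the repro sketch above and reuses its `chat_engine`:

```python
from llama_index.core import Settings

# Hypothetical fallback: if the chat engine retrieved nothing, bypass
# the engine and query the configured LLM directly.
question = "What is the capital of France?"
response = chat_engine.chat(question)
if not response.source_nodes:
    # Settings.llm is the globally configured LLM (OpenAI by default).
    raw = Settings.llm.complete(question)
    print(raw)
else:
    print(response)
```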

kolaente commented 2 weeks ago

Is there a way to configure or disable this?

I'm using the context chat engine.