run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

[Bug]: Input should be a valid dictionary or instance of SentenceSimilarityInputsCheck: received `Context information is below. #12873

Closed · asma-10 closed this issue 2 days ago

asma-10 commented 3 months ago

Bug Description

Hello everyone, I am currently using LlamaIndex to build a RAG pipeline with Hugging Face, and this is the code I am using:

import os

from llama_index.core import (Settings, SimpleDirectoryReader, StorageContext,
                              VectorStoreIndex, load_index_from_storage)
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
from llama_index.llms.huggingface import HuggingFaceInferenceAPI

Settings.llm = HuggingFaceInferenceAPI(
    model_name="sentence-transformers/all-MiniLM-L6-v2", api_key=hf_token)
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-m3")

if not os.path.exists(persis_dir):
    documents = SimpleDirectoryReader("data").load_data()
    index = VectorStoreIndex.from_documents(documents)
    index.storage_context.persist(persist_dir=persis_dir)
else:
    storage_context = StorageContext.from_defaults(persist_dir=persis_dir)
    index = load_index_from_storage(storage_context=storage_context)

Every time I run this code:

query_engine = index.as_query_engine()
response = query_engine.query(question)
print(response)

I get this error:

BadRequestError: (Request ID: iSyblEXWumFZFNfAZERUT) Bad request: Input should be a valid dictionary or instance of SentenceSimilarityInputsCheck: received `Context information is below.

Can someone help me with this?

Version

0.10.29

Steps to Reproduce

# the previous code

Relevant Logs/Tracebacks

No response

dosubot[bot] commented 3 months ago

The error you're encountering suggests that the input to the query_engine.query() method is not in the expected format. This method expects a string or a QueryBundle object as its input. If you're passing something other than these types, that's likely the cause of the error.

To resolve this issue, ensure that the question variable you're passing to query_engine.query() is either a properly formatted string or a QueryBundle object. If question is already one of these types and you're still facing the issue, the problem might lie in how the query is being processed internally.

Here are a few steps you can take to debug further:

  1. Verify the format of the question variable to ensure it matches what the query_engine.query() method expects.
  2. Check the internal processing of the query in the LlamaIndex library, particularly how queries are generated and handled.
  3. Review any custom logic you have around the generation and submission of queries to ensure it aligns with the expected formats and processes.
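Step 1 above can be sketched as a quick fail-fast check. This is a minimal illustration, not library code: the `question` value is a placeholder, and the check simply verifies the input is a non-empty string before it is handed to `query_engine.query()`.

```python
# Placeholder value standing in for whatever `question` holds in your app.
question = "What is this document about?"

# query() expects a plain string (or a QueryBundle); fail fast on anything else
# so a bad input is caught here rather than deep inside the query pipeline.
if not isinstance(question, str) or not question.strip():
    raise TypeError(
        f"expected a non-empty string, got {type(question).__name__}")

print(type(question).__name__)
```

If the check passes and the error still occurs, the input format is unlikely to be the culprit.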

If the issue persists after these checks, it may help to consult the LlamaIndex library's documentation or source code for more insight into how queries should be formatted and processed.
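One hedged observation, based only on the error text quoted in the report: `SentenceSimilarityInputsCheck` suggests the Hugging Face Inference API is serving `sentence-transformers/all-MiniLM-L6-v2` as a sentence-similarity model, so sending it an LLM prompt (which begins "Context information is below") would be rejected. A possible fix sketch is to point `Settings.llm` at a text-generation model instead; the model name below is an illustrative assumption, not a confirmed fix, and `hf_token` is the same variable as in the original snippet.

```python
from llama_index.core import Settings
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
from llama_index.llms.huggingface import HuggingFaceInferenceAPI

# Assumption: use a text-generation/instruct model as the LLM; the embedding
# model stays as before and is only used for retrieval, not generation.
Settings.llm = HuggingFaceInferenceAPI(
    model_name="HuggingFaceH4/zephyr-7b-beta",  # illustrative choice only
    api_key=hf_token,
)
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-m3")
```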

Sources

About Dosu

This response is meant to be useful and save you time. It is **not meant to be a precise solution**, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.