yanyi74 opened 1 month ago
The `ValidationError` indicating that `none is not an allowed value` for `template` suggests that a `None` value is being passed where it is not permitted. This typically arises when the `KnowledgeGraphQueryEngine` or its methods expect certain parameters to be provided, and those parameters are either missing or explicitly set to `None`.
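For context, this is the generic pydantic v1 message for a required field that receives `None`; a minimal, self-contained sketch reproducing it (the `Event` model below is purely illustrative and not part of LlamaIndex):

```python
from pydantic import BaseModel  # pydantic v1 API; with pydantic 2 installed, use `from pydantic.v1 import BaseModel`

class Event(BaseModel):
    template: str  # required field, so None is rejected

Event(template=None)
# ValidationError: 1 validation error for Event
# template
#   none is not an allowed value (type=type_error.none.not_allowed)
```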
To address this issue, follow these steps:
1. **Verify Constructor Arguments**: Ensure the `KnowledgeGraphQueryEngine` is instantiated with all required arguments, especially `storage_context`, and that the `storage_context` includes a valid `graph_store`.
2. **Check Method Calls**: Make sure the arguments passed to methods like `generate_query` are not `None`, unless the method's documentation allows it.
3. **Optional Parameters**: If you use optional parameters, confirm they are correctly instantiated or handled when set to `None`.
4. **Debugging**: Add logging inside the `KnowledgeGraphQueryEngine` methods to trace parameter values at runtime and pinpoint where a `None` value is introduced (a sketch follows below).
5. **Review Code for Dynamic Updates**: If your implementation dynamically updates prompts or similar features, ensure these updates do not pass `None` values where they are not expected.
By systematically checking these areas, you should be able to identify and correct the cause of the `ValidationError`. If the issue persists, a minimal reproducible example will help isolate the problem.
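As a concrete illustration of the debugging step, here is a minimal sketch that enables debug logging and inspects the prompt templates the engine has resolved. It assumes `query_engine` is the `KnowledgeGraphQueryEngine` instance from the reproduction below; `get_prompts()` is the generic prompt-inspection hook on LlamaIndex query engines, but the exact key names it returns for this engine are not confirmed here:

```python
import logging

logging.basicConfig(level=logging.DEBUG)  # surface LlamaIndex's internal LLM calls

# List every prompt template the engine resolved; a None value here would
# explain the "none is not an allowed value" error for `template`.
for name, prompt in query_engine.get_prompts().items():
    print(name, "->", type(prompt).__name__ if prompt is not None else "None")
```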
To continue the conversation, mention @dosu.
I have a similar issue, here is the code:
```python
import os
import openai
from llama_index.llms.openai import OpenAI
from llama_index.embeddings.openai import OpenAIEmbedding
from llama_index.core.query_engine import KnowledgeGraphQueryEngine
from llama_index.graph_stores.neo4j import Neo4jGraphStore
from llama_index.core import StorageContext
from llama_index.core import Settings

os.environ["OPENAI_API_KEY"] = "sk-proj-mykey"
openai.api_key = os.environ["OPENAI_API_KEY"]

llm = OpenAI(temperature=0, model="gpt-3.5-turbo")
Settings.llm = llm
Settings.chunk_size = 512

graph_store = Neo4jGraphStore(
    username="neo4j",
    password="mypassword",
    url="bolt://localhost:7687",
    database="neo4j",
)
storage_context = StorageContext.from_defaults(graph_store=graph_store)

query_engine = KnowledgeGraphQueryEngine(
    storage_context=storage_context,
    llm=Settings.llm,
    verbose=True,
)

response = query_engine.query("what is the query?")
print(response)
```
I have the same problem.
I have the same problem.
remove "llm=Settings.llm", only: graph_rag_retriever = KnowledgeGraphRAGRetriever( storage_context=storage_context, verbose=True, )
Same issue here.
I have the same problem.
remove "llm=Settings.llm", only: graph_rag_retriever = KnowledgeGraphRAGRetriever( storage_context=storage_context, verbose=True, )
We are using `KnowledgeGraphQueryEngine` rather than `KnowledgeGraphRAGRetriever`.
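For what it's worth, the analogous change for `KnowledgeGraphQueryEngine` would be to drop the explicit `llm` argument and let the engine fall back to the global `Settings.llm`. An untested sketch based on the code above (whether this actually avoids the `template` error is an assumption, not something confirmed in this thread):

```python
from llama_index.core import Settings
from llama_index.core.query_engine import KnowledgeGraphQueryEngine
from llama_index.llms.openai import OpenAI

Settings.llm = OpenAI(temperature=0, model="gpt-3.5-turbo")  # global default LLM

# No llm=... argument; the engine picks up Settings.llm instead.
query_engine = KnowledgeGraphQueryEngine(
    storage_context=storage_context,  # the same StorageContext holding the Neo4j graph_store
    verbose=True,
)
```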
Bug Description
```
ValidationError: 1 validation error for LLMPredictStartEvent
template
  none is not an allowed value (type=type_error.none.not_allowed)
```
Version
0.10.37
Steps to Reproduce
Relevant Logs/Tracebacks
No response