Open alex-ge123 opened 5 days ago
To resolve the `{"detail":"Error running graph: No result found for vertex AzureOpenAIModel-jC1hz"}` error, ensure that all required inputs for the `AzureChatOpenAIComponent` are correctly provided. Specifically, check that the `model`, `azure_endpoint`, `azure_deployment`, `api_version`, and `api_key` fields are correctly set. Here is an example of how to instantiate the `AzureChatOpenAIComponent` with the necessary inputs:
```python
from langflow.components.models.AzureOpenAIModel import AzureChatOpenAIComponent

# Example instantiation with required inputs
azure_openai_component = AzureChatOpenAIComponent(
    model="gpt-4",  # Ensure this is one of the valid models
    azure_endpoint="https://example-resource.azure.openai.com/",
    azure_deployment="your-deployment-name",
    api_version="2023-09-01-preview",  # Ensure this is one of the valid API versions
    api_key="your-secret-api-key",
    temperature=0.7,
    max_tokens=100,
    stream=False
)

# Call the model_response method to get the output
try:
    output = azure_openai_component.model_response()
    print(output)
except ValueError as e:
    print(f"Error: {e}")
```
Ensure that the `model`, `azure_endpoint`, `azure_deployment`, `api_version`, and `api_key` fields are correctly set to avoid the error. If some conversations are not required for a given run, you may need to handle those cases separately so that the component is not invoked without the necessary inputs.
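One way to handle those optional branches is to gate the component on its inputs before invoking it. The sketch below is a minimal illustration, not Langflow's actual API: the `has_required_inputs` helper and the `config` dict are hypothetical names introduced here for the example.

```python
# Hypothetical guard: skip building the Azure model when its required
# inputs are missing, instead of letting the graph fail mid-run.
REQUIRED_FIELDS = ("model", "azure_endpoint", "azure_deployment", "api_version", "api_key")

def has_required_inputs(config: dict) -> bool:
    """Return True only when every required Azure OpenAI field is non-empty."""
    return all(config.get(field) for field in REQUIRED_FIELDS)

config = {
    "model": "gpt-4",
    "azure_endpoint": "https://example-resource.azure.openai.com/",
    "azure_deployment": "your-deployment-name",
    "api_version": "2023-09-01-preview",
    "api_key": "",  # deliberately empty: this branch of the flow is not needed
}

if has_required_inputs(config):
    print("invoking AzureChatOpenAIComponent")
else:
    print("skipping AzureChatOpenAIComponent: incomplete inputs")
```

With a guard like this, a branch of the flow that is intentionally unused simply skips the model call rather than surfacing a "No result found for vertex" error.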
Additionally, if you are working with a graph structure, make sure that all vertices are correctly defined and accessible. The error might occur if a vertex is missing or not properly connected in the graph. Here is a snippet from the base.py file that shows how vertices are processed and checked:
```python
if not isinstance(inputs.get(INPUT_FIELD_NAME, ""), str):
    raise ValueError(f"Invalid input value: {inputs.get(INPUT_FIELD_NAME)}. Expected string")
if inputs:
    for vertex_id in self._is_input_vertices:
        vertex = self.get_vertex(vertex_id)
        if input_components and (
            vertex_id not in input_components or vertex.display_name not in input_components
        ):
            continue
        elif input_type is not None and input_type != "any" and input_type not in vertex.id.lower():
            continue
        if vertex is None:
            raise ValueError(f"Vertex {vertex_id} not found")
        vertex.update_raw_params(inputs, overwrite=True)
for vertex_id in self._has_session_id_vertices:
    vertex = self.get_vertex(vertex_id)
    if vertex is None:
        raise ValueError(f"Vertex {vertex_id} not found")
    vertex.update_raw_params({"session_id": session_id})
try:
    start_component_id = next(
        (vertex_id for vertex_id in self._is_input_vertices if "chat" in vertex_id.lower()), None
    )
    await self.process(start_component_id=start_component_id)
    self.increment_run_count()
except Exception as exc:
    logger.exception(exc)
    raise ValueError(f"Error running graph: {exc}") from exc
```
Ensure that all vertices are correctly defined and that the graph is properly constructed to avoid missing vertex errors [1][2].
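To check the "all vertices are correctly defined" condition up front, you can run a small validation pass over the graph before executing it. The snippet below uses a deliberately simplified representation (vertex IDs as a set, edges as source/target pairs); a real Langflow graph has richer objects, and `find_missing_vertices` is a hypothetical helper, but the missing-vertex check is the same idea.

```python
# Hypothetical validation pass: report edge endpoints that were never
# defined as vertices, before the graph is run.
def find_missing_vertices(vertices: set, edges: list) -> set:
    """Return the set of edge endpoints that are not defined as vertices."""
    referenced = {endpoint for edge in edges for endpoint in edge}
    return referenced - vertices

vertices = {"ChatInput-abc12", "AzureOpenAIModel-jC1hz"}
edges = [
    ("ChatInput-abc12", "AzureOpenAIModel-jC1hz"),
    ("AzureOpenAIModel-jC1hz", "ChatOutput-xyz99"),  # target was never defined
]

missing = find_missing_vertices(vertices, edges)
print(missing)  # {'ChatOutput-xyz99'}
```

Running a check like this when the flow is loaded surfaces a broken connection immediately, instead of failing later with "No result found for vertex ..." at request time.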
When a process has multiple AzureOpenAI conversations and some of the AzureOpenAI components are not required to execute for process reasons, calling the API request returns `{"detail":"Error running graph: No result found for vertex AzureOpenAIModel-jC1hz"}`. Is there a good way to solve this?