Closed: flefevre closed this issue 3 days ago
The error you're encountering, "Edge between Prompt and Ollama has invalid handles," suggests a compatibility issue between the input and output types of the connected nodes in your flow. The `validate_handles` method in the `Edge` class is responsible for ensuring that the data types between connected nodes are compatible; if they are not, it raises a `ValueError` [1].

To resolve this, check the input and output types of the 'Prompt' and 'OllamaModel' nodes. The 'Prompt' node has a `Text` input named "template" and a `Text` output [2]. The 'OllamaModel' node has several inputs, including a `MessageInput` for `input_value`, and outputs such as `text_output` and `model_output` [3].

Ensure that the output from the 'Prompt' node is connected to a compatible input on the 'OllamaModel' node. If the types do not match, you may need to adjust the flow or use a node that converts or adapts the data types accordingly.
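To illustrate the kind of check described above, here is a minimal sketch of handle validation between two nodes. This is a simplified, hypothetical model, not Langflow's actual `Edge` implementation: the class shape, field names, and type strings are assumptions made for illustration only.

```python
# Hypothetical sketch of edge handle validation, loosely modeled on the
# behavior described for Langflow's Edge.validate_handles. All names and
# type strings here are illustrative assumptions, not the real API.

class Edge:
    def __init__(self, source_types, target_types):
        # Output types offered by the source node's handle
        self.source_types = set(source_types)
        # Input types accepted by the target node's handle
        self.target_types = set(target_types)

    def validate_handles(self):
        # The edge is valid only if at least one output type of the source
        # is among the types the target input accepts; otherwise raise.
        if not (self.source_types & self.target_types):
            raise ValueError(
                f"Edge has invalid handles: {sorted(self.source_types)} "
                f"is not compatible with {sorted(self.target_types)}"
            )

# A 'Prompt'-style node emitting Text, wired to an input accepting Text:
ok = Edge(source_types=["Text"], target_types=["Message", "Text"])
ok.validate_handles()  # passes silently

# An incompatible connection triggers the error seen in the issue title:
bad = Edge(source_types=["Text"], target_types=["Embeddings"])
try:
    bad.validate_handles()
except ValueError as exc:
    print("invalid edge:", exc)
```

Under this model, fixing the flow means either connecting the Prompt output to an input whose accepted types include `Text`, or inserting a converter node so the type sets overlap.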
This is the simplest possible workflow and it does not work. It is very frustrating. It seems more related to a front-end problem; do you agree, @lucaseduoli?
It has been solved in the nightly build; see https://github.com/langflow-ai/langflow/issues/4398#issuecomment-2490201406
Bug Description
When creating a basic Ollama flow from the basic template, I get no answer. When clicking on Playground, it just shows "Flow running...". My Ollama instance is working and accessible, verified with the Open WebUI front end.

Looking at the Langflow log:
...
Reproduction
Just click on Playground.
Expected behavior
The playground should be accessible
Who can help?
No response
Operating System
Ubuntu 22.04
Langflow Version
1.1.0.dev5
Python Version
None
Screenshot
Flow File
Basic Prompting.json