Closed NightRang3r closed 2 months ago
Hey, @NightRang3r! I'm here to assist you with any bugs, questions, or contributions. Let's tackle this issue together while we wait for a human maintainer.
To resolve the "Unsupported message type: <class 'dict'>" error when building the Ollama component in Langflow, ensure that the `metadata` parameter, which is expected to be a dictionary, is correctly handled. The error suggests that a dictionary is being passed where it is not expected.
Here is the relevant part of the code:
```python
llm_params = {
    "base_url": self.base_url,
    "model": self.model,
    "mirostat": mirostat_value,
    "format": self.format,
    "metadata": self.metadata,  # Ensure this is a dictionary
    "tags": self.tags.split(",") if self.tags else None,
    "mirostat_eta": mirostat_eta,
    "mirostat_tau": mirostat_tau,
    "num_ctx": self.num_ctx or None,
    "num_gpu": self.num_gpu or None,
    "num_thread": self.num_thread or None,
    "repeat_last_n": self.repeat_last_n or None,
    "repeat_penalty": self.repeat_penalty or None,
    "temperature": self.temperature or None,
    "stop": self.stop_tokens.split(",") if self.stop_tokens else None,
    "system": self.system,
    "template": self.template,
    "tfs_z": self.tfs_z or None,
    "timeout": self.timeout or None,
    "top_k": self.top_k or None,
    "top_p": self.top_p or None,
    "verbose": self.verbose,
}

# Remove parameters with None values
llm_params = {k: v for k, v in llm_params.items() if v is not None}

try:
    output = ChatOllama(**llm_params)  # Ensure ChatOllama can handle the metadata dictionary
except Exception as e:
    raise ValueError("Could not initialize Ollama LLM.") from e
```
Ensure that the `metadata` parameter is correctly formatted as a dictionary before passing it to `ChatOllama`. If `ChatOllama` does not support a dictionary for `metadata`, you may need to modify the `ChatOllama` class or adjust how `metadata` is being used.
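One way to act on that advice is a small guard that validates (or unwraps) the value before it goes into `llm_params`. This is a hypothetical helper, not part of Langflow; the `ensure_dict` name and the `.data` unwrapping are assumptions for the sketch:

```python
def ensure_dict(value):
    """Pass None and dicts through unchanged; unwrap an object that
    carries its payload in a `.data` dict (an assumption about some
    Langflow schema objects); otherwise fail loudly with the real type."""
    if value is None or isinstance(value, dict):
        return value
    data = getattr(value, "data", None)
    if isinstance(data, dict):
        return data
    raise TypeError(f"metadata must be a dict, got {type(value).__name__}")
```

It could then be used as `llm_params["metadata"] = ensure_dict(self.metadata)` so a bad value surfaces with a clear message instead of failing deep inside the model build.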
Additionally, the `metadata` dictionary should have the following structure:

```python
"metadata": {
    "display_name": "Metadata",
    "info": "Metadata to add to the run trace.",
    "advanced": True,
}
```
Make sure that the `ChatOllama` class is designed to handle this structure appropriately [1][2][3][4].
To continue talking to Dosu, mention @dosu.
The same problem!
I am having a similar issue.
I have noticed, though, that it started when I added new Ollama components; copying and pasting ones that were already working in other workflows still works. This happened suddenly the other day while I was working on a workflow, without my shutting anything down, restarting, or updating.
New Ollama components throw an error; old ones still work fine.
Even if I place a broken version of the Ollama component and replace its code with a copy from one of the working ones, it still shows the same error.
And I'm not sure if it's related or not, but I can't use the Prompt component with the Ollama one either. It throws the error: Error Building Component Error building Component Ollama: Unsupported message type: <class 'dict'>
Edit: It looks like the issue is with the Prompt component; even when it's connected to an OpenAI model component I get the same error message: Unsupported message type: <class 'dict'>
It seems to be (partially) working now. I created a Docker image based on Python 3.12, and the Ollama component builds successfully. However, when I add a Prompt and connect the node to the Ollama model, I get the same error.
DockerFile:

```dockerfile
# docker build -f DockerFile . -t nightrang3r/langflow:1.0.13
# docker run -d --restart always --add-host=host.docker.internal:host-gateway -p 7860:7860 -it --name langflow nightrang3r/langflow:1.0.13
FROM python:3.12.4-slim

RUN apt-get update && apt-get install -y --no-install-recommends build-essential net-tools git nano wget curl iputils-ping
RUN pip install --upgrade pip
RUN pip install langflow==1.0.13
RUN pip install redis

ENV LANGFLOW_HOST=0.0.0.0
ENV DO_NOT_TRACK=true

EXPOSE 7860
CMD ["langflow", "run"]
```
Detailed Error:
```text
Error building Component Ollama:
Unsupported message type: <class 'dict'>
Traceback (most recent call last):
  File "/usr/local/lib/python3.12/site-packages/langflow/graph/vertex/base.py", line 658, in _build_results
    result = await loading.get_instance_results(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langflow/interface/initialize/loading.py", line 60, in get_instance_results
    return await build_component(params=custom_params, custom_component=custom_component)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langflow/interface/initialize/loading.py", line 147, in build_component
    build_results, artifacts = await custom_component.build_results()
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langflow/custom/custom_component/component.py", line 140, in build_results
    return await self._build_with_tracing()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langflow/custom/custom_component/component.py", line 130, in _build_with_tracing
    _results, _artifacts = await self._build_results()
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langflow/custom/custom_component/component.py", line 158, in _build_results
    result = method()
             ^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langflow/base/models/model.py", line 57, in text_response
    result = self.get_chat_result(output, stream, input_value, system_message)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langflow/base/models/model.py", line 152, in get_chat_result
    prompt = input_value.load_lc_prompt()
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langflow/schema/message.py", line 174, in load_lc_prompt
    loaded_prompt = load(self.prompt)
                    ^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langchain_core/_api/beta_decorator.py", line 110, in warning_emitting_wrapper
    return wrapped(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langchain_core/load/load.py", line 195, in load
    return _load(obj)
           ^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langchain_core/load/load.py", line 190, in _load
    return reviver(loaded_obj)
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langchain_core/load/load.py", line 126, in __call__
    return cls(**kwargs)
           ^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langchain_core/prompts/chat.py", line 984, in __init__
    _convert_to_message(message, template_format) for message in messages
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langchain_core/prompts/chat.py", line 1456, in _convert_to_message
    raise NotImplementedError(f"Unsupported message type: {type(message)}")
NotImplementedError: Unsupported message type: <class 'dict'>
```
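For context on why the last frame raises: `_convert_to_message` dispatches on the Python type of each message, and in the langchain-core version shipped here a plain `dict` falls through to the error branch. A simplified, stand-alone sketch of that kind of dispatch (not langchain's actual code) is:

```python
def convert_to_message(message):
    """Simplified type dispatch resembling langchain_core.prompts.chat.
    Strings become a human message, 2-tuples pass through as (role, text),
    and anything else, including a dict, is rejected."""
    if isinstance(message, str):
        return ("human", message)
    if isinstance(message, tuple) and len(message) == 2:
        return message
    raise NotImplementedError(f"Unsupported message type: {type(message)}")
```

Feeding a serialized prompt whose messages are raw dicts, e.g. `{"type": "human", "content": "hi"}`, into such a dispatcher reproduces the `Unsupported message type: <class 'dict'>` error seen above, which is consistent with a mismatch between the langflow and langchain-core serialization formats.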
It occurs with OpenAI plus a Prompt too. It worked in 1.0.7.
The installation with plain Python is fine, though. Although a lot of errors are reported, at least it can connect to Ollama. It seems to be a problem with the Docker image.
Hey all.
This is related to a version mismatch. Please follow #3022
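To check for that kind of mismatch locally, one hedged sketch is to read installed distribution versions with the standard library's `importlib.metadata` (the package names in the example are assumptions; compare against whatever versions the linked issue identifies as compatible):

```python
from importlib.metadata import version, PackageNotFoundError

def installed_versions(packages):
    """Map each distribution name to its installed version string,
    or None when the distribution is not installed."""
    result = {}
    for pkg in packages:
        try:
            result[pkg] = version(pkg)
        except PackageNotFoundError:
            result[pkg] = None
    return result

# Example: inspect the pinned langflow alongside its langchain-core dependency.
print(installed_versions(["langflow", "langchain-core"]))
```

Running this inside the container and in a working plain-Python install would show whether the Docker image pulled a different langchain-core than the one langflow 1.0.13 was built against.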
I think it's a network problem with the langflow container.
Bug Description
When trying to build the Ollama component I get the following error:
Error Building Component Error building Component Ollama: Unsupported message type: <class 'dict'>
Ollama version: 0.3.0
Langflow version: 1.0.13
Python version: 3.12
Reproduction
Detailed Error:
Expected behavior
Component should be built without errors
Who can help?
No response
Operating System
Debian Linux 12
Langflow Version
1.0.13
Python Version
None
Screenshot
Flow File
No response