SatouKuzuma1 opened this issue 3 months ago
Same here. I tried to run the code from this notebook and hit the same serialization problem: https://github.com/langchain-ai/langgraph/blob/main/examples/reflexion/reflexion.ipynb?ref=blog.langchain.dev
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
Cell In[37], line 2
1 example_question = "Why is reflection useful in AI?"
----> 2 initial = first_responder.respond([HumanMessage(content=example_question)])
Cell In[34], line 30, in ResponderWithRetries.respond(self, state)
28 response = []
29 for attempt in range(3):
---> 30 response = self.runnable.invoke(
31 {"messages": state}, {"tags": [f"attempt:{attempt}"]}
32 )
33 try:
34 self.validator.invoke(response)
...
File /opt/conda/lib/python3.10/json/encoder.py:257, in JSONEncoder.iterencode(self, o, _one_shot)
252 else:
253 _iterencode = _make_iterencode(
254 markers, self.default, _encoder, self.indent, floatstr,
255 self.key_separator, self.item_separator, self.sort_keys,
256 self.skipkeys, _one_shot)
--> 257 return _iterencode(o, 0)
TypeError: Object of type ModelMetaclass is not JSON serializable
Packages: langchain 0.1.20, langchain-anthropic 0.1.11, langchain-core 0.1.52, pydantic 2.7.1, pydantic_core 2.18.2
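For context (my own reproduction, not code from the notebook): json.dumps raises this TypeError for any object its encoder doesn't recognize, and a Pydantic model class is itself an instance of ModelMetaclass, which is where the message comes from. A stdlib-only sketch, using a stand-in metaclass so pydantic isn't required:

```python
import json

# Minimal stand-in for pydantic's metaclass, just to reproduce the
# message without installing pydantic. In the real traceback, the
# Pydantic model *class* (not an instance) ends up in the payload.
class ModelMetaclass(type):
    pass

class Reflection(metaclass=ModelMetaclass):
    missing: str
    superfluous: str

try:
    # Passing the class itself, as the LangChain code path does:
    json.dumps([Reflection])
except TypeError as e:
    print(e)  # Object of type ModelMetaclass is not JSON serializable
```

The fix on the library side is to convert the class to a plain dict (e.g. its JSON schema) before it reaches json.dumps.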
macOS as well.
I'm getting this error on macOS too, but I'm using .batch(), not .ainvoke().
langchain==0.2.1 langchain-community==0.2.1 langchain-core==0.2.1 langchain-experimental==0.0.59 langchain-openai==0.1.7 langchain-text-splitters==0.2.0 langsmith==0.1.63 pydantic==2.7.1 pydantic_core==2.18.2
File "/usr/src/app/abc/llm.py", line 146, in baseline_analysis
response: ExampleStructuredResponse = chain.batch([prompt_variables])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/.venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 2537, in batch
inputs = step.batch(
^^^^^^^^^^^
File "/usr/src/app/.venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 4460, in batch
return self.bound.batch(
^^^^^^^^^^^^^^^^^
File "/usr/src/app/.venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 631, in batch
return cast(List[Output], [invoke(inputs[0], configs[0])])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/.venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 627, in invoke
return self.invoke(input, config, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 170, in invoke
self.generate_prompt(
File "/usr/src/app/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 599, in generate_prompt
return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 456, in generate
raise e
File "/usr/src/app/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 446, in generate
self._generate_with_cache(
File "/usr/src/app/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 671, in _generate_with_cache
result = self._generate(
^^^^^^^^^^^^^^^
File "/usr/src/app/.venv/lib/python3.12/site-packages/langchain_experimental/llms/ollama_functions.py", line 304, in _generate
tools=json.dumps(functions, indent=2)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/json/__init__.py", line 238, in dumps
**kw).encode(obj)
^^^^^^^^^^^
File "/usr/lib/python3.12/json/encoder.py", line 202, in encode
chunks = list(chunks)
^^^^^^^^^^^^
File "/usr/lib/python3.12/json/encoder.py", line 430, in _iterencode
yield from _iterencode_list(o, _current_indent_level)
File "/usr/lib/python3.12/json/encoder.py", line 326, in _iterencode_list
yield from chunks
File "/usr/lib/python3.12/json/encoder.py", line 439, in _iterencode
o = _default(o)
^^^^^^^^^^^
File "/usr/lib/python3.12/json/encoder.py", line 180, in default
raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type ModelMetaclass is not JSON serializable
Take a look at #22339, which should have addressed this issue. The PR was approved and merged yesterday, but a release has yet to be cut from it; that should happen in the next few days.
In the meantime, you may try installing langchain-experimental directly from langchain's source like this:
pip install git+https://github.com/langchain-ai/langchain.git\#egg=langchain-experimental\&subdirectory=libs/experimental
I hope this helps.
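If installing from source isn't an option, another workaround (my own sketch, assuming Pydantic v2; ExampleStructuredResponse here is a hypothetical stand-in for the model in llm.py) is to hand the tools machinery a plain JSON-schema dict instead of the model class, since a dict serializes cleanly:

```python
import json

from pydantic import BaseModel  # Pydantic 2.x assumed


class ExampleStructuredResponse(BaseModel):
    """Hypothetical stand-in for the structured-output model."""
    summary: str
    score: int


# json.dumps fails on the class itself, but its JSON schema is a
# plain dict and serializes without trouble.
schema = ExampleStructuredResponse.model_json_schema()
print(json.dumps(schema, indent=2))
```

Whether your OllamaFunctions version accepts a raw schema dict in place of the class depends on the release; the patched code in #22339 does a conversion along these lines internally.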
Example Code
I'm using this simple code and it's returning an error. I checked the documentation, and the example there is similar to what I'm trying to do.
Error Message and Stack Trace (if applicable)
Description
I'm trying to run an example from langgraph using a local Ollama model. The only way I've found to get structured output is through OllamaFunctions, but it throws an error.
System Info
langchain==0.1.17 langchain-community==0.0.37 langchain-core==0.1.52 langchain-experimental==0.0.58 langchain-groq==0.1.3 langchain-openai==0.1.6 langchain-text-splitters==0.0.1 langchainhub==0.1.15
platform: macOS