langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com

OllamaFunctions returning TypeError when using with_structured_output #21422

SatouKuzuma1 opened this issue 3 months ago · Status: Open

SatouKuzuma1 commented 3 months ago

Example Code

I'm using this simple code and it's returning an error. I checked the documentation, and the example there is very similar to what I'm doing.


from typing import List

from langchain_core.pydantic_v1 import BaseModel, Field  # notebook-era import; plain pydantic reproduces the same error
from langchain_experimental.llms.ollama_functions import OllamaFunctions


class RelatedSubjects(BaseModel):
    topics: List[str] = Field(
        description="Comprehensive list of related subjects as background research.",
    )


# gen_related_topics_prompt and example_topic are defined earlier in the
# langgraph example this is taken from; any prompt with a {topic} variable will do.
ollama_functions_llm = OllamaFunctions(model="llama3", format="json")
expand_chain = gen_related_topics_prompt | ollama_functions_llm.with_structured_output(
    RelatedSubjects
)

related_subjects = await expand_chain.ainvoke({"topic": example_topic})
related_subjects

Error Message and Stack Trace (if applicable)

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[537], line 1
----> 1 related_subjects = await expand_chain.ainvoke({"topic": example_topic})
      2 related_subjects

File ~/Desktop/python/lang-last/.venv/lib/python3.11/site-packages/langchain_core/runnables/base.py:2536, in RunnableSequence.ainvoke(self, input, config, **kwargs)
   2534 try:
   2535     for i, step in enumerate(self.steps):
-> 2536         input = await step.ainvoke(
   2537             input,
   2538             # mark each step as a child run
   2539             patch_config(
   2540                 config, callbacks=run_manager.get_child(f"seq:step:{i+1}")
   2541             ),
   2542         )
   2543 # finish the root run
   2544 except BaseException as e:

File ~/Desktop/python/lang-last/.venv/lib/python3.11/site-packages/langchain_core/runnables/base.py:4537, in RunnableBindingBase.ainvoke(self, input, config, **kwargs)
   4531 async def ainvoke(
   4532     self,
   4533     input: Input,
   4534     config: Optional[RunnableConfig] = None,
   4535     **kwargs: Optional[Any],
...
    179     """
--> 180     raise TypeError(f'Object of type {o.__class__.__name__} '
    181                     f'is not JSON serializable')

TypeError: Object of type ModelMetaclass is not JSON serializable
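
For anyone puzzled by the message: every Pydantic model class (v1 or v2) is an instance of a metaclass named ModelMetaclass, so this error means the class object itself, rather than an instance or a schema dict, ended up inside json.dumps. A minimal sketch outside LangChain (the model M is just for illustration):

import json
from pydantic import BaseModel

class M(BaseModel):
    x: int

print(type(M).__name__)        # 'ModelMetaclass' in both Pydantic v1 and v2
print(json.dumps(M.schema()))  # fine: the JSON schema is a plain dict
                               # (M.model_json_schema() on Pydantic v2)
json.dumps(M)                  # TypeError: Object of type ModelMetaclass is not JSON serializable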

Description

I'm trying to run an example from langgraph using a local Ollama model. The only way I've found to get structured output from it is OllamaFunctions, but it throws the error above.

System Info

langchain==0.1.17
langchain-community==0.0.37
langchain-core==0.1.52
langchain-experimental==0.0.58
langchain-groq==0.1.3
langchain-openai==0.1.6
langchain-text-splitters==0.0.1
langchainhub==0.1.15

platform: macOS

dendrebeden commented 3 months ago

Same here. I tried to run the code from this notebook and hit the same serialization problem: https://github.com/langchain-ai/langgraph/blob/main/examples/reflexion/reflexion.ipynb?ref=blog.langchain.dev


---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[37], line 2
      1 example_question = "Why is reflection useful in AI?"
----> 2 initial = first_responder.respond([HumanMessage(content=example_question)])

Cell In[34], line 30, in ResponderWithRetries.respond(self, state)
     28 response = []
     29 for attempt in range(3):
---> 30     response = self.runnable.invoke(
     31         {"messages": state}, {"tags": [f"attempt:{attempt}"]}
     32     )
     33     try:
     34         self.validator.invoke(response)

...

File /opt/conda/lib/python3.10/json/encoder.py:257, in JSONEncoder.iterencode(self, o, _one_shot)
    252 else:
    253     _iterencode = _make_iterencode(
    254         markers, self.default, _encoder, self.indent, floatstr,
    255         self.key_separator, self.item_separator, self.sort_keys,
    256         self.skipkeys, _one_shot)
--> 257 return _iterencode(o, 0)

TypeError: Object of type ModelMetaclass is not JSON serializable

Packages:
langchain 0.1.20
langchain-anthropic 0.1.11
langchain-core 0.1.52
pydantic 2.7.1
pydantic_core 2.18.2

macOS as well.
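
For context, the respond method in those frames is the notebook's retry wrapper around the runnable. Reconstructed roughly from the traceback (simplified, not the notebook's exact code), it looks like this:

from langchain_core.messages import HumanMessage

class ResponderWithRetries:
    def __init__(self, runnable, validator):
        self.runnable = runnable
        self.validator = validator

    def respond(self, state: list):
        response = []
        for attempt in range(3):
            # line 30 in the traceback: this invoke never returns, because the
            # model wrapper fails while JSON-serializing the bound schema
            response = self.runnable.invoke(
                {"messages": state}, {"tags": [f"attempt:{attempt}"]}
            )
            try:
                self.validator.invoke(response)  # line 34: validate the tool call
                return response
            except Exception as e:
                # feed the validation error back to the model and retry
                state = state + [response, HumanMessage(content=repr(e))]
        return response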

JonZeolla commented 3 months ago

I'm getting this error on macOS as well, but with .batch() rather than .ainvoke().

langchain==0.2.1
langchain-community==0.2.1
langchain-core==0.2.1
langchain-experimental==0.0.59
langchain-openai==0.1.7
langchain-text-splitters==0.2.0
langsmith==0.1.63
pydantic==2.7.1
pydantic_core==2.18.2

  File "/usr/src/app/abc/llm.py", line 146, in baseline_analysis
    response: ExampleStructuredResponse = chain.batch([prompt_variables])
                                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 File "/usr/src/app/.venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 2537, in batch
    inputs = step.batch(
             ^^^^^^^^^^^
  File "/usr/src/app/.venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 4460, in batch
    return self.bound.batch(
           ^^^^^^^^^^^^^^^^^
  File "/usr/src/app/.venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 631, in batch
    return cast(List[Output], [invoke(inputs[0], configs[0])])
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/app/.venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 627, in invoke
    return self.invoke(input, config, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/app/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 170, in invoke
    self.generate_prompt(
  File "/usr/src/app/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 599, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/app/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 456, in generate
    raise e
  File "/usr/src/app/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 446, in generate
    self._generate_with_cache(
  File "/usr/src/app/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 671, in _generate_with_cache
    result = self._generate(
             ^^^^^^^^^^^^^^^
  File "/usr/src/app/.venv/lib/python3.12/site-packages/langchain_experimental/llms/ollama_functions.py", line 304, in _generate
    tools=json.dumps(functions, indent=2)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.12/json/__init__.py", line 238, in dumps
    **kw).encode(obj)
          ^^^^^^^^^^^
  File "/usr/lib/python3.12/json/encoder.py", line 202, in encode
    chunks = list(chunks)
             ^^^^^^^^^^^^
  File "/usr/lib/python3.12/json/encoder.py", line 430, in _iterencode
    yield from _iterencode_list(o, _current_indent_level)
  File "/usr/lib/python3.12/json/encoder.py", line 326, in _iterencode_list
    yield from chunks
  File "/usr/lib/python3.12/json/encoder.py", line 439, in _iterencode
    o = _default(o)
        ^^^^^^^^^^^
  File "/usr/lib/python3.12/json/encoder.py", line 180, in default
    raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type ModelMetaclass is not JSON serializable
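
The bottom of this traceback pinpoints the bug: ollama_functions.py (line 304 in langchain-experimental 0.0.59) passes its functions list straight into json.dumps(functions, indent=2), and when with_structured_output has put the raw Pydantic class into that list, the encoder has nothing it can serialize. The merged fix presumably normalizes each entry to a plain dict first; a rough sketch of that idea, not the actual patch:

import json
from typing import Any, List

from langchain_core.pydantic_v1 import BaseModel, Field
from langchain_core.utils.function_calling import convert_to_openai_function

class RelatedSubjects(BaseModel):
    topics: List[str] = Field(description="Related subjects.")

def normalize(fn: Any) -> dict:
    # Dicts pass through untouched; Pydantic classes (and callables) are
    # converted to an OpenAI-style {"name", "description", "parameters"} dict.
    return fn if isinstance(fn, dict) else convert_to_openai_function(fn)

functions = [normalize(RelatedSubjects)]
print(json.dumps(functions, indent=2))  # every entry is now a serializable dict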

lalanikarim commented 2 months ago

Take a look at #22339, which should address this issue. The PR was approved and merged yesterday, but a release has not been cut yet; that should happen in the next few days.

In the meantime, you can install langchain-experimental directly from langchain's source like this:

pip install git+https://github.com/langchain-ai/langchain.git\#egg=langchain-experimental\&subdirectory=libs/experimental
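
If installing from git isn't an option, a stopgap that sidesteps the bug is to hand OllamaFunctions a plain dict schema instead of the Pydantic class, since only the class object trips up json.dumps. An untested sketch following the bind(functions=..., function_call=...) pattern from the OllamaFunctions docs, reusing the names from the original post (you get back an AIMessage carrying a function call to parse yourself, not a RelatedSubjects instance):

from langchain_core.utils.function_calling import convert_to_openai_function
from langchain_experimental.llms.ollama_functions import OllamaFunctions

schema = convert_to_openai_function(RelatedSubjects)  # plain, serializable dict

llm = OllamaFunctions(model="llama3", format="json").bind(
    functions=[schema],
    function_call={"name": schema["name"]},  # force this single function
)
response = (gen_related_topics_prompt | llm).invoke({"topic": example_topic})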

I hope this helps.