langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License
92.71k stars 14.85k forks

ChatOpenAI broken after updating langchain #26832

Open openSourcerer9000 opened 17 hours ago

openSourcerer9000 commented 17 hours ago

Example Code

from langchain_openai import ChatOpenAI

Error Message and Stack Trace (if applicable)

---> 12 from langchain_openai import ChatOpenAI
     13 assert load_dotenv()

File c:\Users\seanrm100\anaconda3\envs\fuuze\Lib\site-packages\langchain_openai\__init__.py:1
----> 1 from langchain_openai.chat_models import AzureChatOpenAI, ChatOpenAI
      2 from langchain_openai.embeddings import AzureOpenAIEmbeddings, OpenAIEmbeddings
      3 from langchain_openai.llms import AzureOpenAI, OpenAI

File c:\Users\seanrm100\anaconda3\envs\fuuze\Lib\site-packages\langchain_openai\chat_models\__init__.py:1
----> 1 from langchain_openai.chat_models.azure import AzureChatOpenAI
      2 from langchain_openai.chat_models.base import ChatOpenAI
      4 __all__ = ["ChatOpenAI", "AzureChatOpenAI"]

File c:\Users\seanrm100\anaconda3\envs\fuuze\Lib\site-packages\langchain_openai\chat_models\azure.py:24
      8 from typing import (
      9     Any,
...
   1530 )
   1532 schema = self._apply_model_serializers(td_schema, decorators.model_serializers.values())
   1533 schema = apply_model_validators(schema, decorators.model_validators.values(), 'all')

TypeError: typed_dict_schema() got an unexpected keyword argument 'cls'
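For context, this TypeError is the generic Python error for a keyword-argument mismatch at a call boundary, which is why the traceback points at a version skew between the caller (pydantic's schema generator) and the callee (`core_schema.typed_dict_schema`) rather than at langchain code. A minimal, purely illustrative reproduction of the same failure shape (the stand-in function below is hypothetical, not pydantic's actual signature):

```python
# Hypothetical stand-in for an older callee that predates the `cls` keyword.
def typed_dict_schema(fields, *, ref=None, metadata=None, config=None):
    return {"type": "typed-dict", "fields": fields}

try:
    # A newer caller passes the keyword the old signature lacks:
    typed_dict_schema({}, cls=dict)
except TypeError as exc:
    print(exc)  # → typed_dict_schema() got an unexpected keyword argument 'cls'
```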

Description

While trying to fix another incompatibility between the disparate langchain libraries, I upgraded langchain-core, langchain-community, and langchain-openai.

It seems the latest versions of these packages are broken when combined. Can someone save me from this dependency hell?

I would suggest at least having a working version of langchain wrapped as a single library with a single install command. The wrapper library could merely have dependency versions pinned, plus simple tests to make sure the examples from the documentation work. Wrestling with the versions of your different codebases should be handled just once by the maintainers, rather than by each and every user every time something changes.

System Info

System Information

OS: Windows
OS Version: 10.0.19045
Python Version: 3.11.8 | packaged by conda-forge | (main, Feb 16 2024, 20:40:50) [MSC v.1937 64 bit (AMD64)]

Package Information

langchain_core: 0.3.5
langchain: 0.3.0
langchain_community: 0.3.0
langsmith: 0.1.127
langchain_cli: 0.0.29
langchain_ollama: 0.1.1
langchain_openai: 0.2.0
langchain_text_splitters: 0.3.0
langgraph: 0.2.3
langserve: 0.2.2

Other Dependencies

aiohttp: 3.10.1
async-timeout: 4.0.3
dataclasses-json: 0.6.7
fastapi: 0.112.0
gitpython: 3.1.43
httpx: 0.27.0
jsonpatch: 1.33
langgraph-checkpoint: 1.0.11
langserve[all]: Installed. No version info available.
libcst: 1.4.0
numpy: 1.26.4
ollama: 0.3.1
openai: 1.40.1
orjson: 3.10.6
packaging: 24.1
pydantic: 2.8.2
pydantic-settings: 2.5.2
pyproject-toml: 0.0.10
PyYAML: 6.0.1
requests: 2.31.0
SQLAlchemy: 2.0.32
sse-starlette: 1.8.2
tenacity: 8.5.0
tiktoken: 0.7.0
tomlkit: 0.12.5
typer[all]: Installed. No version info available.
typing-extensions: 4.11.0
uvicorn: 0.23.2
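When environments drift like this (conda plus pip), it can help to double-check what is actually importable in the active interpreter, independent of what `pip list` shows elsewhere. A small stdlib-only sketch; the package names are just the ones from this report:

```python
from importlib import metadata

# Distribution names from this report; extend the tuple as needed.
for pkg in ("pydantic", "pydantic-core", "langchain-core", "langchain-openai"):
    try:
        print(pkg, metadata.version(pkg))
    except metadata.PackageNotFoundError:
        print(pkg, "not installed")
```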

efriis commented 17 hours ago

Hey there! Did you try restarting the kernel? I can't reproduce this with

%pip install -U langchain-openai # installs 0.2.0

and then restarting the kernel

and then

from langchain_openai import ChatOpenAI
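One way to check for the mismatch the traceback suggests, without reinstalling first, is to inspect the callee's signature for the `cls` keyword it supposedly rejects. A stdlib-only sketch; `old_style` is a made-up stand-in, and the commented lines show how it might be applied to the real function in the affected environment:

```python
import inspect

def accepts_cls(func) -> bool:
    """True if `func` can be called with a `cls` keyword argument."""
    params = inspect.signature(func).parameters
    return "cls" in params or any(
        p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()
    )

# Hypothetical stand-in mimicking an older typed_dict_schema signature:
def old_style(fields, *, ref=None, metadata=None, config=None):
    ...

print(accepts_cls(old_style))  # → False: calling it with cls= raises the TypeError

# In the affected environment, one would run instead:
#   from pydantic_core import core_schema
#   accepts_cls(core_schema.typed_dict_schema)
```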

openSourcerer9000 commented 16 hours ago

Restarted the kernel again. You can see my versions in the first post. Full stack trace:


{
        "name": "TypeError",
        "message": "typed_dict_schema() got an unexpe
```cted keyword argument 'cls'",
        "stack": "---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[1], line 1
----> 1 from langchain_openai import ChatOpenAI

File ...\Lib\site-packages\langchain_openai\__init__.py:1
----> 1 from langchain_openai.chat_models import AzureChatOpenAI, ChatOpenAI
      2 from langchain_openai.embeddings import AzureOpenAIEmbeddings, OpenAIEmbeddings
      3 from langchain_openai.llms import AzureOpenAI, OpenAI

File ...\Lib\site-packages\langchain_openai\chat_models\__init__.py:1
----> 1 from langchain_openai.chat_models.azure import AzureChatOpenAI
      2 from langchain_openai.chat_models.base import ChatOpenAI
      4 __all__ = ["ChatOpenAI", "AzureChatOpenAI"]

File ...\Lib\site-packages\langchain_openai\chat_models\azure.py:24  
      8 from typing import (
      9     Any,
     10     Callable,
   (...)
     20     overload,
     21 )
     23 import openai
---> 24 from langchain_core.language_models import LanguageModelInput
     25 from langchain_core.language_models.chat_models import LangSmithParams
     26 from langchain_core.messages import BaseMessage

File ...\Lib\site-packages\langchain_core\language_models\__init__.py:42
      1 """**Language Model** is a type of model that can generate text or complete
      2 text prompts.
      3
   (...)
     39
     40 """  # noqa: E501
---> 42 from langchain_core.language_models.base import (
     43     BaseLanguageModel,
     44     LangSmithParams,
     45     LanguageModelInput,
     46     LanguageModelLike,
     47     LanguageModelOutput,
     48     get_tokenizer,
     49 )
     50 from langchain_core.language_models.chat_models import BaseChatModel, SimpleChatModel
     51 from langchain_core.language_models.fake import FakeListLLM, FakeStreamingListLLM

File ...\Lib\site-packages\langchain_core\language_models\base.py:20
     17 from typing_extensions import TypeAlias, TypedDict, override
     19 from langchain_core._api import deprecated
---> 20 from langchain_core.messages import (
     21     AnyMessage,
     22     BaseMessage,
     23     MessageLikeRepresentation,
     24     get_buffer_string,
     25 )
     26 from langchain_core.prompt_values import PromptValue
     27 from langchain_core.runnables import Runnable, RunnableSerializable

File ...\Lib\site-packages\langchain_core\messages\__init__.py:18
      1 """**Messages** are objects used in prompts and chat conversations.
      2
      3 **Class hierarchy:**
   (...)
     15
     16 """  # noqa: E501
---> 18 from langchain_core.messages.ai import (
     19     AIMessage,
     20     AIMessageChunk,
     21 )
     22 from langchain_core.messages.base import (
     23     BaseMessage,
     24     BaseMessageChunk,
   (...)
     27     messages_to_dict,
     28 )
     29 from langchain_core.messages.chat import ChatMessage, ChatMessageChunk

File ...\Lib\site-packages\langchain_core\messages\ai.py:56
     52     total_tokens: int
     53     """Total token count."""
---> 56 class AIMessage(BaseMessage):
     57     """Message from an AI.
     58
     59     AIMessage is returned from a chat model as a response to a prompt.
   (...)
     63     (e.g., tool calls, usage metadata) added by the LangChain framework.
     64     """
     66     example: bool = False

File ...\Lib\site-packages\pydantic\_internal\_model_construction.py:224, in ModelMetaclass.__new__(mcs, cls_name, bases, namespace, __pydantic_generic_metadata__, __pydantic_reset_parent_namespace__, _create_model_module, **kwargs)
    221 if config_wrapper.frozen and '__hash__' not in namespace:
    222     set_default_hash_func(cls, bases)
--> 224 complete_model_class(
    225     cls,
    226     cls_name,
    227     config_wrapper,
    228     raise_errors=False,
    229     types_namespace=types_namespace,
    230     create_model_module=_create_model_module,
    231 )
    233 # If this is placed before the complete_model_class call above,
    234 # the generic computed fields return type is set to PydanticUndefined
    235 cls.model_computed_fields = {k: v.info for k, v in cls.__pydantic_decorators__.computed_fields.items()}

File ...\Lib\site-packages\pydantic\_internal\_model_construction.py:577, in complete_model_class(cls, cls_name, config_wrapper, raise_errors, types_namespace, create_model_module)
    574     return False
    576 try:
--> 577     schema = cls.__get_pydantic_core_schema__(cls, handler)
    578 except PydanticUndefinedAnnotation as e:
    579     if raise_errors:

File ...\Lib\site-packages\pydantic\main.py:671, in BaseModel.__get_pydantic_core_schema__(cls, source, handler)
    668     if not cls.__pydantic_generic_metadata__['origin']:
    669         return cls.__pydantic_core_schema__
--> 671 return handler(source)

File ...\Lib\site-packages\pydantic\_internal\_schema_generation_shared.py:83, in CallbackGetCoreSchemaHandler.__call__(self, source_type)
     82 def __call__(self, source_type: Any, /) -> core_schema.CoreSchema:
---> 83     schema = self._handler(source_type)
     84     ref = schema.get('ref')
     85     if self._ref_mode == 'to-def':

File ...\Lib\site-packages\pydantic\_internal\_generate_schema.py:655, in GenerateSchema.generate_schema(self, obj, from_dunder_get_core_schema)
    652         schema = from_property
    654 if schema is None:
--> 655     schema = self._generate_schema_inner(obj)
    657 metadata_js_function = _extract_get_pydantic_json_schema(obj, schema)
    658 if metadata_js_function is not None:

File ...\Lib\site-packages\pydantic\_internal\_generate_schema.py:924, in GenerateSchema._generate_schema_inner(self, obj)
    922 if lenient_issubclass(obj, BaseModel):
    923     with self.model_type_stack.push(obj):
--> 924         return self._model_schema(obj)
    926 if isinstance(obj, PydanticRecursiveRef):
    927     return core_schema.definition_reference_schema(schema_ref=obj.type_ref)

File ...\Lib\site-packages\pydantic\_internal\_generate_schema.py:739, in GenerateSchema._model_schema(self, cls)
    727     model_schema = core_schema.model_schema(
    728         cls,
    729         inner_schema,
   (...)
    735         metadata=metadata,
    736     )
    737 else:
    738     fields_schema: core_schema.CoreSchema = core_schema.model_fields_schema(
--> 739         {k: self._generate_md_field_schema(k, v, decorators) for k, v in fields.items()},
    740         computed_fields=[
    741             self._computed_field_schema(d, decorators.field_serializers)
    742             for d in computed_fields.values()
    743         ],
    744         extras_schema=extras_schema,
    745         model_name=cls.__name__,
    746     )
    747     inner_schema = apply_validators(fields_schema, decorators.root_validators.values(), None)
    748     new_inner_schema = define_expected_missing_refs(inner_schema, recursively_defined_type_refs())

File ...\Lib\site-packages\pydantic\_internal\_generate_schema.py:739, in <dictcomp>(.0)
    727     model_schema = core_schema.model_schema(
    728         cls,
    729         inner_schema,
   (...)
    735         metadata=metadata,
    736     )
    737 else:
    738     fields_schema: core_schema.CoreSchema = core_schema.model_fields_schema(
--> 739         {k: self._generate_md_field_schema(k, v, decorators) for k, v in fields.items()},
    740         computed_fields=[
    741             self._computed_field_schema(d, decorators.field_serializers)
    742             for d in computed_fields.values()
    743         ],
    744         extras_schema=extras_schema,
    745         model_name=cls.__name__,
    746     )
    747     inner_schema = apply_validators(fields_schema, decorators.root_validators.values(), None)
    748     new_inner_schema = define_expected_missing_refs(inner_schema, recursively_defined_type_refs())

File ...\Lib\site-packages\pydantic\_internal\_generate_schema.py:1115, in GenerateSchema._generate_md_field_schema(self, name, field_info, decorators)
   1108 def _generate_md_field_schema(
   1109     self,
   1110     name: str,
   1111     field_info: FieldInfo,
   1112     decorators: DecoratorInfos,
   1113 ) -> core_schema.ModelField:
   1114     """Prepare a ModelField to represent a model field."""
-> 1115     common_field = self._common_field_schema(name, field_info, decorators)
   1116     return core_schema.model_field(
   1117         common_field['schema'],
   1118         serialization_exclude=common_field['serialization_exclude'],
   (...)
   1122         metadata=common_field['metadata'],
   1123     )

File ...\Lib\site-packages\pydantic\_internal\_generate_schema.py:1308, in GenerateSchema._common_field_schema(self, name, field_info, decorators)
   1304         schema = self._apply_annotations(
   1305             source_type, annotations + validators_from_decorators, transform_inner_schema=set_discriminator
   1306         )
   1307     else:
-> 1308         schema = self._apply_annotations(
   1309             source_type,
   1310             annotations + validators_from_decorators,
   1311         )
   1313 # This V1 compatibility shim should eventually be removed
   1314 # push down any `each_item=True` validators
   1315 # note that this won't work for any Annotated types that get wrapped by a function validator
   1316 # but that's okay because that didn't exist in V1
   1317 this_field_validators = filter_field_decorator_info_by_field(decorators.validators.values(), name)

File ...\Lib\site-packages\pydantic\_internal\_generate_schema.py:2107, in GenerateSchema._apply_annotations(self, source_type, annotations, transform_inner_schema)
   2102         continue
   2103     get_inner_schema = self._get_wrapped_inner_schema(
   2104         get_inner_schema, annotation, pydantic_js_annotation_functions
   2105     )
-> 2107 schema = get_inner_schema(source_type)
   2108 if pydantic_js_annotation_functions:
   2109     metadata = CoreMetadataHandler(schema).metadata

File ...\Lib\site-packages\pydantic\_internal\_schema_generation_shared.py:83, in CallbackGetCoreSchemaHandler.__call__(self, source_type)
     82 def __call__(self, source_type: Any, /) -> core_schema.CoreSchema:
---> 83     schema = self._handler(source_type)
     84     ref = schema.get('ref')
     85     if self._ref_mode == 'to-def':

File ...\Lib\site-packages\pydantic\_internal\_generate_schema.py:2088, in GenerateSchema._apply_annotations.<locals>.inner_handler(obj)
   2086 from_property = self._generate_schema_from_property(obj, source_type)
   2087 if from_property is None:
-> 2088     schema = self._generate_schema_inner(obj)
   2089 else:
   2090     schema = from_property

File ...\Lib\site-packages\pydantic\_internal\_generate_schema.py:929, in GenerateSchema._generate_schema_inner(self, obj)
    926 if isinstance(obj, PydanticRecursiveRef):
    927     return core_schema.definition_reference_schema(schema_ref=obj.type_ref)
--> 929 return self.match_type(obj)

File ...\Lib\site-packages\pydantic\_internal\_generate_schema.py:1029, in GenerateSchema.match_type(self, obj)
   1027 origin = get_origin(obj)
   1028 if origin is not None:
-> 1029     return self._match_generic_type(obj, origin)
   1031 res = self._get_prepare_pydantic_annotations_for_known_type(obj, ())
   1032 if res is not None:

File ...\Lib\site-packages\pydantic\_internal\_generate_schema.py:1062, in GenerateSchema._match_generic_type(self, obj, origin)
   1060     return self._tuple_schema(obj)
   1061 elif origin in LIST_TYPES:
-> 1062     return self._list_schema(self._get_first_arg_or_any(obj))
   1063 elif origin in SET_TYPES:
   1064     return self._set_schema(self._get_first_arg_or_any(obj))

File ...\Lib\site-packages\pydantic\_internal\_generate_schema.py:431, in GenerateSchema._list_schema(self, items_type)
    430 def _list_schema(self, items_type: Any) -> CoreSchema:
--> 431     return core_schema.list_schema(self.generate_schema(items_type))

File ...\Lib\site-packages\pydantic\_internal\_generate_schema.py:655, in GenerateSchema.generate_schema(self, obj, from_dunder_get_core_schema)
    652         schema = from_property
    654 if schema is None:
--> 655     schema = self._generate_schema_inner(obj)
    657 metadata_js_function = _extract_get_pydantic_json_schema(obj, schema)
    658 if metadata_js_function is not None:

File ...\Lib\site-packages\pydantic\_internal\_generate_schema.py:929, in GenerateSchema._generate_schema_inner(self, obj)
    926 if isinstance(obj, PydanticRecursiveRef):
    927     return core_schema.definition_reference_schema(schema_ref=obj.type_ref)
--> 929 return self.match_type(obj)

File ...\Lib\site-packages\pydantic\_internal\_generate_schema.py:999, in GenerateSchema.match_type(self, obj)
    997     return self._literal_schema(obj)
    998 elif is_typeddict(obj):
--> 999     return self._typed_dict_schema(obj, None)
   1000 elif _typing_extra.is_namedtuple(obj):
   1001     return self._namedtuple_schema(obj, None)

File ...\Lib\site-packages\pydantic\_internal\_generate_schema.py:1520, in GenerateSchema._typed_dict_schema(self, typed_dict_cls, origin)
   1516 title = self._get_model_title_from_config(typed_dict_cls, ConfigWrapper(config))
   1517 metadata = build_metadata_dict(
   1518     js_functions=[partial(modify_model_json_schema, cls=typed_dict_cls, title=title)],
   1519 )
-> 1520 td_schema = core_schema.typed_dict_schema(
   1521     fields,
   1522     cls=typed_dict_cls,
   1523     computed_fields=[
   1524         self._computed_field_schema(d, decorators.field_serializers)
   1525         for d in decorators.computed_fields.values()
   1526     ],
   1527     ref=typed_dict_ref,
   1528     metadata=metadata,
   1529     config=core_config,
   1530 )
   1532 schema = self._apply_model_serializers(td_schema, decorators.model_serializers.values())
   1533 schema = apply_model_validators(schema, decorators.model_validators.values(), 'all')

TypeError: typed_dict_schema() got an unexpected keyword argument 'cls'"
}