langchain-ai / langchain-aws

Build LangChain Applications on AWS
MIT License

Breaking on import #251

Closed w601sxs closed 4 weeks ago

w601sxs commented 1 month ago

Code

from langchain.agents import AgentType, initialize_agent, load_tools
from langchain_aws import BedrockLLM, ChatBedrockConverse, ChatBedrock, BedrockEmbeddings

Error:

/opt/conda/lib/python3.10/site-packages/langchain_aws/chat_models/__init__.py:1: LangChainDeprecationWarning: As of langchain-core 0.3.0, LangChain uses pydantic v2 internally. The langchain_core.pydantic_v1 module was a compatibility shim for pydantic v1, and should no longer be used. Please update the code to import from Pydantic directly.

For example, replace imports like: `from langchain_core.pydantic_v1 import BaseModel`
with: `from pydantic import BaseModel`
or the v1 compatibility namespace if you are working in a code base that has not been fully upgraded to pydantic 2 yet: `from pydantic.v1 import BaseModel`

  from langchain_aws.chat_models.bedrock import BedrockChat, ChatBedrock
---------------------------------------------------------------------------
SchemaError                               Traceback (most recent call last)
Cell In[16], line 2
      1 from langchain.agents import AgentType,initialize_agent,load_tools
----> 2 from langchain_aws import BedrockLLM, ChatBedrockConverse, ChatBedrock, BedrockEmbeddings
      3 from langchain.tools import ShellTool
      4 # from langchain_community.chat_models import BedrockChat
      5 
      6 # llm = BedrockLLM(model_id="amazon.titan-text-premier-v1:0")
      7 
      8 # llm = ChatBedrockConverse(model_id="anthropic.claude-3-sonnet-20240229-v1:0")

File /opt/conda/lib/python3.10/site-packages/langchain_aws/__init__.py:1
----> 1 from langchain_aws.chat_models import BedrockChat, ChatBedrock
      2 from langchain_aws.embeddings import BedrockEmbeddings
      3 from langchain_aws.graphs import NeptuneAnalyticsGraph, NeptuneGraph

File /opt/conda/lib/python3.10/site-packages/langchain_aws/chat_models/__init__.py:1
----> 1 from langchain_aws.chat_models.bedrock import BedrockChat, ChatBedrock
      3 __all__ = ["BedrockChat", "ChatBedrock"]

File /opt/conda/lib/python3.10/site-packages/langchain_aws/chat_models/bedrock.py:38
     35 from langchain_core.tools import BaseTool
     37 from langchain_aws.function_calling import convert_to_anthropic_tool, get_system_message
---> 38 from langchain_aws.llms.bedrock import (
     39     BedrockBase,
     40     _combine_generation_info_for_llm_result,
     41 )
     42 from langchain_aws.utils import (
     43     get_num_tokens_anthropic,
     44     get_token_ids_anthropic,
     45 )
     48 def _convert_one_message_to_text_llama(message: BaseMessage) -> str:

File /opt/conda/lib/python3.10/site-packages/langchain_aws/llms/__init__.py:1
----> 1 from langchain_aws.llms.bedrock import (
      2     ALTERNATION_ERROR,
      3     Bedrock,
      4     BedrockBase,
      5     BedrockLLM,
      6     LLMInputOutputAdapter,
      7 )
      8 from langchain_aws.llms.sagemaker_endpoint import SagemakerEndpoint
     10 __all__ = [
     11     "ALTERNATION_ERROR",
     12     "Bedrock",
   (...)
     16     "SagemakerEndpoint",
     17 ]

File /opt/conda/lib/python3.10/site-packages/langchain_aws/llms/bedrock.py:804
    800             elif run_manager is not None:
    801                 run_manager.on_llm_new_token(chunk.text, chunk=chunk)  # type: ignore[unused-coroutine]
--> 804 class BedrockLLM(LLM, BedrockBase):
    805     """Bedrock models.
    806 
    807     To authenticate, the AWS client uses the following methods to
   (...)
    815     access the Bedrock service.
    816     """
    818     """
    819     Example:
    820         .. code-block:: python
   (...)
    829 
    830     """

File /opt/conda/lib/python3.10/site-packages/pydantic/_internal/_model_construction.py:224, in ModelMetaclass.__new__(mcs, cls_name, bases, namespace, __pydantic_generic_metadata__, __pydantic_reset_parent_namespace__, _create_model_module, **kwargs)
    221 if config_wrapper.frozen and '__hash__' not in namespace:
    222     set_default_hash_func(cls, bases)
--> 224 complete_model_class(
    225     cls,
    226     cls_name,
    227     config_wrapper,
    228     raise_errors=False,
    229     types_namespace=types_namespace,
    230     create_model_module=_create_model_module,
    231 )
    233 # If this is placed before the complete_model_class call above,
    234 # the generic computed fields return type is set to PydanticUndefined
    235 cls.model_computed_fields = {k: v.info for k, v in cls.__pydantic_decorators__.computed_fields.items()}

File /opt/conda/lib/python3.10/site-packages/pydantic/_internal/_model_construction.py:587, in complete_model_class(cls, cls_name, config_wrapper, raise_errors, types_namespace, create_model_module)
    584 core_config = config_wrapper.core_config(cls)
    586 try:
--> 587     schema = gen_schema.clean_schema(schema)
    588 except gen_schema.CollectedInvalid:
    589     set_model_mocks(cls, cls_name)

File /opt/conda/lib/python3.10/site-packages/pydantic/_internal/_generate_schema.py:595, in GenerateSchema.clean_schema(self, schema)
    593     raise self.CollectedInvalid()
    594 schema = _discriminated_union.apply_discriminators(schema)
--> 595 schema = validate_core_schema(schema)
    596 return schema

File /opt/conda/lib/python3.10/site-packages/pydantic/_internal/_core_utils.py:570, in validate_core_schema(schema)
    568 if 'PYDANTIC_SKIP_VALIDATING_CORE_SCHEMAS' in os.environ:
    569     return schema
--> 570 return _validate_core_schema(schema)

SchemaError: Invalid Schema:
model.config.extra_fields_behavior
  Input should be 'allow', 'forbid' or 'ignore' [type=literal_error, input_value=<Extra.forbid: 'forbid'>, input_type=Extra]
    For further information visit https://errors.pydantic.dev/2.9/v/literal_error
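The `literal_error` at the bottom is the key symptom: langchain-aws 0.1.x still builds its models through the pydantic v1 compatibility layer, so v1's `Extra.forbid` enum member ends up in a pydantic v2 core schema that only accepts the plain strings `'allow'`, `'forbid'`, or `'ignore'`. A minimal sketch of the mismatch — the `Extra` class below is a stand-in mirroring pydantic v1's enum, and the exact-type check is an assumption about how pydantic-core's strict literal validation behaves:

```python
from enum import Enum

class Extra(str, Enum):          # mirrors pydantic v1's Extra enum
    forbid = "forbid"

# Values pydantic v2's core schema accepts for extra_fields_behavior:
ALLOWED = {"allow", "forbid", "ignore"}

value = Extra.forbid
print(value == "forbid")         # True  - compares equal to the string
print(type(value) is str)        # False - but it is not a plain str
```

Because the input's type is `Extra` rather than `str`, strict validation rejects it even though its value compares equal to `'forbid'` — hence `input_value=<Extra.forbid: 'forbid'>, input_type=Extra` in the SchemaError.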
w601sxs commented 1 month ago

Name: langchain-aws Version: 0.1.6 Summary: An integration package connecting AWS and LangChain Home-page: https://github.com/langchain-ai/langchain-aws Author: Author-email: License: MIT Location: /opt/conda/lib/python3.10/site-packages Requires: boto3, langchain-core, numpy Required-by:

Name: langchain Version: 0.3.4 Summary: Building applications with LLMs through composability Home-page: https://github.com/langchain-ai/langchain Author: Author-email: License: MIT Location: /opt/conda/lib/python3.10/site-packages Requires: aiohttp, async-timeout, langchain-core, langchain-text-splitters, langsmith, numpy, pydantic, PyYAML, requests, SQLAlchemy, tenacity Required-by: jupyter_ai_magics, langchain-community, ragas Note: you may need to restart the kernel to use updated packages.

3coins commented 4 weeks ago

@w601sxs If you are using v0.1.x of langchain-aws, the compatible version of langchain is v0.2.x. For langchain v0.3.x and above, use langchain-aws v0.2.23.
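One way to align the environment with that guidance (version pins taken from this thread; adjust to your setup):

```shell
# langchain 0.3.x needs langchain-aws 0.2.23
pip install -U "langchain-aws==0.2.23" "langchain>=0.3,<0.4"

# Alternatively, stay on the older pairing:
# pip install "langchain-aws==0.1.6" "langchain>=0.2,<0.3"
```

After reinstalling, restart the kernel so the updated packages are picked up.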