langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License
94.72k stars 15.33k forks

Exception when trying to pull Google GenAI prompt #26554

Open FedericoOmoto opened 1 month ago

FedericoOmoto commented 1 month ago

Checked other resources

Example Code

from langchain import hub as prompts
prompts.pull("prompt_name", include_model=True)

Error Message and Stack Trace (if applicable)

File "/home/federicoomoto/src/fairway/pr/fairway-fastapi/app/firebase/prompts.py", line 14, in get_prompt
    prompt = prompts.pull(
File "/home/federicoomoto/.cache/pypoetry/virtualenvs/fairway-fastapi-VcPivsSZ-py3.12/lib/python3.12/site-packages/langchain/hub.py", line 114, in pull
    response = client.pull_prompt(owner_repo_commit, include_model=include_model)
File "/home/federicoomoto/.cache/pypoetry/virtualenvs/fairway-fastapi-VcPivsSZ-py3.12/lib/python3.12/site-packages/langsmith/client.py", line 5301, in pull_prompt
    prompt = loads(json.dumps(prompt_object.manifest))
File "/home/federicoomoto/.cache/pypoetry/virtualenvs/fairway-fastapi-VcPivsSZ-py3.12/lib/python3.12/site-packages/langchain_core/_api/beta_decorator.py", line 110, in warning_emitting_wrapper
    return wrapped(*args, **kwargs)
File "/home/federicoomoto/.cache/pypoetry/virtualenvs/fairway-fastapi-VcPivsSZ-py3.12/lib/python3.12/site-packages/langchain_core/load/load.py", line 181, in loads
    return json.loads(
File "/usr/lib64/python3.12/json/__init__.py", line 359, in loads
    return cls(**kw).decode(s)
File "/usr/lib64/python3.12/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/lib64/python3.12/json/decoder.py", line 353, in raw_decode
    obj, end = self.scan_once(s, idx)
File "/home/federicoomoto/.cache/pypoetry/virtualenvs/fairway-fastapi-VcPivsSZ-py3.12/lib/python3.12/site-packages/langchain_core/load/load.py", line 119, in __call__
    raise ValueError(
ValueError: Trying to deserialize something that cannot be deserialized in current version of langchain-core: ('langchain', 'chat_models', 'google_genai', 'ChatGoogleGenerativeAI')

Description

When trying to pull a Google GenAI prompt from LangSmith, including the model, I get the following exception:

ValueError: Trying to deserialize something that cannot be deserialized in current version of langchain-core: ('langchain', 'chat_models', 'google_genai', 'ChatGoogleGenerativeAI')

If I try to pull the prompt without the model, everything works as expected.
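Until the deserialization of `ChatGoogleGenerativeAI` is supported, one possible workaround (a sketch only; the prompt name and model settings below are placeholders, and the LangSmith and Google API credentials are assumed to be configured in the environment) is to pull the prompt without the model and attach the model client-side:

```python
# Workaround sketch: pull only the prompt, then bind the model manually.
# "prompt_name" and the model parameters are placeholders, not the real config.
from langchain import hub
from langchain_google_genai import ChatGoogleGenerativeAI

prompt = hub.pull("prompt_name")  # include_model defaults to False, so this works
model = ChatGoogleGenerativeAI(model="gemini-1.5-pro", temperature=0)
chain = prompt | model
```

This avoids the failing manifest deserialization entirely, at the cost of keeping the model configuration in code rather than in LangSmith.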

The model config is as follows:

(screenshot of the model configuration)

System Info

System Information

OS: Linux
OS Version: #1 SMP PREEMPT_DYNAMIC Sun Aug 11 15:32:50 UTC 2024
Python Version: 3.12.4 (main, Jun 7 2024, 00:00:00) [GCC 14.1.1 20240607 (Red Hat 14.1.1-5)]

Package Information

langchain_core: 0.3.0
langchain: 0.3.0
langsmith: 0.1.121
langchain_google_genai: 2.0.0
langchain_openai: 0.2.0
langchain_text_splitters: 0.3.0

Optional packages not installed

langgraph
langserve

Other Dependencies

aiohttp: 3.10.5
async-timeout: Installed. No version info available.
google-generativeai: 0.7.2
httpx: 0.27.2
jsonpatch: 1.33
numpy: 1.26.4
openai: 1.45.1
orjson: 3.10.7
packaging: 24.1
pillow: 10.4.0
pydantic: 2.9.1
PyYAML: 6.0.2
requests: 2.32.3
SQLAlchemy: 2.0.34
tenacity: 8.5.0
tiktoken: 0.7.0
typing-extensions: 4.12.2

mfernandezsidn commented 1 month ago

Same with VertexAI.

arinaazmi commented 3 weeks ago

Hi! My name is Arina, I'm working with a team at UTSC as part of a group project to contribute to LangChain. We're interested in working on this issue, and were hoping to take a crack at it. Is there any additional information that can be provided to help us tackle this?

FedeOmoto commented 3 weeks ago

Hi Arina! Thank you for taking on this issue. If there is any additional information that would help you and your team fix it, please don't hesitate to let me know.

FedeOmoto commented 3 weeks ago

As an update, since I created the GH issue, we have updated to the latest versions of the dependencies:

arinaazmi commented 3 weeks ago

Hi @FedeOmoto! We were able to successfully pull the prompt from LangSmith, but we ran into a different issue during invocation:

Error during invocation: Invalid input type <class 'langchain_core.messages.ai.AIMessage'>. Must be a PromptValue, str, or list of BaseMessages.

Full Traceback:

Traceback (most recent call last):
  File "/Users/Arina/GitHub/langchain/test.py", line 55, in <module>
    response = chain.invoke(input_message)
  File "/Users/Arina/anaconda3/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 3024, in invoke
    input = context.run(step.invoke, input, config)
  File "/Users/Arina/anaconda3/lib/python3.11/site-packages/langchain_core/language_models/llms.py", line 391, in invoke
    [self._convert_input(input)],
  File "/Users/Arina/anaconda3/lib/python3.11/site-packages/langchain_core/language_models/llms.py", line 341, in _convert_input
    raise ValueError(msg)
ValueError: Invalid input type <class 'langchain_core.messages.ai.AIMessage'>. Must be a PromptValue, str, or list of BaseMessages.

Here is the example code:

import traceback

from langchain_core.messages import HumanMessage

try:
    print("Attempting to invoke the chain with a list of messages...")
    input_message = [HumanMessage(content="What is the definition of water?")]
    # Invoke the chain with a list of BaseMessages
    response = chain.invoke(input_message)
    print(f"Response: {response}")
except Exception as e:
    print(f"Error during invocation: {e}")
    traceback.print_exc()

We were wondering if your original issue still persists or if any recent updates have resolved it on your end. In the meantime, we will be investigating the "Invalid input type" issue further to find a solution.

Looking forward to your input!

FarhanChowdhury248 commented 1 week ago

@FedeOmoto I am working with @arinaazmi on this issue. I was able to successfully execute the example code using the following dependency versions:

langchain: 0.3.5
langchain-core: 0.3.13
langchain-google-genai: 2.0.3
langchain-openai: 0.2.4
langchain-text-splitters: 0.3.1
langsmith: 0.1.137

The output I receive is:

first=ChatPromptTemplate(input_variables=['question'], input_types={}, partial_variables={}, metadata={'lc_hub_owner': '-', 'lc_hub_repo': 'test-prompt', 'lc_hub_commit_hash': 'b59240cedfb86d67cd43e3dc971ade092126e025ad7ec844bdb95d5d6ec7b960'}, messages=[SystemMessagePromptTemplate(prompt=PromptTemplate(input_variables=[], input_types={}, partial_variables={}, template='You are a chatbot.'), additional_kwargs={}), HumanMessagePromptTemplate(prompt=PromptTemplate(input_variables=['question'], input_types={}, partial_variables={}, template='{question} Is your name Arina?'), additional_kwargs={})]) middle=[] last=RunnableBinding(bound=ChatGoogleGenerativeAI(model='models/gemini-1.5-pro', google_api_key=SecretStr('**********'), temperature=0.0, top_p=1.0, client=<google.ai.generativelanguage_v1beta.services.generative_service.client.GenerativeServiceClient object at 0x7f45f105a310>, default_metadata=()), kwargs={}, config={}, config_factories=[])

Is it possible for you to upgrade and try with the aforementioned dependency versions?
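For convenience, upgrading to those versions with pip could look like the following (pins taken from the list above; adapt to poetry if that is what the project uses):

```shell
pip install -U "langchain==0.3.5" "langchain-core==0.3.13" \
  "langchain-google-genai==2.0.3" "langchain-openai==0.2.4" \
  "langchain-text-splitters==0.3.1" "langsmith==0.1.137"
```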