Open FedericoOmoto opened 1 month ago
Same with VertexAI.
Hi! My name is Arina. I'm working with a team at UTSC as part of a group project to contribute to LangChain. We're interested in working on this issue and hoping to take a crack at it. Is there any additional information you could provide to help us tackle it?
Hi Arina! Thank you for taking this on. If there is any additional information that would help you and your team fix this issue, please don't hesitate to let me know.
As an update, since I created the GH issue, we have updated to the latest versions of the dependencies:
Hi @FedeOmoto! We were able to successfully pull the prompt from LangSmith, but we ran into a different issue during invocation:
`Error during invocation: Invalid input type <class 'langchain_core.messages.ai.AIMessage'>. Must be a PromptValue, str, or list of BaseMessages.`
Full Traceback:
```
Traceback (most recent call last):
  File "/Users/Arina/GitHub/langchain/test.py", line 55, in <module>
```
Here is the example code:
```python
import traceback

from langchain_core.messages import HumanMessage

# `chain` is the prompt/model sequence pulled from LangSmith earlier in the script
try:
    print("Attempting to invoke the chain with a list of messages...")
    input_message = [HumanMessage(content="What is the definition of water?")]
    # Invoke the chain with a list of BaseMessages
    response = chain.invoke(input_message)
    print(f"Response: {response}")
except Exception as e:
    print(f"Error during invocation: {e}")
    traceback.print_exc()
```
We were wondering if your original issue still persists or if any recent updates have resolved it on your end. In the meantime, we will be investigating the "Invalid input type" issue further to find a solution.
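One guess at the cause: some step in the chain is handing an `AIMessage` (a model's output) to a component that only accepts a `PromptValue`, a `str`, or a list of `BaseMessage`s. A minimal pure-Python sketch of that kind of type check (hypothetical stand-in, not the actual langchain_core source):

```python
# Hypothetical stand-in classes -- NOT the real langchain_core types.
class BaseMessage:
    def __init__(self, content: str):
        self.content = content

class HumanMessage(BaseMessage):
    pass

class AIMessage(BaseMessage):
    pass

def convert_to_messages(value):
    """Accept a str or a list of BaseMessages; reject anything else,
    mirroring the error message quoted above."""
    if isinstance(value, str):
        return [HumanMessage(value)]
    if isinstance(value, list) and all(isinstance(m, BaseMessage) for m in value):
        return value
    raise ValueError(
        f"Invalid input type {type(value)}. "
        "Must be a PromptValue, str, or list of BaseMessages."
    )
```

Under this reading, a bare `AIMessage` (not wrapped in a list) reaching such a check would raise exactly the error we saw.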
Looking forward to your input!
@FedeOmoto I am working with @arinaazmi on this issue. I was able to successfully execute the example code using the following dependency versions:
langchain: 0.3.5
langchain-core: 0.3.13
langchain-google-genai: 2.0.3
langchain-openai: 0.2.4
langchain-text-splitters: 0.3.1
langsmith: 0.1.137
The output I receive is:
```
first=ChatPromptTemplate(input_variables=['question'], input_types={}, partial_variables={}, metadata={'lc_hub_owner': '-', 'lc_hub_repo': 'test-prompt', 'lc_hub_commit_hash': 'b59240cedfb86d67cd43e3dc971ade092126e025ad7ec844bdb95d5d6ec7b960'}, messages=[SystemMessagePromptTemplate(prompt=PromptTemplate(input_variables=[], input_types={}, partial_variables={}, template='You are a chatbot.'), additional_kwargs={}), HumanMessagePromptTemplate(prompt=PromptTemplate(input_variables=['question'], input_types={}, partial_variables={}, template='{question} Is your name Arina?'), additional_kwargs={})])
middle=[]
last=RunnableBinding(bound=ChatGoogleGenerativeAI(model='models/gemini-1.5-pro', google_api_key=SecretStr('**********'), temperature=0.0, top_p=1.0, client=<google.ai.generativelanguage_v1beta.services.generative_service.client.GenerativeServiceClient object at 0x7f45f105a310>, default_metadata=()), kwargs={}, config={}, config_factories=[])
```
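For anyone unfamiliar with the `first=`/`middle=`/`last=` shape in that output: it is the printed form of a runnable sequence built by piping a prompt into a model. A toy pure-Python sketch of that composition pattern (illustrative only, not the langchain API):

```python
# Toy sketch -- NOT the langchain API -- of how `prompt | model` composes
# into a sequence with first/middle/last steps, as printed above.
class Runnable:
    def invoke(self, value):
        raise NotImplementedError

    def __or__(self, other: "Runnable") -> "RunnableSequence":
        # The pipe operator builds a two-step sequence.
        return RunnableSequence(first=self, last=other)

class RunnableSequence(Runnable):
    def __init__(self, first: Runnable, last: Runnable, middle=None):
        self.first = first
        self.middle = list(middle or [])
        self.last = last

    def invoke(self, value):
        # Run each step in order, feeding each output into the next step.
        value = self.first.invoke(value)
        for step in self.middle:
            value = step.invoke(value)
        return self.last.invoke(value)

class Add(Runnable):
    """Trivial runnable used only to demonstrate the composition."""
    def __init__(self, n):
        self.n = n

    def invoke(self, value):
        return value + self.n
```

In the real output, `first` is the `ChatPromptTemplate` and `last` is the bound `ChatGoogleGenerativeAI` model.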
Is it possible for you to upgrade and try with the aforementioned dependency versions?
Example Code
Error Message and Stack Trace (if applicable)
```
File "/home/federicoomoto/src/fairway/pr/fairway-fastapi/app/firebase/prompts.py", line 14, in get_prompt
  prompt = prompts.pull(
           ^^^^^^^^^^^^^
File "/home/federicoomoto/.cache/pypoetry/virtualenvs/fairway-fastapi-VcPivsSZ-py3.12/lib/python3.12/site-packages/langchain/hub.py", line 114, in pull
  response = client.pull_prompt(owner_repo_commit, include_model=include_model)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/federicoomoto/.cache/pypoetry/virtualenvs/fairway-fastapi-VcPivsSZ-py3.12/lib/python3.12/site-packages/langsmith/client.py", line 5301, in pull_prompt
  prompt = loads(json.dumps(prompt_object.manifest))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/federicoomoto/.cache/pypoetry/virtualenvs/fairway-fastapi-VcPivsSZ-py3.12/lib/python3.12/site-packages/langchain_core/_api/beta_decorator.py", line 110, in warning_emitting_wrapper
  return wrapped(*args, **kwargs)
         ^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/federicoomoto/.cache/pypoetry/virtualenvs/fairway-fastapi-VcPivsSZ-py3.12/lib/python3.12/site-packages/langchain_core/load/load.py", line 181, in loads
  return json.loads(
         ^^^^^^^^^^^
File "/usr/lib64/python3.12/json/__init__.py", line 359, in loads
  return cls(**kw).decode(s)
         ^^^^^^^^^^^^^^^^^^^
File "/usr/lib64/python3.12/json/decoder.py", line 337, in decode
  obj, end = self.raw_decode(s, idx=_w(s, 0).end())
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib64/python3.12/json/decoder.py", line 353, in raw_decode
  obj, end = self.scan_once(s, idx)
             ^^^^^^^^^^^^^^^^^^^^^^
File "/home/federicoomoto/.cache/pypoetry/virtualenvs/fairway-fastapi-VcPivsSZ-py3.12/lib/python3.12/site-packages/langchain_core/load/load.py", line 119, in __call__
  raise ValueError(
ValueError: Trying to deserialize something that cannot be deserialized in current version of langchain-core: ('langchain', 'chat_models', 'google_genai', 'ChatGoogleGenerativeAI')
```
Description
When trying to pull a Google GenAI prompt from LangSmith, including the model, I get the following exception:
ValueError: Trying to deserialize something that cannot be deserialized in current version of langchain-core: ('langchain', 'chat_models', 'google_genai', 'ChatGoogleGenerativeAI')
If I try to pull the prompt without the model, everything works as expected.
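That split (prompt-only works, prompt-plus-model fails) is consistent with the loader only reviving classes whose serialized id path it recognizes, and rejecting everything else, which is roughly how the error message reads. A simplified, hypothetical sketch of such an allowlist-based loader (modeled on the idea, not on langchain_core's actual implementation):

```python
# Simplified, hypothetical sketch of a manifest loader with an allowlist --
# illustrating why an unregistered class path raises, NOT langchain_core's code.
import json

# Stand-in registry mapping a serialized id path to a constructor.
KNOWN = {
    ("langchain", "prompts", "chat", "ChatPromptTemplate"):
        lambda kwargs: {"prompt": kwargs},
}

def loads(serialized: str):
    def revive(obj):
        # Constructor nodes carry an id path; anything unregistered raises.
        if isinstance(obj, dict) and obj.get("type") == "constructor":
            path = tuple(obj["id"])
            if path not in KNOWN:
                raise ValueError(
                    "Trying to deserialize something that cannot be "
                    f"deserialized in current version of langchain-core: {path}"
                )
            return KNOWN[path](obj.get("kwargs", {}))
        return obj

    # object_hook revives each JSON object bottom-up as it is parsed.
    return json.loads(serialized, object_hook=revive)
```

Under this reading, the fix would be for the installed langchain-core/partner packages to register `('langchain', 'chat_models', 'google_genai', 'ChatGoogleGenerativeAI')`, which may be why the newer dependency versions reported above behave differently.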
The model config is as follows:
System Info