langchain-ai / langchain-google


system messages don't work #199

Closed ueartificial closed 1 month ago

ueartificial commented 2 months ago

System messages don't work. I tried both gemini-1.5 and 1.0. I also tried convert_system_message_to_human=True and convert_system_message_to_human=False. None of these help; the model doesn't follow the system message.

lkuligin commented 2 months ago

Can you provide a reproducible example, please?

ueartificial commented 2 months ago
import os
from dotenv import load_dotenv
load_dotenv()

from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

from langchain_google_genai import ChatGoogleGenerativeAI 

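# Build the model, a prompt that includes an explicit system message, and a simple chain.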
llm = ChatGoogleGenerativeAI(model="gemini-1.5-pro-latest", google_api_key=os.getenv('gemini_api_key'))
system = "Your job is to generate a video title for a youtube channel with given topic. Title should be under 150 characters. Only answer with title, nothing else."
prompt = ChatPromptTemplate.from_messages([("system", system), ("human", "Topic: {topic}")])
chain = prompt | llm | StrOutputParser()

response = chain.invoke({"topic": "Music Instruments"})

print(response)

It prints a very long essay (around 3,000 characters) about music instruments instead of the title I asked for. I have langchain-google-genai 1.0.3 installed, by the way. Nothing changes if I add convert_system_message_to_human=True when creating the llm.
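
For completeness, that variant only changes the constructor; the rest of the snippet stays the same and looks roughly like this:

llm = ChatGoogleGenerativeAI(
    model="gemini-1.5-pro-latest",
    google_api_key=os.getenv('gemini_api_key'),
    # Ask the integration to rewrite the system message as a human message
    # before calling the model; in my setup this made no difference to the output.
    convert_system_message_to_human=True,
)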

lkuligin commented 2 months ago

You're right, I'm sorry about this, let me prepare a fix.

ueartificial commented 2 months ago

Thank you! With the fix applied (adding client = self.client), it now works when using "gemini-1.5-pro-latest".

But it still raises an error with "gemini-1.0-pro-latest", since system prompts are not supported by that model, and in that case convert_system_message_to_human=True doesn't prevent the error.
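
A possible stopgap for that model (just a sketch, reusing the imports and the system string from the snippet above) is to skip the system role entirely and prepend the instruction to the human message:

# Hypothetical workaround for models without system-message support:
# no ("system", ...) tuple at all; the instruction is folded into the human turn.
llm = ChatGoogleGenerativeAI(model="gemini-1.0-pro-latest", google_api_key=os.getenv('gemini_api_key'))
prompt = ChatPromptTemplate.from_messages([("human", system + "\n\nTopic: {topic}")])
chain = prompt | llm | StrOutputParser()
print(chain.invoke({"topic": "Music Instruments"}))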

lkuligin commented 1 month ago

It should be fixed now; please feel free to re-open if it's not.