eyurtsev / kor

https://eyurtsev.github.io/kor/
MIT License

ValueError when running chain.run(...) #255

Closed Dionnay closed 8 months ago

Dionnay commented 8 months ago

I was trying to follow the example given in the API documentation, but I used Gemini Pro instead of OpenAI, and when I try to execute `chain.run(...)['data']`, I get this error.

```
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[124], line 1
----> 1 chain.run("Eugene was 18 years old a long time ago.")['data']

File c:\Users\Gem2\gemtest\gem2\Lib\site-packages\langchain_core\_api\deprecation.py:145, in deprecated.<locals>.deprecate.<locals>.warning_emitting_wrapper(*args, **kwargs)
    143     warned = True
    144     emit_warning()
--> 145 return wrapped(*args, **kwargs)

File c:\Users\Gem2\gemtest\gem2\Lib\site-packages\langchain\chains\base.py:538, in Chain.run(self, callbacks, tags, metadata, *args, **kwargs)
    536 if len(args) != 1:
    537     raise ValueError("`run` supports only one positional argument.")
--> 538 return self(args[0], callbacks=callbacks, tags=tags, metadata=metadata)[
    539     _output_key
    540 ]
    542 if kwargs and not args:
    543     return self(kwargs, callbacks=callbacks, tags=tags, metadata=metadata)[
    544         _output_key
    545     ]

File c:\Users\Gem2\gemtest\gem2\Lib\site-packages\langchain_core\_api\deprecation.py:145, in deprecated.<locals>.deprecate.<locals>.warning_emitting_wrapper(*args, **kwargs)
    143     warned = True
    144     emit_warning()
--> 145 return wrapped(*args, **kwargs)
...
To automatically convert the leading SystemMessage to a HumanMessage,
set `convert_system_message_to_human` to True. Example:

llm = ChatGoogleGenerativeAI(model="gemini-pro", convert_system_message_to_human=True)
```

eyurtsev commented 8 months ago

Have you tried doing this:

```python
llm = ChatGoogleGenerativeAI(model="gemini-pro", convert_system_message_to_human=True)
```

Sounds like the model doesn't support system messages, which kor uses.
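For anyone curious what that flag does conceptually: `convert_system_message_to_human=True` folds the leading `SystemMessage` into the first `HumanMessage` before the request is sent, since the Gemini API at the time had no separate system role. Below is a minimal plain-Python sketch of that idea — the message classes and the `convert_leading_system_message` helper are illustrative stand-ins written for this comment, not the real langchain types or its actual implementation:

```python
from dataclasses import dataclass

@dataclass
class SystemMessage:
    content: str

@dataclass
class HumanMessage:
    content: str

def convert_leading_system_message(messages):
    """If the conversation starts with a SystemMessage followed by a
    HumanMessage, merge the system text into that human turn so the
    model only ever sees 'human' messages. (Illustrative stand-in for
    what convert_system_message_to_human=True achieves.)"""
    if (
        len(messages) >= 2
        and isinstance(messages[0], SystemMessage)
        and isinstance(messages[1], HumanMessage)
    ):
        system, first_human, *rest = messages
        merged = HumanMessage(content=f"{system.content}\n{first_human.content}")
        return [merged, *rest]
    return list(messages)

msgs = [
    SystemMessage("Extract the fields described in the schema."),
    HumanMessage("Eugene was 18 years old a long time ago."),
]
converted = convert_leading_system_message(msgs)
print(len(converted))  # the two leading messages collapse into one human turn
```

This is also why the error appears with kor specifically: kor builds its extraction prompt as a system message, which the Gemini chat model rejects unless the conversion flag is set.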

eyurtsev commented 8 months ago

Closing as there appears to be a working solution suggested by the stack trace itself