patterns-ai-core / langchainrb

Build LLM-powered applications in Ruby
https://rubydoc.info/gems/langchainrb
MIT License

Google Gemini LLM generation config problem #648

Closed mazenkhalil closed 6 days ago

mazenkhalil commented 4 weeks ago

The following line in the Google Gemini LLM seems to be wrong. If temperature is defined, the whole generation_config object gets replaced, thus ignoring any predefined parameters there!

https://github.com/patterns-ai-core/langchainrb/blob/0efc39269e5c2c123886fa34e10b28c682ada948/lib/langchain/llm/google_gemini.rb#L45
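For illustration, here is a rough sketch of the pattern being described (not the exact source at that commit): assigning a fresh hash clobbers whatever generation_config the caller already passed in.

# Sketch of the problematic pattern (hypothetical, not verbatim from the repo):
# building a new hash here discards any generation_config supplied by the caller.
parameters[:generation_config] = {temperature: parameters.delete(:temperature)} if parameters[:temperature]

# e.g. a caller passing generation_config: { max_output_tokens: 256 } would lose
# max_output_tokens as soon as temperature is also set.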

andreibondarev commented 4 weeks ago

@mazenkhalil Do you have a suggested fix?

I think we should set the possible values directly, something like the following:

# Ensure generation_config exists, then fold the top-level overrides into it.
# Note merge! (in place); plain merge returns a new hash and would be a no-op here.
parameters[:generation_config] ||= {}
parameters[:generation_config].merge!({temperature: parameters.delete(:temperature)}) if parameters[:temperature]
parameters[:generation_config].merge!({top_k: parameters.delete(:top_k)}) if parameters[:top_k]
...
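A slightly more compact variant of the same idea (hypothetical, not tested against the gem), which folds each known sampling option into generation_config without discarding anything already there:

# Hypothetical sketch: move top-level sampling options into generation_config,
# preserving any keys the caller already set inside it.
parameters[:generation_config] ||= {}
%i[temperature top_k top_p max_output_tokens].each do |key|
  value = parameters.delete(key)
  parameters[:generation_config][key] = value unless value.nil?
end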