patterns-ai-core / langchainrb

Build LLM-powered applications in Ruby
https://rubydoc.info/gems/langchainrb
MIT License
1.18k stars 156 forks

Correct instructions of how to config the model setup #675

Closed fguillen closed 1 week ago

fguillen commented 1 week ago

Describe the bug

There needs to be a clear way of setting up the model. In my case, I would like to use OpenAI and use:

In the README, there is a mention of using llm_options.

If I go to the OpenAI documentation:

It says I have to check here:

But there is no mention of temperature, for example. Also, the example in the Langchain::LLM::OpenAI documentation lists a completely different set of options.

# ruby-openai options:

    CONFIG_KEYS = %i[
      api_type
      api_version
      access_token
      log_errors
      organization_id
      uri_base
      request_timeout
      extra_headers
    ].freeze
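The CONFIG_KEYS above are client-level settings (auth, endpoints, timeouts), not model parameters, which is why temperature does not appear there. A minimal sketch of how such a whitelist would screen out a key like temperature (the constant is copied from ruby-openai; the filtering itself is an illustration, not the gem's actual code):

```ruby
# Client-level configuration keys accepted by ruby-openai (copied from above).
CONFIG_KEYS = %i[
  api_type
  api_version
  access_token
  log_errors
  organization_id
  uri_base
  request_timeout
  extra_headers
].freeze

# Hypothetical illustration: keys outside CONFIG_KEYS would not reach the
# client constructor, so a model parameter like temperature does not belong
# in llm_options.
options  = { access_token: "sk-...", request_timeout: 120, temperature: 0.1 }
accepted = options.slice(*CONFIG_KEYS)
# accepted keeps :access_token and :request_timeout; :temperature is dropped.
```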
# Example in Class: Langchain::LLM::OpenAI documentation: 

{
  n: 1,
  temperature: 0.0,
  chat_completion_model_name: "gpt-3.5-turbo",
  embeddings_model_name: "text-embedding-3-small"
}.freeze
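The two hashes serve different purposes: ruby-openai's CONFIG_KEYS configure the HTTP client, while the hash above holds default request parameters. A minimal sketch (plain Hashes, no gem required; build_params is a hypothetical helper, not langchainrb's code) of how per-call parameters would typically override such defaults:

```ruby
# Default request parameters (copied from the Langchain::LLM::OpenAI docs above).
DEFAULTS = {
  n: 1,
  temperature: 0.0,
  chat_completion_model_name: "gpt-3.5-turbo",
  embeddings_model_name: "text-embedding-3-small"
}.freeze

# Hypothetical helper: merge per-call overrides on top of the defaults,
# as an LLM wrapper would before issuing a request.
def build_params(overrides = {})
  DEFAULTS.merge(overrides)
end

params = build_params(temperature: 0.7)
params[:temperature]                 # overridden to 0.7
params[:chat_completion_model_name]  # unchanged default, "gpt-3.5-turbo"
```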

To Reproduce

Check my documentation research above

Expected behavior

I would like to see clear instructions on how to configure the model.


Additional context

Thanks a lot for your work! :)

fguillen commented 1 week ago

The documentation of the OpenAI constructor does not help either; it looks outdated (the signature takes llm_options: and default_options:, but the parameter list only documents api_key and client_options):

#initialize(api_key:, llm_options: {}, default_options: {}) ⇒ [OpenAI]
Initialize an OpenAI LLM instance

Parameters:

api_key (String) — The API key to use
client_options (Hash) — Options to pass to the OpenAI::Client constructor

fguillen commented 1 week ago

I am also asking the SO community: https://stackoverflow.com/questions/78648116/ruby-langchainrb-gem-and-custom-configuration-for-the-model-setup

andreibondarev commented 1 week ago

@fguillen If you'd like to configure your OpenAI client to use a specific temperature and model, try the following:

openai = Langchain::LLM::OpenAI.new(
  api_key: ENV["OPENAI_API_KEY"],
  default_options: {
    temperature: 0.1,
    chat_completion_model_name: "gpt-4o"
  }
)

Let me know if it works!

andreibondarev commented 1 week ago

@fguillen I believe the issue has been solved! Closing for now.

fguillen commented 1 day ago

Yes @andreibondarev thanks for checking it out