Closed fguillen closed 1 week ago
The documentation for the OpenAI constructor is not helpful because it looks outdated:
#initialize(api_key:, llm_options: {}, default_options: {}) ⇒ [OpenAI]
Initialize an OpenAI LLM instance
Parameters:
api_key (String) — The API key to use
client_options (Hash) — Options to pass to the OpenAI::Client constructor
I am also asking the SO community: https://stackoverflow.com/questions/78648116/ruby-langchainrb-gem-and-custom-configuration-for-the-model-setup
@fguillen If you'd like to configure your OpenAI client to use a specific temperature and model, try the following:
openai = Langchain::LLM::OpenAI.new(
  api_key: ENV["OPENAI_API_KEY"],
  default_options: {
    temperature: 0.1,
    chat_completion_model_name: "gpt-4o"
  }
)
Let me know if it works!
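As a side note on how `default_options` are assumed to behave: they act as constructor-level defaults that individual calls can override. The sketch below is plain Ruby (no gem required) illustrating that assumed precedence with a simple hash merge; the actual merge logic inside langchainrb may differ.

```ruby
# Hypothetical illustration: constructor-level defaults vs. per-call parameters.
# Assumption: per-call values take precedence over default_options.
defaults = { temperature: 0.1, chat_completion_model_name: "gpt-4o" }

# A caller overrides only the temperature for one request.
call_params = { temperature: 0.7 }

# Hash#merge: keys in call_params win; untouched defaults carry through.
merged = defaults.merge(call_params)

puts merged[:temperature]                 # overridden per call
puts merged[:chat_completion_model_name]  # falls back to the default
```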
@fguillen I believe the issue has been solved! Closing for now.
Yes @andreibondarev thanks for checking it out
Describe the bug
There needs to be a clear way of setting up the model. In my case, I would like to use OpenAI and use:
In the README, there is a mention of using llm_options. If I go to the OpenAI documentation:
It says I have to check here:
But there is no mention of temperature, for example. Also, in the example in the Langchain::LLM::OpenAI documentation, the options are totally different.
To Reproduce
Check my documentation research above
Expected behavior
I would like to see clear instructions on how to configure the model
Additional context
Thanks a lot for your work! :)