Closed: tolo closed this issue 4 months ago
Hey @tolo ,
Thanks for the suggestion. It was something I had in mind as I already got a couple of reports of people getting confused about why the wrong model was used (e.g. when binding tools).
I would still like to keep `model` as part of the options, as that allows using the same wrapper (and underlying HTTP client) for calling different models.
So maybe we can have something like:
```dart
const defaultModel = 'gpt-3.5-turbo';

// ...

class ChatOpenAI extends BaseChatModel<ChatOpenAIOptions> {
  ChatOpenAI({
    // ...
    super.defaultOptions = const ChatOpenAIOptions(
      model: defaultModel,
    ),
  });

  // ...

  CreateChatCompletionRequest _createChatCompletionRequest(
    final List<ChatMessage> messages, {
    final ChatOpenAIOptions? options,
    final bool stream = false,
  }) {
    return CreateChatCompletionRequest(
      model: ChatCompletionModel.modelId(
        options?.model ?? defaultOptions.model ?? defaultModel,
      ),
      // ...
    );
  }
}
```
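With that fallback chain, the resolution order would be: per-call options first, then the wrapper's default options, then the hard-coded `defaultModel`. A hypothetical usage sketch (method names such as `invoke` and `PromptValue.string` are assumed from the langchain_dart API):

```dart
final chatModel = ChatOpenAI(apiKey: 'sk-...');

// No model specified in the call → resolves to defaultOptions.model,
// i.e. 'gpt-3.5-turbo'.
final res1 = await chatModel.invoke(PromptValue.string('Hello'));

// Per-call options take precedence over the constructor default:
final res2 = await chatModel.invoke(
  PromptValue.string('Hello'),
  options: const ChatOpenAIOptions(model: 'gpt-4'),
);
```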
And remove the default value from the `ChatOpenAIOptions` constructor.
What do you think?
Yes, the `model` property should definitely remain in the options classes; it's just the default value in the constructor that should be removed. So the above (falling back to the default model instead of throwing an error, I guess?) looks good, provided that is done. 👍
This should further improve the binding experience: https://github.com/davidmigloz/langchain_dart/pull/500
Feature request
`LanguageModelOptions` subclasses (like `ChatOpenAIOptions`, `ChatMistralAIOptions`, etc.) should not set a default model, as this will reset the model to the default one every time you use the `bind` operator (for instance). Example:
Motivation
Removing the default value (and moving it into the Chat/LLM API implementations) will improve the developer experience and reduce the likelihood of errors (using the incorrect model). It would also be more in line with the Python implementation.
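For context, a minimal sketch of the failure mode this request addresses (the `bind` method and `tools` parameter are assumed from the langchain_dart API; `someTool` is a hypothetical tool definition):

```dart
final chatModel = ChatOpenAI(
  apiKey: 'sk-...',
  defaultOptions: const ChatOpenAIOptions(model: 'gpt-4'),
);

// bind() merges in a fresh ChatOpenAIOptions instance; because the
// options constructor currently defaults `model` to 'gpt-3.5-turbo',
// the bound runnable silently calls the default model instead of 'gpt-4'.
final chatWithTools = chatModel.bind(
  ChatOpenAIOptions(tools: [someTool]),
);
```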
Your contribution
I could possibly create a PR (or PRs, since this spans multiple packages) if need be.