matlab-deep-learning / llms-with-matlab

Connect MATLAB to LLM APIs, including OpenAI® Chat Completions, Azure® OpenAI Services, and Ollama™

OpenAIChat "generate" ignores chat ModelName #80

Closed: VenturiDTT closed this issue 1 month ago

VenturiDTT commented 1 month ago

Hello, thanks for the useful repository. I found something that looks like a bug. Given the following dummy example:

chat = openAIChat("You are a chatbot", 'APIKey', API_key, 'ModelName', 'gpt-3.5-turbo')
txt = generate(chat, "Hello")

I would expect this code to use gpt-3.5-turbo; instead, the default "gpt-4o-mini" is invoked. This is because of line 212 in "openAIChat.m":

          nvp.ModelName           (1,1) string {mustBeModel} = "gpt-4o-mini"
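One way the fix could look, sketched below purely as an illustration (the actual signature of `generate` and the validator `mustBeModel` are taken from the snippets above; the rest of the method body is assumed), is to default the name-value argument to the model already stored on the chat object, so the constructor's setting wins unless the caller overrides it at generate time:

    % Hypothetical sketch of a fix inside openAIChat.m (not the actual code):
    function [text, response] = generate(this, messages, nvp)
        arguments
            this     (1,1) openAIChat
            messages
            % Fall back to the model configured on the chat object,
            % instead of hard-coding "gpt-4o-mini" here.
            nvp.ModelName (1,1) string {mustBeModel} = this.ModelName
        end
        % ... rest of generate unchanged ...
    end

With a default like this, the dummy example at the top of the issue would call gpt-3.5-turbo as expected, without passing ModelName to `generate` a second time.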

This seems wrong considering the example "CreateSimpleChatBot.md", which states that "This example uses the model gpt-3.5-turbo", but whose code is:

modelName = "gpt-3.5-turbo";
chat = openAIChat("You are a helpful assistant. You reply in a very concise way, keeping answers limited to short sentences.", ModelName=modelName);
[text, response] = generate(chat, messages);

To actually use "gpt-3.5-turbo" without changing "openAIChat.m", one should change the last line:

modelName = "gpt-3.5-turbo";
chat = openAIChat("You are a helpful assistant. You reply in a very concise way, keeping answers limited to short sentences.", ModelName=modelName);
[text, response] = generate(chat, messages, ModelName=modelName);

Sorry if I misinterpreted something.

ccreutzi commented 1 month ago

Thanks, that was an oversight. It also shows that we are missing a test seam for tests that would have caught this issue.