yigit353 opened this issue 11 months ago
Hmm, to be honest I'm not completely sure where this error is coming from. Did you change the model in the configuration file too?
`configurations.json`:

```json
[
  {
    "model": "gpt-3.5-turbo-16k-0613",
    "api_key": "sk-xxxxxxxxxxxxxx"
  },
  {
    "model": "gpt-4-1106-preview",
    "api_key": "sk-yyyyyyyyyyyyyyyy"
  }
]
```
I even used different API keys for both models. And in the notebook:
```python
config_list = autogen.config_list_from_json(
    env_or_file="configurations.json",
    file_location=configurations_path,
    filter_dict={
        # "model": ["gpt-4-1106-preview"],
        "model": ["gpt-3.5-turbo-16k-0613"]
    },
)
```
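For sanity-checking which entry actually survives the filter, a minimal sketch (with `configurations_path` as a placeholder for the directory that holds `configurations.json`):

```python
import autogen

configurations_path = "."  # placeholder: directory containing configurations.json

def config_for(model_name: str):
    """Load configurations.json and keep only the entries for one model."""
    return autogen.config_list_from_json(
        env_or_file="configurations.json",
        file_location=configurations_path,
        filter_dict={"model": [model_name]},
    )

# Each call should return exactly one entry, carrying that model's own API key.
print([c["model"] for c in config_for("gpt-3.5-turbo-16k-0613")])
print([c["model"] for c in config_for("gpt-4-1106-preview")])
```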
It does make a call to `gpt-3.5-turbo-16k`; I can confirm that from the usage dashboard.
However, even though I changed the seed and temperature for the data retriever agent, it made up the same wrong function calls every time.
```python
llm_config_data_retriever = {
    "functions": get_flight_data_functions,
    "config_list": config_list,
    "seed": 150,
    "temperature": 0.2,
}
```
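For context, the agents are wired up roughly like this (a sketch, not the exact project code: the agent names and the `get_flight_data` stub are placeholders). The schemas reach the assistant through `llm_config`, and only the functions registered in the user proxy's `function_map` can actually run, so a made-up name has nothing to execute:

```python
import autogen

# Sketch of the agent wiring (agent names and the flight-data stub are
# placeholders, not the exact project code).
data_retriever = autogen.AssistantAgent(
    name="data_retriever",
    llm_config=llm_config_data_retriever,  # carries the function schemas above
)

def get_flight_data(origin: str, destination: str, date: str) -> str:
    """Placeholder standing in for the real retrieval function."""
    return f"flights from {origin} to {destination} on {date}"

user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    code_execution_config=False,
    # Only names registered here can be executed; a made-up function name
    # from the model is reported back as an error instead of running.
    function_map={"get_flight_data": get_flight_data},
)

user_proxy.initiate_chat(
    data_retriever,
    message="Find flights from IST to SFO on 2024-01-15.",
)
```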
The project works excellently with `gpt-4-turbo`, although with some hiccups. When I change the model to `gpt-3.5-turbo-16k`, I get the error below. Is that a known issue in AutoGen, or is there something else I should change? The error is caused by this: `gpt-4-turbo` generates the correct function call, while `gpt-3.5-turbo` makes up a non-existent function:
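To illustrate the mismatch (the function names below are hypothetical stand-ins for the real schemas in `get_flight_data_functions`), the only thing that differs between the two models is whether the name in the returned `function_call` actually exists among the declared functions:

```python
# Hypothetical stand-in for get_flight_data_functions; only "name" matters here.
declared_functions = [{"name": "get_flight_data"}]
declared_names = {f["name"] for f in declared_functions}

def is_declared(function_call: dict) -> bool:
    """True only if the model asked for a function that was actually declared."""
    return function_call.get("name") in declared_names

# gpt-4-1106-preview picks the declared name...
print(is_declared({"name": "get_flight_data"}))          # True
# ...while gpt-3.5-turbo-16k sometimes invents one that was never declared.
print(is_declared({"name": "retrieve_flight_details"}))  # False
```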