MeetKai / functionary

Chat language model that can use tools and interpret the results
MIT License

Inconsistent behavior: Assistant makes assumptions about required parameters in subsequent calls #269

Closed MadanMaram closed 2 weeks ago

MadanMaram commented 2 weeks ago

When making standalone calls to the assistant with the same prompt, the behavior is inconsistent and potentially problematic. On the first call, the assistant correctly asks for the missing required parameters (difficulty and format) for the find_content function. However, on an immediate subsequent call with the exact same prompt, the assistant makes assumptions about these parameters and calls the function without user input.

Steps to Reproduce

1. Make a call to the assistant with the prompt: "can you find content on python?"
2. Observe that the assistant asks for the difficulty level and format.
3. Without providing any additional information, make another identical call with the same prompt.
4. Observe that the assistant now calls the find_content function with assumed values for difficulty ("Intermediate") and format ("Video").

A request sketch that walks through these steps is included below, after the environment details.

Expected Behavior

The assistant should consistently ask for missing required parameters and not make assumptions about them in subsequent calls, especially when no additional context is provided.

Actual Behavior

On the second call, the assistant arbitrarily sets values for required parameters that were not provided by the user.

Additional Information

Model: functionary 3.2
Environment: calls made via Postman
Frequency: occurs consistently when repeating the same query
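For reference, here is a minimal Python sketch of the two back-to-back requests described above, sent to an OpenAI-compatible functionary endpoint. The base URL, model name string, and the find_content schema are assumptions reconstructed from the description in this issue, not the exact Postman request:

```python
# Reproduction sketch. The endpoint URL, model name, and the find_content
# schema below are assumptions based on the issue description.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="functionary")

tools = [
    {
        "type": "function",
        "function": {
            "name": "find_content",
            "description": "Find learning content on a topic",
            "parameters": {
                "type": "object",
                "properties": {
                    "topic": {"type": "string"},
                    "difficulty": {
                        "type": "string",
                        "enum": ["Beginner", "Intermediate", "Advanced"],
                    },
                    "format": {"type": "string", "enum": ["Video", "Article"]},
                },
                "required": ["topic", "difficulty", "format"],
            },
        },
    }
]

# Issue the same standalone request twice and compare whether the model asks
# for the missing parameters or invents values for them.
for attempt in range(2):
    response = client.chat.completions.create(
        model="meetkai/functionary-small-v3.2",
        messages=[{"role": "user", "content": "can you find content on python?"}],
        tools=tools,
        tool_choice="auto",
    )
    message = response.choices[0].message
    print(f"attempt {attempt + 1}:", message.tool_calls or message.content)
```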

jeffreymeetkai commented 2 weeks ago

Can you provide more information, such as the exact request? There are a couple of things to check, such as whether the temperature is 0.0 and the exact model used. This would also help with reproducing the issue if the temperature is indeed 0.0.
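To illustrate the point, here is a minimal sketch of a raw request body with temperature pinned to 0.0, so decoding is greedy and repeated identical calls should behave the same way. The endpoint URL and model name are assumptions, and the tools array is elided:

```python
# Sketch of the request body with temperature set to 0.0. Endpoint URL and
# model name are assumptions; fill in the same find_content tool as above.
import requests

payload = {
    "model": "meetkai/functionary-small-v3.2",
    "messages": [{"role": "user", "content": "can you find content on python?"}],
    "tools": [],  # same find_content definition as in the issue above
    "temperature": 0.0,
}
response = requests.post("http://localhost:8000/v1/chat/completions", json=payload)
print(response.json()["choices"][0]["message"])
```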

MadanMaram commented 2 weeks ago

I have set the temperature to 0.0 and it's working as expected. Thanks.