phidatahq / phidata

Build AI Assistants with memory, knowledge and tools.
https://docs.phidata.com
Mozilla Public License 2.0
11.25k stars 1.67k forks

need enhancement for LLM response of function calling embedded in markdown #238

Open 1WorldCapture opened 4 months ago

1WorldCapture commented 4 months ago

Scenario: function calling

Steps:

  1. run cookbook under "cookbook\llms\ollama\tools\app.py" using "streamlit run app.py"
  2. select llama3 model
  3. ask question about stock price of some company, like APPLE or GOOGLE

Problem: Sometimes the LLM responds with raw JSON text, and sometimes it embeds the response in a markdown code fence, like

{
....
}

or

```json
{
.....
}
```

Neither case is handled correctly by the current code.
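As a workaround until the cookbook handles this, a minimal sketch of a parser that accepts both shapes of reply. The function name `extract_json` is hypothetical (not part of phidata); it simply strips an optional markdown fence before parsing:

```python
import json
import re

def extract_json(text: str):
    """Parse an LLM reply that may be raw JSON or JSON wrapped
    in a markdown code fence (``` or ```json)."""
    # If the reply is wrapped in a fence, keep only the fenced body.
    match = re.search(r"```(?:json)?\s*(.*?)\s*```", text, re.DOTALL)
    if match:
        text = match.group(1)
    return json.loads(text)
```

Both a bare `{"symbol": "AAPL"}` and a fenced ` ```json {"symbol": "AAPL"} ``` ` reply then parse to the same dict.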

So is it possible or necessary to enhance the cookbook code to handle such cases? Thanks.

ashpreetbedi commented 4 months ago

yup working on this :)

ju1987yetchung commented 4 months ago

I have the same problem. I think it is because one parameter should be chosen: tool_choice (`Union[str, Dict[str, Any]]`). The documentation says that if a tool is defined, this parameter defaults to "auto", which means the model can choose between answering with a message or using a tool. So under this condition, going by the documentation, the parameter seemingly should be specified explicitly, but the expected pattern is only vaguely defined, making it hard to write correct code.
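For what it's worth, in an OpenAI-compatible chat API the two modes look like this. This is a sketch of the request payload only (the helper `build_request` and the tool name `get_stock_price` are made up for illustration); passing a `{"type": "function", ...}` object instead of "auto" forces the model to call the named tool rather than reply with free text:

```python
def build_request(messages, tools, tool_name=None):
    """Build a chat-completion payload for an OpenAI-compatible API.

    tool_name=None leaves tool_choice at "auto" (model decides:
    plain message or tool call); a name forces that tool.
    """
    payload = {
        "model": "llama3",
        "messages": messages,
        "tools": tools,
    }
    if tool_name is None:
        payload["tool_choice"] = "auto"
    else:
        payload["tool_choice"] = {
            "type": "function",
            "function": {"name": tool_name},
        }
    return payload
```

Forcing the tool at least makes the response shape predictable, at the cost of never getting a plain-text answer for that turn.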