Open — JDAR-BP opened this issue 6 months ago
Any chance it can be fixed in the next release? It's quite a big issue for a lot of cases... There is surely a problem in the code itself, since I tested the API directly and it doesn't return the URL either, only the text response from the API node, so there is something to be fixed in the code for sure.
hey @JDAR-BP, to understand better: is your goal to have the LLM automatically figure out which API to call, and also be able to have a normal conversation whenever an API call is not needed?
Hey @HenryHengZJ, thanks for the reply. Basically yes, that is what I want to do.
As I said, the Conversational Agent is able to do it, but that agent is not really good when it comes to working with chat history, and the OpenAI Tool Agent is the best at working with tools anyway.
What do you think about it?
What if you try combining OpenAPIChain with ChainTool, and passing that to the OpenAI Tool Agent?
Klarna OpenAPI spec I was using - https://www.klarna.com/us/shopping/public/openai/v0/api-docs/
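To illustrate the suggestion, here is a minimal sketch of the ChainTool pattern: a chain is wrapped behind a tool interface so an agent can invoke it, and only the chain's final text answer (not the request URL it built internally) flows back to the agent. This is a stand-in, not Flowise or LangChain code: `FakeKlarnaChain`, the `Chain` interface, and this `ChainTool` class are all hypothetical simplifications of the real OpenAPIChain/ChainTool.

```typescript
// Hypothetical stand-in for a chain (a real OpenAPIChain would build the
// request from the spec, call the API, and summarize the JSON response).
interface Chain {
  run(input: string): Promise<string>;
}

class FakeKlarnaChain implements Chain {
  async run(input: string): Promise<string> {
    // Fake the summarized API result; no URL leaks into this output.
    return `Found 2 t-shirts under $20 matching "${input}"`;
  }
}

// Simplified ChainTool: exposes a chain to an agent as a named tool.
class ChainTool {
  constructor(
    public name: string,
    public description: string,
    private chain: Chain,
    public returnDirect = false, // if true, tool output is the final answer
  ) {}

  async call(input: string): Promise<string> {
    return this.chain.run(input);
  }
}

async function main() {
  const tool = new ChainTool(
    "klarna-products",
    "Search Klarna's product catalog. Input: a shopping query.",
    new FakeKlarnaChain(),
  );
  // An agent would pick this tool and pass the user's query to it;
  // the agent only ever sees the chain's text answer.
  console.log(await tool.call("cheap t-shirts"));
}

main();
```

The point of the pattern is that the agent never sees the chain's internals (such as the constructed URL), only the tool's string output, which is why wrapping the API chain as a tool could avoid the URL leaking into the chat.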
Thanks @HenryHengZJ for the suggestion.
I tried it, but unfortunately it doesn't work for me, since I need to handle big databases (thousands of items at least) and get very precise results from queries with several filters; for now, only API calls that allow building custom queries with Algolia deliver that quality.
I don't think there is a better way to handle big databases and very precise queries like price < XXX AND color = XXX OR XXX, for example.
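For context, a query like the one above maps onto Algolia's `filters` search parameter, which accepts numeric comparisons (`price < 100`), facet filters (`color:red`), and AND/OR combinations. The sketch below shows a hypothetical `buildFilters` helper (not part of the Algolia client) that assembles such an expression from structured input; only the resulting filter string follows Algolia's documented syntax.

```typescript
// Hypothetical helper: turn structured filter input into an Algolia-style
// `filters` expression such as "price < 100 AND (color:red OR color:blue)".
type NumericFilter = {
  attr: string;
  op: "<" | "<=" | "=" | ">=" | ">";
  value: number;
};

function buildFilters(
  numeric: NumericFilter[],
  facetOr: Record<string, string[]>,
): string {
  const parts: string[] = [];
  for (const f of numeric) {
    parts.push(`${f.attr} ${f.op} ${f.value}`);
  }
  for (const [attr, values] of Object.entries(facetOr)) {
    // OR-combine values of the same facet, then AND the group with the rest.
    const group = values.map((v) => `${attr}:${v}`).join(" OR ");
    parts.push(values.length > 1 ? `(${group})` : group);
  }
  return parts.join(" AND ");
}

console.log(
  buildFilters([{ attr: "price", op: "<", value: 100 }], {
    color: ["red", "blue"],
  }),
);
// → price < 100 AND (color:red OR color:blue)
```

A string like this would then be passed as the `filters` parameter of an Algolia search call, which is the kind of precise server-side filtering the web-search tools discussed below can't do.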
I also tried Google Custom Search with my own search engine, and it works fine, but I can't filter by price under X and I don't get all the webpage info, just the title and a snippet of the page.
Do you think something can be changed in the code itself to solve this issue?
Because if the user gets the query body (for POST) or the URL (for GET) every time the agent makes an API call (except with the Conversational Agent), that seems like a big issue to me, don't you think?
Hi guys, there is an issue with agents (except the Conversational Agent) when using the GET and POST API chains. For example, with the GET API Chain, the agent sends the call URL in the chat just before its response, like here:
What is even weirder is that in LangSmith, Langfuse, or Lunary, that URL doesn't appear anywhere in the response. So I assume it's caused by the 'Retrieval' function of most of the agents, which is why it works with the Conversational Agent.
But for my use case I need the OpenAI Tool Agent, and it has the same issue.
By the way, the issue also happens when I check 'Return Direct' on the API chain node. And if I don't check it, it produces 3 responses in the chat: the first is the URL (or the JSON body with POST), the second is the API chain answer, and the third is the agent response.
I don't know if anyone has seen this before and found a workaround.
Would it be possible to have an option to disable the 'Retrieval' function when we use agents with an API chain?
Thanks guys, have a great day