jemc closed 1 month ago
The example code in the README.md for this feature is a bit ugly, I know.
I plan to follow up in a future PR with a higher-level function that wraps generateWithOptionalTools in a loop for you and can invoke the provided tool functions on your behalf. Once that lands, I'll also update the README.md to show that nicer, more convenient style of example.
But @theoremonemehdi and I agreed that we want to expose the "single call" version for apps that want full control, even if/when we later add a "repeated calls" function that wraps the loop in a more convenient interface.
Sometimes we may want to ask the LLM to produce a natural language response, but with the option of using some tools (in a structured data format) as part of its self-directed process of fulfilling the prompt.
This is a mixture of the prior two methods in the interface: we expect the LLM to make zero or more tool calls, and then eventually produce a natural language response.
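To make the shape of this concrete, here's a minimal sketch of the driver loop an app would write around the "single call" function. Everything here is an assumption for illustration: the message/result types, the generateWithOptionalTools signature, and the stub model behavior are invented, not the library's actual API.

```typescript
// Hypothetical types — the real library's shapes will differ.
type ToolCall = { name: string; args: Record<string, unknown> };
type StepResult =
  | { kind: "toolCalls"; calls: ToolCall[] }
  | { kind: "text"; text: string };
type Message = { role: "user" | "assistant" | "tool"; content: string };

// Stand-in for the real generateWithOptionalTools: this stub requests one
// tool call, then answers once a tool result appears in the transcript.
function generateWithOptionalTools(messages: Message[]): StepResult {
  const hasToolResult = messages.some((m) => m.role === "tool");
  if (!hasToolResult) {
    return { kind: "toolCalls", calls: [{ name: "getTime", args: {} }] };
  }
  return { kind: "text", text: "It is noon." };
}

// The loop an app controls itself today (and that a future higher-level
// function could wrap): keep calling until the model returns plain text,
// executing any requested tools and appending their results in between.
function runToCompletion(
  messages: Message[],
  tools: Record<string, (args: Record<string, unknown>) => string>,
): string {
  for (;;) {
    const step = generateWithOptionalTools(messages);
    if (step.kind === "text") return step.text;
    for (const call of step.calls) {
      const result = tools[call.name](call.args);
      messages.push({ role: "tool", content: result });
    }
  }
}

const answer = runToCompletion(
  [{ role: "user", content: "What time is it?" }],
  { getTime: () => "12:00" },
);
```

The point of the sketch is the control flow, not the names: the app decides how tool calls are executed and when the loop ends, which is exactly the "full control" the single-call version preserves.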
See the README.md example and the spec examples to get a feel for how this works.