davidfant opened this issue 1 year ago
I just started using Guidance and it is fantastic! As soon as this announcement came out today, I came here hoping someone had already suggested this. I really want the benefit of this OpenAI update but want to keep using Guidance. How exciting!
Glad to see this brought up, I really need this - the lack of it blocks my usage of guidance.
Without this functionality ASAP, it is hard for anyone building tool based applications to stay on Guidance.
It is currently a toss-up between keeping Guidance for easy generation and ripping it out so you don't need lengthy teaching prompts and custom interfaces.
This is coming very soon. If not tomorrow, early next week. See discussion on #239
I’m curious to hear the maintainers’ thoughts on how this could fit in with this library. Being able to control output schema seems like one of the core value propositions of Guidance. With functions, OpenAI can now sort of offer a way to do this, but just for JSON. It doesn’t quite match Guidance’s current model, but it has similar intentions, just a bit more restrictive.
I don’t really like the idea that certain behaviors are only offered by specific model backends. It’s nice that Guidance is an abstraction layer that makes it easy to switch out backends. Implementing an API that only works with OpenAI would break this unless others start embracing it as a standard.
I suppose guidance could adopt OpenAI’s functions model wholesale, and give first class support for it, and then have a compatibility layer that implements functions output with local chat models in order to bridge the gap.
Edit: ah, I was typing this up at the same time slundberg linked the PR. Exciting!
> This is coming very soon. If not tomorrow, early next week. See discussion on #239
Appreciate the hard work! 🙌
> I don’t really like the idea that certain behaviors are only offered by specific model backends. It’s nice that Guidance is an abstraction layer that makes it easy to switch out backends. Implementing an API that only works with OpenAI would break this unless others start embracing it as a standard.
Agreed. The goal here is to support OpenAI through a mechanism that reflects how they actually implement it on the backend (because Guidance is meant to align with the actual LLM processing sequence), while also being flexible enough to capture how people get other models to use tools as well. I expect that some aspects of the process will differ between models when those models are fine-tuned for different syntaxes, but that should hopefully just mean differences in how you write your prompt program, not a fundamental change in the Guidance commands you use.
Is function calling support official? I want to double-check, because although the PR has been merged, this issue is still open and the tool-use example is still not part of the main README.
Maybe it's just me, but I failed to find any examples of function calling.
Any updates here?
Is your feature request related to a problem? Please describe.
OpenAI released function calling today: https://openai.com/blog/function-calling-and-other-api-updates

Describe the solution you'd like
- Support a `function` role (like `user`, `system`, `assistant`)
- Parse `function_call` in the response if available and provide both the function name and args in the response

Describe alternatives you've considered
Using the openai Python package directly

Additional context
https://platform.openai.com/docs/guides/gpt/function-calling
https://github.com/openai/openai-cookbook/blob/main/examples/How_to_call_functions_with_chat_models.ipynb
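For reference, here is a minimal sketch of the alternative mentioned above (using the openai Python package directly). The actual API request is omitted; `get_weather`, its parameters, and the mock assistant message are illustrative assumptions, but the `functions` schema shape and the `function_call`/`function`-role message shapes follow OpenAI's documented function-calling format:

```python
import json

# A function (tool) declared to the model, in OpenAI's function-calling
# format: a name, a description, and JSON Schema parameters. In a real call
# this list is passed as the `functions` argument to the chat completion
# request, e.g. openai.ChatCompletion.create(..., functions=functions).
functions = [
    {
        "name": "get_weather",  # hypothetical tool for illustration
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    }
]

# Mock of the assistant message returned when the model decides to call a
# function: `content` is null, and `function_call.arguments` is a
# JSON-encoded string (not a parsed object).
message = {
    "role": "assistant",
    "content": None,
    "function_call": {
        "name": "get_weather",
        "arguments": '{"city": "Stockholm"}',
    },
}

# This is the parsing step the feature request asks Guidance to expose:
# detect `function_call` and surface both the name and the parsed args.
if "function_call" in message:
    name = message["function_call"]["name"]
    args = json.loads(message["function_call"]["arguments"])
    # The tool's result is then sent back to the model with the
    # `function` role, closing the tool-use loop.
    followup = {
        "role": "function",
        "name": name,
        "content": json.dumps({"temp_c": 18}),
    }
```

The key detail for any wrapper library is that `arguments` arrives as a string of JSON produced by the model, so it must be parsed (and may need validation) before dispatching to the actual function.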