guidance-ai / guidance

A guidance language for controlling large language models.
MIT License

Support OpenAI function calling #227

Open · davidfant opened this issue 1 year ago

davidfant commented 1 year ago

Is your feature request related to a problem? Please describe.
OpenAI released function calling today: https://openai.com/blog/function-calling-and-other-api-updates

Describe the solution you'd like

Describe alternatives you've considered
Using the openai Python package directly (see the sketch below)

Additional context
https://platform.openai.com/docs/guides/gpt/function-calling
https://github.com/openai/openai-cookbook/blob/main/examples/How_to_call_functions_with_chat_models.ipynb
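
For reference, a minimal sketch of the alternative mentioned above: calling the openai package directly, using the legacy (pre-1.0) interface that was current when function calling launched. The model name and function schema here are only placeholders.

```python
# Sketch: direct use of the legacy (pre-1.0) openai package for function calling.
# The function schema and model name below are illustrative placeholders.
import json
import openai

functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name, e.g. Stockholm"},
            },
            "required": ["city"],
        },
    }
]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[{"role": "user", "content": "What's the weather in Stockholm?"}],
    functions=functions,
    function_call="auto",  # let the model decide whether to call a function
)

message = response["choices"][0]["message"]
if message.get("function_call"):
    # The model returns the function name plus JSON-encoded arguments as a string.
    name = message["function_call"]["name"]
    args = json.loads(message["function_call"]["arguments"])
    print(name, args)
```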

chrissharkey commented 1 year ago

I just started using Guidance and it is fantastic! As soon as this announcement came out today, I came here hoping someone had already suggested this. I really want the benefit of this OpenAI update, but I want to keep using Guidance. How exciting!

sasoder commented 1 year ago

Glad to see this brought up. I really need this; its absence is blocking my use of Guidance.

TheSylvester commented 1 year ago

Without this functionality soon, it will be hard for anyone building tool-based applications to stay on Guidance.

It is currently a toss-up between keeping Guidance for easy generation and ripping it out so you don't need lengthy teaching prompts and custom interfaces.

slundberg commented 1 year ago

This is coming very soon; if not tomorrow, then early next week. See the discussion in #239.

adrianlyjak commented 1 year ago

I'm curious to hear the maintainers' thoughts on how this could fit in with this library. Being able to control the output schema seems like one of the core value propositions of guidance. With functions, OpenAI can now offer a way to do this, but only for JSON. It doesn't quite match guidance's current model, but it has similar intentions, just a bit more restrictive.

I don't really like the idea that certain behaviors are only offered by specific model backends. It's nice that guidance is an abstraction layer that makes it easy to switch out backends. Implementing an API that only works with OpenAI would break this unless other providers start embracing it as a standard.

I suppose guidance could adopt OpenAI's functions model wholesale, give it first-class support, and then have a compatibility layer that implements functions output with local chat models to bridge the gap.
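
To make that concrete, here is a rough sketch of what such a compatibility layer might look like using guidance's existing handlebars-style commands. This is not an existing feature; the model name and variable names are made up for illustration.

```python
# Sketch only: constrain a local model to emit a function-call-shaped JSON
# object using guidance's template commands (the handlebars-style API from
# this era). The model name and variables are illustrative assumptions.
import guidance

guidance.llm = guidance.llms.Transformers("openlm-research/open_llama_7b")

call_function = guidance('''Decide which tool to call for the user request.
User request: {{query}}
{
  "function_call": {
    "name": "{{select 'name' options=function_names}}",
    "arguments": "{{gen 'arguments' stop='"' max_tokens=64}}"
  }
}''')

out = call_function(
    query="What's the weather in Stockholm?",
    function_names=["get_current_weather", "search_web"],
)
print(out["name"], out["arguments"])
```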

Edit: ah, I was typing this up at the same time slundberg linked the PR. Exciting!

davidfant commented 1 year ago

This is coming very soon; if not tomorrow, then early next week. See the discussion in #239.

Appreciate the hard work! 🙌

slundberg commented 1 year ago

I don't really like the idea that certain behaviors are only offered by specific model backends. It's nice that guidance is an abstraction layer that makes it easy to switch out backends. Implementing an API that only works with OpenAI would break this unless other providers start embracing it as a standard.

Agreed. The goal here is to support OpenAI through a mechanism that reflects how they actually implement it on the backend (because Guidance is meant to align with the actual LLM processing sequence), while also being flexible enough to capture how people get other models to use tools. I expect that some aspects of the process will differ across models when those models are fine-tuned for different syntaxes, but that should hopefully just be a difference in how you write your prompt program, not a fundamental change in the Guidance commands you use.
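
As a sketch of that idea using the current syntax (this is not the API added in #239; the model names and the local tool syntax are assumptions), the same {{gen}}/{{select}} commands can sit inside two differently shaped prompt programs:

```python
# Sketch: the same guidance commands, wrapped in different prompt programs
# per model. Model names and the "Action:/Input:" syntax are assumptions.
import guidance

# OpenAI chat model: role blocks mirror the chat format the API expects.
openai_program = guidance('''
{{#user~}}
{{query}}
{{~/user}}
{{#assistant~}}
{{gen 'answer' max_tokens=200}}
{{~/assistant}}
''', llm=guidance.llms.OpenAI("gpt-3.5-turbo"))

# Local model fine-tuned on some tool-calling syntax: same commands,
# different surrounding prompt text.
local_program = guidance('''{{query}}
Action: {{select 'tool' options=tools}}
Input: {{gen 'tool_input' max_tokens=32}}
''', llm=guidance.llms.Transformers("openlm-research/open_llama_7b"))
```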

raivatshah commented 1 year ago

Is function calling support official? I want to double-check because, although the PR has been merged, this issue is still open and the tool-use example is still not part of the main README.

maziyarpanahi commented 5 months ago

Maybe it's just me, but I failed to find any examples of function calling.

rayli09 commented 4 days ago

Any updates here?