google / langfun

OO for LLMs
Apache License 2.0

Introduce `lf.query_prompt` and `lf.query_output`. #204

Closed: copybara-service[bot] closed this 2 months ago

copybara-service[bot] commented 3 months ago

Introduce `lf.query_prompt` and `lf.query_output`.

These two APIs allow users to decompose `lf.query` into two stages:

1. `lf.query_prompt`: get the final prompt that will be sent to the LLM.
2. `lf.query_output`: get the structured output from the LLM response.
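
For illustration, here is a minimal sketch of the two-stage path next to the single-call path. It assumes `lf.query_prompt` and `lf.query_output` take the same prompt/schema arguments as `lf.query`, and that a langfun LM can be called directly on the rendered prompt; the model choice and the `Answer` class are just examples.

```python
import langfun as lf
import pyglove as pg


class Answer(pg.Object):
  result: int


lm = lf.llms.Gpt4()  # example model; any lf.LanguageModel should work

# Single-call path.
answer = lf.query('Compute 1 + 1.', Answer, lm=lm)

# Equivalent two-stage path.
prompt = lf.query_prompt('Compute 1 + 1.', Answer)  # final prompt sent to the LLM
response = lm(prompt)                               # the LLM call happens outside lf.query
answer = lf.query_output(response, Answer)          # parse the response into an Answer
```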

With these two APIs, users can easily implement `lf.query` on top of batch LLM inference (e.g. jax-on-beam), as sketched below.
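
A hedged sketch of that batch pattern: render all prompts first with `lf.query_prompt`, hand them to an external batch inference step (`batch_generate` here is a hypothetical placeholder for something like jax-on-beam), then parse each raw response with `lf.query_output`. It assumes `lf.query_output` accepts the raw response text.

```python
import langfun as lf
import pyglove as pg


class Sentiment(pg.Object):
  label: str


def batch_generate(prompt_texts: list[str]) -> list[str]:
  """Hypothetical stand-in for a batch inference engine (e.g. jax-on-beam)."""
  lm = lf.llms.Gpt4()  # replace with the actual batched backend
  return [str(lm(text)) for text in prompt_texts]


reviews = ['Great product!', 'Arrived broken.', 'Does the job.']

# Stage 1: render the final prompts without issuing any LLM calls.
prompts = [lf.query_prompt(r, Sentiment) for r in reviews]

# Run all prompts through the batch inference pipeline.
raw_responses = batch_generate([str(p) for p in prompts])

# Stage 2: parse each raw response into a structured Sentiment object.
sentiments = [lf.query_output(r, Sentiment) for r in raw_responses]
```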