guidance-ai / guidance

A guidance language for controlling large language models.
MIT License

Wrapping `gen` call in `token_limit` raises exception #853

Open hudson-ai opened 4 months ago

hudson-ai commented 4 months ago

The bug

Wrapping `gen()` in `token_limit()` raises `'ModelVariable' object has no attribute 'max_tokens'`. `gen()` has its own `max_tokens` kwarg, which is clearly the "preferred" way to limit the tokens of a `gen`, but there is no obvious indication that wrapping it in `token_limit()` shouldn't work (besides this exception). The exception is rather opaque and doesn't tell the user what they did wrong.

I'm guessing that the issue may be due to the commit point inside of `gen`.
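
For contrast, here is a minimal sketch of the kwarg-based approach that the exception seems to be steering people toward (assuming the usual `models` API; the model path below is just a placeholder):

```python
from guidance import models, gen

# Placeholder model path, purely for illustration.
lm = models.LlamaCpp("path/to/model.gguf")

# Passing max_tokens directly to gen() limits the generation without token_limit().
lm += "Write a short story: " + gen(name="story", max_tokens=100)
```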

To Reproduce

```python
from guidance import gen, token_limit

# Raises: 'ModelVariable' object has no attribute 'max_tokens'
token_limit(gen(), max_tokens=100)
```

System info (please complete the following information):