Open webscale opened 7 months ago
@webscale did this work for you on any other version? I'm getting the same error, but I'm new to guidance, so I'm not sure whether this is a change to the OpenAI API or to guidance.
I am seeing this with guidance version 0.1.3; text-davinci-003 continues to work.
> @webscale did this work for you on any other version? I'm getting the same error, but new to guidance, not sure if this is a change to OpenAI API or guidance.
I have developed an application on 0.0.64.
It never worked post 0.1.
Hi @webscale and @ninowalker,
Our error message could be more helpful here. Most OpenAI models -- including gpt-3.5-turbo -- are chat-based models, which means you need to use role tags to specify and structure your prompts. Guidance does this with context managers, e.g. this example from our README:
```python
from guidance import models, gen, user, assistant

gpt = models.OpenAI("gpt-3.5-turbo")

with user():
    lm = gpt + "What is the capital of France?"

with assistant():
    lm += gen("capital")

with user():
    lm += "What is one short surprising fact about it?"

with assistant():
    lm += gen("fact")
```
Chat-based models won't work without role assignments, but your prompt works fine if you add them in.
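For intuition, the role tags matter because chat endpoints take a list of role-tagged messages; a role-less prompt leaves that list empty, which the API rejects. An illustrative sketch of the payload shape the role-tagged README example corresponds to (field names from the OpenAI chat API; the placeholder contents are assumptions, not guidance internals):

```python
# Each `with user()` / `with assistant()` block becomes one entry
# in the chat endpoint's `messages` list.
messages = [
    {"role": "user", "content": "What is the capital of France?"},
    {"role": "assistant", "content": "<filled in by gen('capital')>"},
    {"role": "user", "content": "What is one short surprising fact about it?"},
    {"role": "assistant", "content": "<filled in by gen('fact')>"},
]

# A prompt with no role tags would produce messages == [], which the
# chat endpoint rejects as an invalid request.
assert all(m["role"] in {"user", "assistant", "system"} for m in messages)
```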
As @ninowalker noticed, OpenAI still supports some standard completion-based models, including text-davinci-003. The most capable text completion model they have -- which works without needing role tags -- is gpt-3.5-turbo-instruct (which is both cheaper and more capable than text-davinci-003). Consider using that model instead if you don't want to use chat-based models.
Just pushed a better error message for this, so it should be clearer in the future.
I'm having the same problem as above. I'm using guidance 0.1.5, and the example from the GitHub page:
```python
from guidance import models, instruction, gen  # type: ignore

gpt_instruct = models.OpenAI(
    "gpt-3.5-turbo-instruct",
    api_key="sk-my_openai_key")

with instruction():
    lm = gpt_instruct + "What are the smallest cats?"
lm += gen(stop=".")

print(lm)
```
In the end, this returns

```
openai.BadRequestError: Error code: 400 - {'error': {'message': "[] is too short - 'messages'", 'type': 'invalid_request_error', 'param': None, 'code': None}}
```

and I have to kill the program manually, as it does not return.
I believe the issue here is that gpt-3.5-turbo-instruct was incorrectly being tagged as an OpenAI chat model (and not instruct). I pushed a fix to the regex that does this detection, and it should be resolved in the next release. In the meantime, you may consider directly instantiating the Instruct-specific class, or installing from source.
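For illustration, the kind of name-based detection being described might look like the sketch below. The pattern and function names are hypothetical, not guidance's actual code; the point is that a naive prefix match would classify the "-instruct" variant as a chat model unless it is explicitly excluded:

```python
import re

# Hypothetical sketch: plain "gpt-3.5-turbo*" names are chat models,
# but the "-instruct" suffix marks a completion-style model, so a
# negative lookahead excludes it from the chat match.
CHAT_MODEL_RE = re.compile(r"^gpt-3\.5-turbo(?!-instruct)")

def looks_like_chat_model(name: str) -> bool:
    return CHAT_MODEL_RE.match(name) is not None

print(looks_like_chat_model("gpt-3.5-turbo"))           # True
print(looks_like_chat_model("gpt-3.5-turbo-instruct"))  # False
```

Without the `(?!-instruct)` lookahead, the instruct model's name would match the chat pattern, producing exactly the misclassification described above.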
Apologies for any poor formatting, on my phone and replying via email 🙂
Ok, thank you!
**The bug**
Getting the following error:

```
openai.BadRequestError: Error code: 400 - {'error': {'message': "[] is too short - 'messages'", 'type': 'invalid_request_error', 'param': None, 'code': None}}
```

**To Reproduce**
Give a full working code snippet that can be pasted into a notebook cell or python file. Make sure to include the LLM load step so we know which model you are using.
**System info (please complete the following information):**
- Guidance Version (`guidance.__version__`): 0.1