Closed: nortline closed this issue 1 year ago
One thing you can do is set `SKETCH_ENDPOINT_URL` to a fake URL, e.g. `os.environ['SKETCH_ENDPOINT_URL'] = 'broken'`; when remote lambdaprompt is in use, this should cause an error.
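A minimal sketch of that check (the variable name `SKETCH_ENDPOINT_URL` is taken from this thread; `'broken'` is just a deliberately invalid value):

```python
import os

# Deliberately point sketch at an invalid endpoint BEFORE importing/using it,
# so any code path that still talks to the remote lambdaprompt service
# fails loudly instead of silently succeeding.
os.environ['SKETCH_ENDPOINT_URL'] = 'broken'

print(os.environ['SKETCH_ENDPOINT_URL'])  # confirm the override took effect
```

If a completion still succeeds after this, requests are not going through the remote endpoint.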
I just checked in a fresh notebook, and the code you included worked for me (copy-pasted straight from your question).
Note that with the newest version of sketch (run `pip install -U sketch`) it will print something along the lines of
`No backend set for [completion], setting to default of OpenAI`
when running off of your own key rather than sending to the endpoint URL. If you don't see that print statement the first time you run something, then one of the env vars is probably misspelled.
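A quick way to rule out a typo'd variable name (the names below are the ones used in this thread; the key value is a placeholder):

```python
import os

# Set the variables exactly as this thread spells them.
os.environ['OPENAI_API_KEY'] = 'sk-placeholder'  # placeholder, not a real key
os.environ['SKETCH_USE_REMOTE_LAMBDAPROMPT'] = 'False'

# Verify nothing is missing or misspelled before importing sketch.
expected = ['OPENAI_API_KEY', 'SKETCH_USE_REMOTE_LAMBDAPROMPT']
missing = [name for name in expected if name not in os.environ]
print('missing:', missing)  # 'missing: []' if everything is set
```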
Additionally, you can check whether a local backend was set up for lambdaprompt:

```python
import lambdaprompt as lp
print(lp.backends.backends)
```

If you are still using remote, `lp.backends.backends` will be an empty dictionary.
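The emptiness check boils down to this pattern (a plain dict stands in for `lp.backends.backends` here, since whether the real registry is populated depends on your environment):

```python
# Stand-in for lambdaprompt's backend registry (lp.backends.backends).
backends = {}

def mode(registry):
    # Empty registry: requests still go through the remote endpoint.
    return 'local' if registry else 'remote'

print(mode(backends))              # remote (nothing registered yet)
backends['completion'] = object()  # simulate a local backend being set up
print(mode(backends))              # local
```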
Going to close since this seems to be working, but if you can give me more examples of how it's failing for you, I can re-open the issue.
Looks like it's working to me. The fact that it says `no backend set`
means that it defaulted to OpenAI and then finished the completion in local mode.
That warning shouldn't pop up again after that.
```python
import pandas as pd
import sketch
import os
from dotenv import load_dotenv

os.environ['OPENAI_API_KEY'] = 'key'
os.environ['SKETCH_USE_REMOTE_LAMBDAPROMPT'] = 'False'
```
and it doesn't seem to work