Hello,
I get an InvalidRequestError when trying to fit a ZeroShotGPTClassifier, whereas I can call ChatCompletion directly with the same model. Here is my code:
import openai

from skllm.config import SKLLMConfig

# reuse the key and Azure endpoint already configured on the openai module
SKLLMConfig.set_openai_key(openai.api_key)
SKLLMConfig.set_azure_api_base(openai.api_base)

from skllm import ZeroShotGPTClassifier
from skllm.datasets import get_classification_dataset
# df is a pandas DataFrame loaded beforehand
X = df[['cause','solution']].values
y = df['Causes_FinalNature_ENG'].values

# defining the model
clf = ZeroShotGPTClassifier(openai_model="azure::gpt-4-32k")

# fitting the data
clf.fit(X, y)

# predicting the data
labels = clf.predict(X)
This code crashes with this error:

Could not obtain the completion after 3 retries: InvalidRequestError :: Invalid URL (POST /v1/openai/deployments/gpt-4-32k/chat/completions)
But when I run this code it works well:
response = openai.ChatCompletion.create(
    engine='gpt-4-32k',
    messages=[
        {"role": "user", "content": "Who won the world series in 2020?"}],
    max_tokens=193,
    temperature=0,
)
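For context, the openai module itself is configured for Azure before the working call above, roughly like this (resource name, API version and key replaced by placeholders):

import openai

# Azure OpenAI settings used by the direct ChatCompletion call
# (placeholders only, the real values come from my environment)
openai.api_type = "azure"
openai.api_base = "https://<my-resource>.openai.azure.com/"
openai.api_version = "<api-version>"
openai.api_key = "<my-azure-api-key>"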
The same thing happens with gpt-35-turbo and gpt-3.5-turbo.
Do you know what's wrong with my code?
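In case it helps, here is a stripped-down snippet, with placeholder texts and labels instead of my DataFrame, that I would expect to hit the same error:

import openai
from skllm.config import SKLLMConfig
from skllm import ZeroShotGPTClassifier

SKLLMConfig.set_openai_key(openai.api_key)
SKLLMConfig.set_azure_api_base(openai.api_base)

# dummy texts and labels, just to exercise fit/predict
X = ["pump failure caused by a worn seal", "sensor drifted out of calibration"]
y = ["mechanical", "instrumentation"]

clf = ZeroShotGPTClassifier(openai_model="azure::gpt-4-32k")
clf.fit(X, y)
print(clf.predict(X))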
Thanks in advance