Closed: kottofy closed this issue 8 months ago.
If I can get a thumbs up that the methods can be changed from Text Completions to Chat Completions, I'd be happy to make a PR.
So, what happens here is that in your env the deployment name is set to a gpt-35-turbo model rather than gpt-35-turbo-instruct. The latter is a text completion model; the former isn't. There are two ways to fix this: either change your env, or change the notebook to use the chat completion classes. I think the latter should be fine (OpenAI is indeed moving away from text completion in favor of chat completion), but I haven't looked at that notebook for a while, so feel free to make a PR, and if it works, it works!
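For reference, a rough sketch of what those two fixes could look like in the notebook's setup cell, assuming a recent semantic-kernel Python package. The add_service call and the service_id keyword are newer-API assumptions (older builds exposed add_text_completion_service / add_chat_service instead), and the deployment, endpoint, and key values are placeholders:

```python
import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion, AzureTextCompletion

kernel = sk.Kernel()

# Fix 1: keep AzureTextCompletion, but point the deployment at an
# instruct (text-completion) model.
kernel.add_service(
    AzureTextCompletion(
        service_id="text",
        deployment_name="gpt-35-turbo-instruct",
        endpoint="https://<your-resource>.openai.azure.com/",
        api_key="<your-api-key>",
    )
)

# Fix 2: switch the notebook to the chat-completion class so a plain
# gpt-35-turbo deployment works.
kernel.add_service(
    AzureChatCompletion(
        service_id="chat",
        deployment_name="gpt-35-turbo",
        endpoint="https://<your-resource>.openai.azure.com/",
        api_key="<your-api-key>",
    )
)
```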
Hi @kottofy, as @eavanvalkenburg pointed out, this looks to be a config issue. I tried to reproduce the error, but since my AzureOpenAI deployment is configured as gpt-35-turbo-instruct (the latest text completion model), it works. I am closing this since it is related to user configuration.
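For anyone landing here with the same setup, the env-side fix amounts to pointing the deployment variable at an instruct-style model. A sketch of what that might look like, assuming the variable names from the sample .env the notebooks ship with (check your own file for the exact keys):

```
# .env (variable names assumed from the sample file; verify against your local copy)
AZURE_OPENAI_DEPLOYMENT_NAME="gpt-35-turbo-instruct"   # text-completion model
AZURE_OPENAI_ENDPOINT="https://<your-resource>.openai.azure.com/"
AZURE_OPENAI_API_KEY="<your-api-key>"
```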
Describe the bug
An error occurs when running Python notebook 3 when referencing AzureTextCompletion. This appears to be deprecated, so maybe this notebook needs a refactor to use Chat Completion methods?

To Reproduce
Steps to reproduce the behavior (see the sketch after this report for a chat-completion version of the same flow):
Run 03-prompt-function-inline.ipynb with selectedService = Service.AzureOpenAI and execute:
summary = await kernel.invoke(summarize, sk.KernelArguments(input=input_text))

Expected behavior
Expect to get a response from the LLM.
Screenshots
Platform
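For completeness, here is a rough, self-contained sketch of what the notebook flow from this report could look like after switching to chat completion. It is an assumption-laden illustration, not the notebook's actual code: the prompt-function helper has been renamed across semantic-kernel releases (create_function_from_prompt in older 0.x builds, add_function in newer ones), and the endpoint, key, and prompt text here are placeholders.

```python
import asyncio

import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion


async def main() -> None:
    kernel = sk.Kernel()

    # Register a chat-completion service so a plain gpt-35-turbo deployment works.
    kernel.add_service(
        AzureChatCompletion(
            service_id="chat",
            deployment_name="gpt-35-turbo",
            endpoint="https://<your-resource>.openai.azure.com/",
            api_key="<your-api-key>",
        )
    )

    # Inline prompt function, analogous to the `summarize` function in
    # 03-prompt-function-inline.ipynb (prompt text is a placeholder).
    summarize = kernel.add_function(
        plugin_name="SummarizePlugin",
        function_name="summarize",
        prompt="{{$input}}\n\nSummarize the content above in one sentence.",
    )

    input_text = "Semantic Kernel lets you compose prompts and plugins."
    summary = await kernel.invoke(summarize, sk.KernelArguments(input=input_text))
    print(summary)


asyncio.run(main())
```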