Kevdome3000 closed this issue 1 year ago
Hi, I created a PR that reproduces the issues. I hope this way we can keep this regression out for good.
I believe these are two related but separate issues:

1) Introduced by adding new properties in CompleteRequestSettings (e.g. ResultsPerPrompt), the result I am receiving from the server is:

Something went wrong while rendering the semantic function or while executing the text completion. Function: _GLOBALFUNCTIONS.funce1d8a50ff39147fcb90d7523f8aa40f2. Error: Invalid request: The request is not valid, HTTP status: 400. Details: logprobs, best_of and echo parameters are not available on gpt-35-turbo model. Please remove the parameter and try again. For more details, see https://go.microsoft.com/fwlink/?linkid=2227346. Status: 400 (BadRequest) ErrorCode: BadRequest
Content: {"error":{"code":"BadRequest","message":"logprobs, best_of and echo parameters are not available on gpt-35-turbo model. Please remove the parameter and try again. For more details, see https://go.microsoft.com/fwlink/?linkid=2227346."}}

2) When using OpenAI completions with a chat model, it returns:

"error": { "message": "This is a chat model and not supported in the v1/completions endpoint. Did you mean to use v1/chat/completions?", "type": "invalid_request_error", "param": "model", "code": null }
This is the PR that reproduces the issues: https://github.com/microsoft/semantic-kernel/pull/1061. It has two separate tests: one in .Issues/Issue1050 (for lack of a better name) and another in KernelSyntaxExamples/Example27SemanticFunctionsUsingChatGptTests.
Please let me know if I need to change anything.
I just looked into the code a bit, and it looks to me that both problems are a consequence of sending a TextCompletionRequest to the ChatCompletion API. This is really great functionality, but at the moment the mismatch is being handled by the API server, which obviously does not work (on either the Azure or the OpenAI side).
We should maybe try handling it at a higher level: check whether SKFunction._aiService (https://github.com/microsoft/semantic-kernel/blob/main/dotnet/src/SemanticKernel/SkillDefinition/SKFunction.cs#L372) can be cast to IChatCompletion, and if so invoke it as an IChatCompletion rather than as an ITextCompletion.
If this fits your architecture, just let me know and I can give it a try today.
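The proposed dispatch could be sketched roughly like this. This is a hypothetical illustration, not the actual SKFunction code; the chat-side method names (CreateNewChat, AddUserMessage, GenerateMessageAsync) are assumptions about the IChatCompletion surface:

```csharp
// Hypothetical: inside SKFunction's invocation path, prefer the chat API
// when the registered service supports it, instead of always sending the
// rendered prompt to the text-completions endpoint.
if (this._aiService is IChatCompletion chatService)
{
    // Wrap the rendered prompt as a single user message and call the
    // chat endpoint (v1/chat/completions), which chat models require.
    var chat = chatService.CreateNewChat();
    chat.AddUserMessage(renderedPrompt);
    completion = await chatService.GenerateMessageAsync(chat, requestSettings);
}
else
{
    // Fall back to the plain text-completions endpoint (v1/completions).
    completion = await ((ITextCompletion)this._aiService)
        .CompleteAsync(renderedPrompt, requestSettings);
}
```

The key design point is that the branch happens client-side, so neither Azure nor OpenAI ever receives text-completion parameters (logprobs, best_of, echo) for a chat model.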
We have a fix coming for this.
Issue resolved.
Describe the bug
Registering OpenAIChatCompletionService as both ITextCompletion and IChatCompletion does not work; it throws the following error:
I also have to register an ITextCompletion service for the kernel to execute semantic functions properly, but text completion with the text-davinci-003 model does not work well, likely because the model isn't capable enough. For my use case, it returns invalid JSON most of the time.
To Reproduce
skprompt.txt:
test method:
Expected behavior
I should be able to register OpenAIChatCompletionService as both ITextCompletion and IChatCompletion and run Semantic Functions via the Kernel
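The intended dual registration might look like the sketch below. This is a hypothetical illustration: the service constructor arguments and the AddTextCompletionService/AddChatCompletionService registration methods are assumptions about the kernel configuration API, not a confirmed SK surface:

```csharp
// Hypothetical: one chat-capable service instance registered under both
// AI service interfaces, so semantic functions and chat calls both
// resolve to the same gpt-3.5-turbo-backed service.
var chatService = new OpenAIChatCompletionService("gpt-3.5-turbo", apiKey);

// Register the same instance as both a text-completion service and a
// chat-completion service under the same service id.
kernel.Config.AddTextCompletionService("chat", _ => (ITextCompletion)chatService);
kernel.Config.AddChatCompletionService("chat", _ => (IChatCompletion)chatService);
```

With this in place there would be no need for a separate text-davinci-003 registration just to satisfy semantic function execution.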
Screenshots
Entire sample code with comments:
PromptTemplate:
Sample class:
Additional context
PromptGenerator just stores a Dictionary<string, SemanticFunctionConfig> and a method:
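For context, a hypothetical shape for such a class (the method shown is an illustration of the described role, not the original code; RegisterSemanticFunction usage is an assumption):

```csharp
// Hypothetical sketch of the PromptGenerator described above: it holds
// semantic-function configurations keyed by name and registers them
// on a kernel.
public class PromptGenerator
{
    private readonly Dictionary<string, SemanticFunctionConfig> _configs = new();

    public void Add(string name, SemanticFunctionConfig config)
        => _configs[name] = config;

    // Illustrative method: register every stored config on the kernel
    // and return the resulting functions keyed by name.
    public IDictionary<string, ISKFunction> RegisterAll(IKernel kernel)
    {
        var functions = new Dictionary<string, ISKFunction>();
        foreach (var (name, config) in _configs)
        {
            functions[name] = kernel.RegisterSemanticFunction(name, config);
        }
        return functions;
    }
}
```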