microsoft / semantic-kernel

Integrate cutting-edge LLM technology quickly and easily into your apps
https://aka.ms/semantic-kernel

KernelHttpServer sample app returns 404 #1303

Closed · andreas-deruiter closed this issue 1 year ago

andreas-deruiter commented 1 year ago

Describe the bug
KernelHttpServer calls the Chat completion endpoint instead of the Text completion endpoint. Because of this, sample apps that call it, such as the Book Creator app, receive the following error:

Invalid request: The request is not valid, HTTP status: 404 - Detail: This is not a chat model and thus not supported in the v1/chat/completions endpoint. Did you mean to use v1/completions? Status: 404 (Not Found)

Content: { "error": { "message": "This is not a chat model and thus not supported in the v1/chat/completions endpoint. Did you mean to use v1/completions?", "type": "invalid_request_error", "param": "model", "code": null } }

Headers: Date: Thu, 01 Jun 2023 16:56:14 GMT Connection: keep-alive Access-Control-Allow-Origin: REDACTED openai-organization: REDACTED openai-processing-ms: REDACTED openai-version: REDACTED Strict-Transport-Security: REDACTED x-ratelimit-limit-requests: REDACTED x-ratelimit-limit-tokens: REDACTED x-ratelimit-remaining-requests: REDACTED x-ratelimit-remaining-tokens: REDACTED x-ratelimit-reset-requests: REDACTED x-ratelimit-reset-tokens: REDACTED X-Request-ID: REDACTED CF-Cache-Status: REDACTED Server: cloudflare CF-RAY: REDACTED Alt-Svc: REDACTED Content-Type: application/json Content-Length: 236

This bug was probably introduced in PR #1279. SemanticKernelFactory.cs line 49 should call AddOpenAITextCompletionService() instead of AddOpenAIChatCompletionService(), and line 54 should call AddAzureTextCompletionService() instead of AddAzureChatCompletionService().
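For illustration, here is a minimal sketch of the change being proposed. Only the Add*CompletionService method names come from this issue; the helper class, the KernelConfig receiver, and the parameter names are assumptions, not the repository's actual SemanticKernelFactory.cs code.

```csharp
using Microsoft.SemanticKernel;

// Hypothetical helper showing the suggested switch from chat-completion to
// text-completion service registration. Parameter names and order are
// illustrative and may differ from the SK version used by KernelHttpServer.
internal static class CompletionServiceRegistration
{
    public static void RegisterTextCompletion(
        KernelConfig config,
        string modelOrDeployment,
        string apiKey,
        string? azureEndpoint = null)
    {
        if (azureEndpoint is null)
        {
            // was: config.AddOpenAIChatCompletionService(...), which sends
            // requests to v1/chat/completions and 404s on text-only models
            config.AddOpenAITextCompletionService(modelOrDeployment, apiKey);
        }
        else
        {
            // was: config.AddAzureChatCompletionService(...)
            config.AddAzureTextCompletionService(modelOrDeployment, azureEndpoint, apiKey);
        }
    }
}
```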

craigomatic commented 1 year ago

We updated the samples to use chat-completion models; if you use gpt-3.5-turbo or GPT-4, you should not see this error.
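For context, the path the maintainers chose is to keep the chat-completion registration but point it at a chat-capable model. A minimal sketch under the same assumptions as above (the parameter list is illustrative):

```csharp
using Microsoft.SemanticKernel;

// Sketch only: keep the chat-completion registration but use a chat-capable
// model, so requests to v1/chat/completions succeed. The KernelConfig
// receiver and parameter list are assumptions.
var config = new KernelConfig();
var apiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY")!;
config.AddOpenAIChatCompletionService("gpt-3.5-turbo", apiKey); // or "gpt-4"
```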

andreas-deruiter commented 1 year ago

OK, got it, thank you!