Closed: giannik closed this issue 1 year ago.
It's odd to see this in the exception trace:
Microsoft.SemanticKernel.Connectors.AI.OpenAI.AzureSdk.ClientBase.InternalGetEmbeddingsAsync
since I'm not using the Azure SDK.
The method has the following signature:
public static KernelBuilder WithOpenAITextEmbeddingGenerationService(this KernelBuilder builder, string modelId, string apiKey, string? orgId = null, string? serviceId = null, bool setAsDefault = false, HttpClient? httpClient = null)
So you're passing "text-embedding-ada-002" as the second argument, apiKey, which is why it's not working. Change it to the following and it should work:
.WithOpenAITextEmbeddingGenerationService("text-embedding-ada-002", llmProvider.OpenAIKey)
If the intention was to use ada as the service ID, you can adjust the call like so:
.WithOpenAITextEmbeddingGenerationService("text-embedding-ada-002", llmProvider.OpenAIKey, serviceId: "ada")
Hopefully that helps. Also, SK uses the Azure.AI.OpenAI package internally, which is why you're seeing that namespace in the stack trace.
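As an aside, named arguments make it harder to mix up the positional parameters. A minimal sketch of the embedding registration, assuming llmProvider.OpenAIKey holds your actual OpenAI secret key:

var kernel = Kernel.Builder
    // modelId and apiKey spelled out so they can't be swapped by position.
    .WithOpenAITextEmbeddingGenerationService(
        modelId: "text-embedding-ada-002",
        apiKey: llmProvider.OpenAIKey,
        serviceId: "ada")
    .WithMemoryStorage(new VolatileMemoryStore())
    .Build();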
@anthonypuppo Thank you for the response. I tried your suggestion:
var kernel = Kernel.Builder
    .WithLogger(_logger)
    .WithOpenAITextCompletionService(llmProvider.Model, decryptedPassword)
    .WithOpenAITextEmbeddingGenerationService("text-embedding-ada-002", llmProvider.OpenAIKey)
    .WithMemoryStorage(new VolatileMemoryStore())
    .Build();
but I still got a similar error message. I verified the OpenAI key is correct by running it against the chat completion service, and it worked.
ErrorCode: invalid_api_key
Content: { "error": { "message": "Incorrect API key provided: CfDJ8PmP**EPBA. You can find your API key at https://platform.openai.com/account/api-keys.", "type": "invalid_request_error", "param": null, "code": "invalid_api_key" } }
Headers: Date: Fri, 21 Jul 2023 17:00:28 GMT Connection: keep-alive Vary: REDACTED X-Request-ID: REDACTED Strict-Transport-Security: REDACTED CF-Cache-Status: REDACTED Server: cloudflare CF-RAY: REDACTED Alt-Svc: REDACTED Content-Type: application/json; charset=utf-8 Content-Length: 448
at Azure.Core.HttpPipelineExtensions.ProcessMessageAsync(HttpPipeline pipeline, HttpMessage message, RequestContext requestContext, CancellationToken cancellationToken)
at Azure.AI.OpenAI.OpenAIClient.GetEmbeddingsAsync(String deploymentOrModelName, EmbeddingsOptions embeddingsOptions, CancellationToken cancellationToken)
at Microsoft.SemanticKernel.Connectors.AI.OpenAI.AzureSdk.ClientBase.RunRequestAsync[T](Func`1 request) in C:\webapps\semantic-kernel\repo\dotnet\src\Connectors\Connectors.AI.OpenAI\AzureSdk\ClientBase.cs:line 374
--- End of inner exception stack trace ---
at Microsoft.SemanticKernel.Connectors.AI.OpenAI.AzureSdk.ClientBase.RunRequestAsync[T](Func`1 request) in C:\webapps\semantic-kernel\repo\dotnet\src\Connectors\Connectors.AI.OpenAI\AzureSdk\ClientBase.cs:line 411
at Microsoft.SemanticKernel.Connectors.AI.OpenAI.AzureSdk.ClientBase.InternalGetEmbeddingsAsync(IList`1 data, CancellationToken cancellationToken) in C:\webapps\semantic-kernel\repo\dotnet\src\Connectors\Connectors.AI.OpenAI\AzureSdk\ClientBase.cs:line 127
at Microsoft.SemanticKernel.AI.Embeddings.EmbeddingGenerationExtensions.GenerateEmbeddingAsync[TValue,TEmbedding](IEmbeddingGeneration`2 generator, TValue value, CancellationToken cancellationToken) in C:\webapps\semantic-kernel\repo\dotnet\src\SemanticKernel.Abstractions\AI\Embeddings\EmbeddingGenerationServiceExtensions.cs:line 29
at Microsoft.SemanticKernel.Memory.SemanticTextMemory.SaveInformationAsync(String collection, String text, String id, String description, String additionalMetadata, CancellationToken cancellationToken) in C:\webapps\semantic-kernel\repo\dotnet\src\SemanticKernel\Memory\SemanticTextMemory.cs:line 38
at Intelli.Embeddings.Search.VolatileSearch.Core.Services.VolatileQueryService.SearchAsync(String indexName, String query, LLMProvider provider) in C:\IntelliDev\src\IntelliCore\Intelli.Embeddings.Search.VolatileSearch.Core\Services\VolatileQueryService.cs:line 110
var kernel = Kernel.Builder
    .WithLogger(_logger)
    .WithOpenAITextCompletionService(llmProvider.Model, decryptedPassword)
    .WithOpenAITextEmbeddingGenerationService("text-embedding-ada-002", llmProvider.OpenAIKey)
    .WithMemoryStorage(new VolatileMemoryStore())
    .Build();
You're providing two different values for the API key across those calls (decryptedPassword and llmProvider.OpenAIKey). Since it looks like you're decrypting the key before use, I assume both should be decryptedPassword?
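If so, the corrected builder would look something like this (a sketch; it assumes decryptedPassword holds the plaintext OpenAI key for both services):

var kernel = Kernel.Builder
    .WithLogger(_logger)
    .WithOpenAITextCompletionService(llmProvider.Model, decryptedPassword)
    // Use the same decrypted key for the embedding service; passing the
    // still-encrypted llmProvider.OpenAIKey is what produces invalid_api_key.
    .WithOpenAITextEmbeddingGenerationService("text-embedding-ada-002", decryptedPassword)
    .WithMemoryStorage(new VolatileMemoryStore())
    .Build();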
I'm embarrassed! That was it. I was passing the still-encrypted OpenAI key instead of the decrypted one. Thank you for opening my eyes.
When running the following code to use the text embedding model (text-embedding-ada-002), I get the following error: Incorrect API key provided: text-emb**-002. The error occurs when calling kernel.Memory.SaveInformationAsync().
The key and password are valid, because when I try it without the OpenAITextEmbeddingGenerationService but with the OpenAIChatCompletionService it works. See the detailed error message below.
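The failing call is essentially the following (simplified; the collection name, id, and text are placeholders):

// Sketch of the memory write that triggers the embedding request,
// which is where the API key is actually validated.
await kernel.Memory.SaveInformationAsync(
    collection: "my-index",
    text: "Some text to embed and store.",
    id: "doc-001");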