Closed aminshahnazari closed 2 months ago
+1
I used the previous version of the Kernel Memory service, and everything worked smoothly:
kernelmemory/service:sha-07a1d0d
@alex521
"Endpoint": "xxx",
are you using a custom endpoint?
I have the same exception using a custom endpoint to LM Studio
I have the same bug when using a custom endpoint.
As you can see from the stack trace, the error is thrown by OpenAI .NET library, which is an internal dependency of Semantic Kernel.
OpenAI.Embeddings.Embedding.<ConvertToVectorOfFloats>g__ThrowInvalidData|11_0()
LM Studio and similar projects attempt to mimic the OpenAI service output; however, the output sometimes has minor differences that can lead to exceptions, for example when a type doesn't match or a special token is missing.
The previous version of KM used an older version of Semantic Kernel, which used different code/libraries/versions.
I would report the issue to LM Studio asking to check for differences and compatibility with OpenAI .NET library.
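To make the failure mode concrete, here is a minimal sketch (in Python, as an illustration only; the actual client is the OpenAI .NET library) of what decoding a base64-encoded embedding looks like. The OpenAI client can request embeddings as base64-encoded little-endian float32 bytes; a server that instead returns a plain JSON float array will fail the base64 decode, analogous to the `FormatException` in the stack trace. The function name and sample values below are hypothetical.

```python
import base64
import json
import struct


def decode_base64_embedding(data: str) -> list[float]:
    """Decode an embedding delivered as base64-encoded
    little-endian float32 values (4 bytes per float)."""
    raw = base64.b64decode(data, validate=True)
    return list(struct.unpack(f"<{len(raw) // 4}f", raw))


# A server that honors the base64 encoding format returns something like:
ok = base64.b64encode(struct.pack("<3f", 0.1, 0.2, 0.3)).decode()
print(decode_base64_embedding(ok))

# A server that ignores the requested format and returns a JSON float
# array instead fails base64 validation, much like the error above.
try:
    decode_base64_embedding(json.dumps([0.1, 0.2, 0.3]))
except Exception as e:
    print(f"decode failed: {type(e).__name__}")
```

This is why the same endpoint can work with one client version and break with another: only clients that request and strictly validate the base64 format surface the mismatch.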
Context / Scenario
I am trying to use the KernelMemory embedding scenario with the basic configuration of the
kernelmemory/service
Docker container. Here is the configuration summary in
appsettings.Production.json:
What happened?
When I attempt to embed something or use the ask API, I encounter the following error in the container console:
System.FormatException: The input is not a valid Base64 string of encoded floats.
   at OpenAI.Embeddings.Embedding.<ConvertToVectorOfFloats>
Importance
I cannot use Kernel Memory at all with this configuration.
Platform, Language, Versions
Docker Image:
kernelmemory/service:sha-d6af98f
Relevant log output