Currently, `EmbeddingClient` fixes the `encoding_format` value to `base64` for better performance:

https://github.com/openai/openai-dotnet/blob/45fc4d72c12314aea83264ebe2e1dc18870e5c06/src/Custom/Embeddings/EmbeddingGenerationOptions.cs#L77-L82
https://github.com/openai/openai-dotnet/blob/45fc4d72c12314aea83264ebe2e1dc18870e5c06/src/Custom/Embeddings/Embedding.cs#L75-L84

It can't be changed, even if I want to use the float format. I want to use this client with text-embeddings-inference, which currently does not support the `encoding_format` parameter. This results in the following error:

> The input is not a valid Base64 string of encoded floats.

I know that `encoding_format` compatibility would be a better approach for those other projects to add, but many OpenAI-compatible APIs don't update as fast as they should. Is it possible to allow users to change the `encoding_format` value? Of course, as the official OpenAI SDK, I would respect a decision to stay compatible only with OpenAI itself.

For now I can serialize the request myself using protocol methods.
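For reference, the protocol-method workaround looks roughly like this. This is only a sketch, not a tested implementation: the endpoint URL and model name are placeholders for a local text-embeddings-inference deployment, and it assumes the SDK's protocol overload `GenerateEmbeddings(BinaryContent, RequestOptions)`, which bypasses `EmbeddingGenerationOptions` (and therefore its pinned `encoding_format`).

```csharp
using System;
using System.ClientModel;
using OpenAI;
using OpenAI.Embeddings;

// Assumption: a text-embeddings-inference server running locally,
// serving a model under this placeholder name.
EmbeddingClient client = new(
    model: "my-embedding-model",
    credential: new ApiKeyCredential("unused"),
    options: new OpenAIClientOptions
    {
        Endpoint = new Uri("http://localhost:8080/v1") // placeholder endpoint
    });

// Build the request body by hand so we control encoding_format ourselves,
// instead of letting EmbeddingGenerationOptions pin it to "base64".
BinaryContent body = BinaryContent.Create(BinaryData.FromObjectAsJson(new
{
    model = "my-embedding-model",
    input = new[] { "Hello, world!" },
    encoding_format = "float"
}));

// The protocol method sends the raw body and returns the raw response.
ClientResult result = client.GenerateEmbeddings(body);
BinaryData response = result.GetRawResponse().Content;

// The embeddings now arrive as plain JSON float arrays, which must be
// parsed manually since the typed Embedding model expects base64.
Console.WriteLine(response.ToString());
```

The trade-off is losing the typed `OpenAIEmbedding` results: the raw JSON has to be deserialized by hand, but the server never sees an `encoding_format` it can't handle.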