What would you like to see?
Description:
Currently, the Generic OpenAI Embedder doesn't offer a way to specify a maximum number of concurrent chunks for embedding and always defaults to 500. This limits its usability with OpenAI-compatible embedders that impose restrictions on chunk batch size. For example, when using an embedder with a batch size limit of 32, any document with more chunks than that results in a 413 (Payload Too Large) error.
Request:
Add an optional maxConcurrentChunks parameter to the Generic OpenAI Embedder UI, allowing users to control the maximum number of chunks sent per embedding request. This would allow embedders that enforce batch size limits to work correctly.
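For illustration, the requested behavior could be as simple as splitting the chunk list by the configured limit before dispatching embedding requests. This is a minimal sketch, not the actual embedder code; the `maxConcurrentChunks` name and `toBatches` helper are assumptions for this example.

```javascript
// Hypothetical sketch: batch chunks by a user-configurable limit instead of
// the hard-coded default of 500.
function toBatches(chunks, maxConcurrentChunks = 500) {
  const batches = [];
  for (let i = 0; i < chunks.length; i += maxConcurrentChunks) {
    batches.push(chunks.slice(i, i + maxConcurrentChunks));
  }
  return batches;
}

// e.g. with a backend limited to 32 chunks per request, a 70-chunk document
// would be sent as three requests of 32, 32, and 6 chunks:
// toBatches(chunks, 32)
```

Each batch would then be embedded in its own request, so a backend with a 32-chunk limit never receives an oversized payload.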