prhbrt opened 1 day ago
Hey @prhbrt! I'm here to help with any bugs, questions, or contributions you have. Let's tackle this together!
The discrepancy you've noted between the documentation and the code regarding the `embed_batch_size` default value is a valid concern: the documentation states the default is 10, while the code sets it to 100. Either the documentation or the code should be updated so that both reflect the same default.
I think OpenAI is the exception here, because its batch API is very fast (you wouldn't want a batch size of 100 with a local embedding model, for example; it would eat all your GPU memory).
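For context, `embed_batch_size` controls how many texts are sent per embedding request, which is why a large default is fine for a hosted API but risky for a local model. A minimal sketch of the batching pattern involved (a generic illustration, not the library's actual implementation; the helper name `batched` is hypothetical):

```python
def batched(items, batch_size=10):
    """Yield successive batches of at most batch_size items.

    With batch_size=10 (the documented default), a list of 25 texts
    would be embedded in three requests of 10, 10, and 5 items;
    with batch_size=100 (the coded default), in a single request.
    """
    for i in range(0, len(items), batch_size):
        yield items[i:i + batch_size]


texts = [f"doc {i}" for i in range(25)]
sizes_10 = [len(b) for b in batched(texts, batch_size=10)]
sizes_100 = [len(b) for b in batched(texts, batch_size=100)]
print(sizes_10)   # three batches
print(sizes_100)  # one batch
```

This is only to show why the default matters: a larger batch size means fewer, bigger requests, which trades throughput against per-request memory use.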
Bug Description
Yes: the documentation claims the default `embed_batch_size` is 10.
But: the code defaults to 100.
Version
github default branch
Steps to Reproduce
Open both URLs and compare the stated default values.
Relevant Logs/Tracebacks
No response