Open Snikch63200 opened 1 month ago
The `batch_size` parameter in the paperQA codebase is primarily used for processing text documents in batches during the embedding process: it determines how many documents are processed together in a single batch when calling the embedding function [1]. However, there is no specific information in the codebase that directly links changes in `batch_size` to variations in answer quality or speed for the openai/mixtral:8x7b model [1].

The impact of `batch_size` might not be noticeable in terms of answer quality or speed, especially if the bottleneck is elsewhere in the system, or if the model and hardware can handle larger batches efficiently.
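As a rough illustration of why results should not change, here is a minimal sketch of how a `batch_size` parameter is typically used in an embedding loop. The names (`embed_in_batches`, `embed_fn`) are hypothetical, not paperQA's actual API: the point is that `batch_size` only controls how many texts go into each call, not what embeddings come back.

```python
from typing import Callable, Sequence

def embed_in_batches(
    texts: Sequence[str],
    embed_fn: Callable[[Sequence[str]], list],
    batch_size: int = 1,
) -> list:
    """Embed texts by slicing them into chunks of batch_size."""
    embeddings = []
    for i in range(0, len(texts), batch_size):
        batch = texts[i : i + batch_size]
        embeddings.extend(embed_fn(batch))
    return embeddings

# With a stand-in embedding function, changing batch_size alters the
# number of calls made, but the resulting embeddings are identical --
# consistent with seeing no change in answer quality.
calls = []
def fake_embed(batch):
    calls.append(len(batch))
    return [[float(len(t))] for t in batch]

texts = ["a", "bb", "ccc", "dddd", "eeeee"]
out = embed_in_batches(texts, fake_embed, batch_size=2)
print(len(calls), out)  # 3 calls: batches of sizes 2, 2, 1
```

Speed would only improve if the embedding backend actually exploits larger batches (e.g. fewer network round trips or better GPU utilization); otherwise the loop above behaves the same at any `batch_size`.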
To continue talking to Dosu, mention @dosu.
Hello,

I changed the batch size from 1 (default) to 8 and then 32, and saw no change in paperQA's behaviour (answer quality and speed), as follows:

Does batch size have any impact on paperQA's behaviour?