Open kirnap opened 1 year ago
Any more findings on this yet?
Not from my end
Most likely it's something to do with the underlying HF transformers package. There's a lot of finger pointing, but still no resolution at this point, unfortunately. Relevant GitHub issues: https://github.com/UKPLab/sentence-transformers/issues/2312 https://github.com/huggingface/transformers/issues/2401
I'm having the same issue. I tried manipulating other things, like the order or content of the batch, but the only factor that affects this is the batch size.
Same here. I'm getting different embeddings for different batch_size values. The embeddings start to differ from about the 7th decimal place.
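To make "differ from about the 7th decimal place" concrete, here's a minimal sketch (the arrays are hypothetical stand-ins for two `model.encode` outputs, not real model results): differences at that magnitude are ordinary float32 noise and pass an `np.allclose` check with a reasonable `atol`.

```python
import numpy as np

# Hypothetical stand-ins for embeddings produced with two batch_size values.
emb_a = np.array([0.12345678, -0.98765432], dtype=np.float32)
emb_b = emb_a + np.float32(1e-7)  # simulate batch_size-dependent drift

# Largest element-wise deviation is on the order of 1e-7.
max_diff = np.max(np.abs(emb_a - emb_b))
print(max_diff)

# With an absolute tolerance above the drift, the embeddings compare equal.
print(np.allclose(emb_a, emb_b, atol=1e-6))  # → True
```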
Hi,
I recently discovered that
model.encode
method does not give exactly the same embedding for different batch_size values. However, they're still close when I play with atol (absolute tolerance). Is this expected behaviour or a bug? You may find a minimal code snippet to replicate the conflicting embeddings:
This prints out the following results:
Thanks in advance!
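The likely root cause can be shown without any model at all (this is a sketch of the general floating-point effect, not the poster's original snippet): float32 arithmetic is not associative, and changing batch_size changes the grouping in which the same numbers are accumulated inside the kernels.

```python
import numpy as np

# Float32 addition is not associative: the same numbers, grouped
# differently, can round to different results.
big = np.float32(1e8)
one = np.float32(1.0)

print((big + one) - big)  # → 0.0 (the 1.0 is absorbed by rounding)
print((big - big) + one)  # → 1.0 (same numbers, different grouping)
```

Since a different batch_size regroups the reductions inside the forward pass, tiny discrepancies like those reported here are expected; comparing embeddings with `np.allclose(..., atol=...)` rather than exact equality is the usual workaround.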