gordicaleksa / Open-NLLB

Effort to open-source NLLB checkpoints.
MIT License

MinHash: benchmark memory, speed and accuracy with varying r and b #23

Open vienneraphael opened 1 year ago

vienneraphael commented 1 year ago

This issue concerns fuzzy deduplication of text pairs.

Find the trade-off between memory, speed, and accuracy when varying r and b. We need a way to use far fewer than 9k, because it requires too much CPU.

CreativeSelf0 commented 11 months ago

Fine-tuning the MinHash parameters r (rows per band) and b (number of bands) depends heavily on the data and on the specific use case, such as fuzzy deduplication of text pairs in your scenario. These parameters control the trade-off between accuracy and performance (speed and memory usage):

  1. Accuracy: Increasing r and b generally increases the accuracy of the MinHash similarity estimates, as more hash functions are utilized, capturing more aspects of the data (a small sketch of the resulting candidate-pair probability follows this list).
  2. Speed: However, higher values of r and b can slow down the computation, as more hash functions need to be evaluated.
  3. Memory Usage: Likewise, more memory is required to store the additional hash values.
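
For intuition, the standard LSH banding analysis says that a pair with Jaccard similarity s becomes a candidate with probability 1 - (1 - s^r)^b, where r·b is the total number of hash functions. Here is a tiny dependency-free sketch of that curve for a few settings; the (r, b) values below are just examples, not the project's current configuration:

```python
# Probability that a pair with Jaccard similarity s shares at least one LSH
# bucket, under banding with b bands of r rows each: 1 - (1 - s^r)^b.
def candidate_probability(s: float, r: int, b: int) -> float:
    return 1.0 - (1.0 - s ** r) ** b

# Example (r, b) settings; r * b is the total number of hash functions.
for r, b in [(8, 32), (4, 64), (2, 128)]:
    curve = ", ".join(
        f"s={s:.1f}: {candidate_probability(s, r, b):.3f}"
        for s in (0.5, 0.7, 0.8, 0.9)
    )
    print(f"r={r:3d}, b={b:3d}, num_perm={r * b:4d} -> {curve}")
```

Larger r makes the curve steeper around a higher similarity threshold (fewer false positives), while larger b shifts it toward lower similarities (fewer false negatives), so the grid you benchmark should bracket the similarity level you consider a duplicate.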

The ideal settings for r and b will vary depending on your data and requirements.

An empirical approach, where you run experiments with different values of r and b on a representative subset of your data, can be very informative. By analyzing how the performance metrics (speed, memory usage, and accuracy) change with different settings, you can better understand the trade-offs and find an optimal configuration for your specific use case.
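
A minimal benchmark sketch of that empirical approach, assuming the `datasketch` library; the toy corpus, the (b, r) grid, and the 0.8 threshold are placeholders to replace with a representative subset of the real text pairs:

```python
import time
import tracemalloc
from itertools import combinations

from datasketch import MinHash, MinHashLSH  # assumption: pip install datasketch

NUM_PERM = 256                        # total hash functions; must be >= b * r
GRID = [(32, 8), (64, 4), (128, 2)]   # (b, r) settings to compare
THRESHOLD = 0.8                       # Jaccard similarity treated as "duplicate"

# Toy corpus with overlapping token windows; swap in a representative
# subset of the real text pairs here.
docs = {
    f"doc{i}": set(f"token{j}" for j in range(i % 5, i % 5 + 30))
    for i in range(200)
}

def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b)

# Brute-force ground truth, feasible only on a small subset.
true_pairs = {
    tuple(sorted(pair))
    for pair in combinations(docs, 2)
    if jaccard(docs[pair[0]], docs[pair[1]]) >= THRESHOLD
}

# Build MinHash signatures once; they are shared across all (b, r) runs.
minhashes = {}
for name, tokens in docs.items():
    m = MinHash(num_perm=NUM_PERM)
    for tok in tokens:
        m.update(tok.encode("utf-8"))
    minhashes[name] = m

for b, r in GRID:
    tracemalloc.start()
    start = time.perf_counter()

    # Index with explicit banding parameters instead of threshold-derived ones.
    lsh = MinHashLSH(threshold=THRESHOLD, num_perm=NUM_PERM, params=(b, r))
    for name, m in minhashes.items():
        lsh.insert(name, m)

    # Collect candidate pairs by querying every document against the index.
    candidates = set()
    for name, m in minhashes.items():
        for other in lsh.query(m):
            if other != name:
                candidates.add(tuple(sorted((name, other))))

    elapsed = time.perf_counter() - start
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()

    tp = len(candidates & true_pairs)
    precision = tp / len(candidates) if candidates else 1.0
    recall = tp / len(true_pairs) if true_pairs else 1.0
    print(
        f"b={b:4d} r={r:2d} | {elapsed:6.2f}s | peak {peak / 1e6:6.1f} MB | "
        f"precision {precision:.3f} | recall {recall:.3f}"
    )
```

Note that the peak-memory figure only covers the LSH index and candidate set (the signatures are built before measurement starts), which is the part that grows with b; signature memory itself scales with num_perm regardless of how it is split into bands.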