addthis / stream-lib

Stream summarizer and cardinality estimator.
Apache License 2.0

BloomFilter doesn't work when amount of bits > 2^32 #163

Open ajtkulov opened 4 years ago

ajtkulov commented 4 years ago

In my production case, we need a BloomFilter with a much larger capacity (about 4-5 GB of RAM, i.e. more than 32 billion bits).

The code depends on java.util.BitSet, whose constructor is

public BitSet(int nbits)

and whose getters/setters all take int indices.
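
A quick standalone illustration (not code from the library) of why an int-indexed BitSet cannot back a filter of that size; the numbers follow the 4-5 GB figure above:

```java
public class BitSetLimit {
    public static void main(String[] args) {
        // Roughly 5 GB worth of bits, as in the report above.
        long bitsNeeded = 5L * 1024 * 1024 * 1024 * 8; // 42,949,672,960 bits

        // java.util.BitSet(int nbits) and all of its get/set methods take
        // int indices, so anything past Integer.MAX_VALUE bits is unreachable.
        System.out.println(bitsNeeded > Integer.MAX_VALUE); // true

        // Casting the size to int truncates it, so new BitSet((int) bitsNeeded)
        // would not allocate anywhere near the requested capacity.
    }
}
```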

abramsm commented 4 years ago

Can you partition your data in some way so that you can then use a list of n smaller BloomFilters to hold each chunk of data?

ajtkulov commented 4 years ago

It would work, but the drawback is that a contains query has to check every chunk, so lookup complexity grows by a factor of N (the number of chunks). The add method also becomes more involved.

Actually, I have a "solution" (I'm not a Java native, so excuse me in advance). There is a fork with a lot of copy-pasted code: its own BitSet indexed by long, plus a BloomFilter hierarchy on top of it: https://github.com/ajtkulov/stream-lib. It works, but I'm not sure it is idiomatic Java or polished enough for a PR.
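
For reference, a minimal sketch of a long-indexed bit set in the spirit of what the fork describes; the class and method names here are illustrative, not the fork's actual code:

```java
public class LongBitSet {
    private final long[] words;

    public LongBitSet(long nbits) {
        // One 64-bit word per 64 bits. A single long[] holds up to
        // Integer.MAX_VALUE words, i.e. roughly 2^37 bits (~16 GB),
        // which is plenty for a 4-5 GB filter.
        words = new long[(int) ((nbits + 63) >>> 6)];
    }

    public void set(long bitIndex) {
        words[(int) (bitIndex >>> 6)] |= 1L << (bitIndex & 63);
    }

    public boolean get(long bitIndex) {
        return (words[(int) (bitIndex >>> 6)] & (1L << (bitIndex & 63))) != 0;
    }
}
```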

vitusya commented 4 years ago

Why would complexity increase by N chunks? You need to partition your data by hash: index_of_bloom = long_hash % number_of_chunks. Lookup complexity stays the same, and memory is num_chunks * bloomfilter_memory. See the sketch below.
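
A minimal sketch of that routing, assuming stream-lib's BloomFilter(int numElements, double maxFalsePosProbability) constructor and its add/isPresent(String) methods (adjust if the actual API differs):

```java
import com.clearspring.analytics.stream.membership.BloomFilter;

public class PartitionedBloomFilter {
    private final BloomFilter[] chunks;

    public PartitionedBloomFilter(int numChunks, int elementsPerChunk, double fpp) {
        chunks = new BloomFilter[numChunks];
        for (int i = 0; i < numChunks; i++) {
            // Assumed stream-lib constructor: (expected elements, false-positive rate).
            chunks[i] = new BloomFilter(elementsPerChunk, fpp);
        }
    }

    // Route each key to exactly one chunk, so add/contains stay O(1):
    // index_of_bloom = long_hash % number_of_chunks.
    private int chunkFor(String key) {
        long h = key.hashCode() & 0xffffffffL; // any stable hash works here
        return (int) (h % chunks.length);
    }

    public void add(String key) {
        chunks[chunkFor(key)].add(key);
    }

    public boolean isPresent(String key) {
        return chunks[chunkFor(key)].isPresent(key);
    }
}
```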

ajtkulov commented 4 years ago

@vitusya Agreed, thank you, that is much, much simpler.

The only theoretical issue is that, if you build the Bloom filter from dynamic data, a malicious actor could craft items that all hash to the same bucket. But that concern applies to all hash-based structures, and it is a theoretical rather than a practical problem.