Closed: xiaozhiliaoo closed this issue 4 years ago
The huge key problem is defined as follows: the key's value is so large that we can't load it into memory all at once, so we split the huge key into small keys with ValueIterableRdbVisitor.
For example, suppose we have a hash key like this:
127.0.0.1:6379> hmset name field0 value0 field1 value1 ... field130 value130
name
field0, value0
field1, value1
.....
field130, value130
If we use ValueIterableRdbVisitor to parse this key:
Replicator r = new RedisReplicator("redis:///path/to/dump.rdb");
// use the iterating visitor so huge values arrive in batches instead of one giant event
r.setRdbVisitor(new ValueIterableRdbVisitor(r));
r.addEventListener(new ValueIterableEventListener(new EventListener() {
    @Override
    public void onEvent(Replicator replicator, Event event) {
        if (event instanceof BatchedKeyStringValueHash) {
            BatchedKeyStringValueHash hash = (BatchedKeyStringValueHash) event;
            int batch = hash.getBatch();          // 0-based batch index
            boolean isLast = hash.isLast();       // true for the final batch of this key
            byte[] key = hash.getKey();           // the same key for every batch
            Map<byte[], byte[]> value = hash.getValue(); // at most one batch of fields
        }
    }
}));
r.open();
we can see that the huge key is split into small keys of 64 elements each (the default; this parameter is configurable), like the following:
------------------------------
batch : 0
isLast : false
key : name
value:
field0, value0
field1, value1
...
field63, value63
------------------------------
batch : 1
isLast : false
key : name
value:
field64, value64
field65, value65
...
field127, value127
------------------------------
batch : 2
isLast : true
key : name
value:
field128, value128
field129, value129
field130, value130
------------------------------
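The splitting shown above can be sketched in plain Java, independent of the library. The Batch class and split method below are illustrative stand-ins (not redis-replicator APIs) that mirror the batch/isLast/key/value fields of BatchedKeyStringValueHash:

```java
import java.util.*;

public class BatchSplitSketch {
    // Illustrative holder mirroring BatchedKeyStringValueHash's fields.
    static final class Batch {
        final int batch;
        final boolean isLast;
        final String key;
        final Map<String, String> value;
        Batch(int batch, boolean isLast, String key, Map<String, String> value) {
            this.batch = batch; this.isLast = isLast; this.key = key; this.value = value;
        }
    }

    // Split one huge hash into batches of at most batchSize fields each.
    static List<Batch> split(String key, Iterator<Map.Entry<String, String>> fields, int batchSize) {
        List<Batch> out = new ArrayList<>();
        int index = 0;
        Map<String, String> current = new LinkedHashMap<>();
        while (fields.hasNext()) {
            Map.Entry<String, String> e = fields.next();
            current.put(e.getKey(), e.getValue());
            // emit a batch when full, or when the key's fields are exhausted
            if (current.size() == batchSize || !fields.hasNext()) {
                out.add(new Batch(index++, !fields.hasNext(), key, current));
                current = new LinkedHashMap<>();
            }
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, String> hash = new LinkedHashMap<>();
        for (int i = 0; i <= 130; i++) hash.put("field" + i, "value" + i);
        for (Batch b : split("name", hash.entrySet().iterator(), 64)) {
            System.out.println("batch=" + b.batch + " isLast=" + b.isLast + " size=" + b.value.size());
        }
        // prints: batch=0 isLast=false size=64
        //         batch=1 isLast=false size=64
        //         batch=2 isLast=true size=3
    }
}
```

With 131 fields and a batch size of 64, this yields exactly the three batches listed above (64 + 64 + 3).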
We load these small keys piece by piece, which bounds the memory we need.
Note that only hash, set, zset, and list values can be split; string, stream, and module values cannot.
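To see why this saves memory: the consumer only ever holds one batch at a time, because each onEvent call can forward its fields (e.g. as one HMSET to a target) and discard them before the next batch arrives. A minimal sketch of that consumption pattern (writeBatch is a hypothetical stand-in for a target-side writer, not library code):

```java
import java.util.*;

public class PieceByPieceSketch {
    static int maxHeld = 0; // track the peak number of fields held at once

    // Stand-in for forwarding one batch to a target (e.g. one HMSET per batch).
    static void writeBatch(String key, Map<String, String> fields) {
        maxHeld = Math.max(maxHeld, fields.size());
        // a real writer would issue: HMSET key field value ... for this batch only
    }

    public static void main(String[] args) {
        // Simulate 131 fields arriving in batches of 64, as the visitor delivers them.
        Map<String, String> batch = new LinkedHashMap<>();
        for (int i = 0; i <= 130; i++) {
            batch.put("field" + i, "value" + i);
            if (batch.size() == 64 || i == 130) {
                writeBatch("name", batch); // process, then discard the batch
                batch = new LinkedHashMap<>();
            }
        }
        System.out.println("max fields held at once = " + maxHeld);
        // prints: max fields held at once = 64  (not 131)
    }
}
```

Peak memory is proportional to the batch size (64), not to the total size of the huge key, which is the whole point of the split.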
(Original question) I'm confused about the huge key solution. What does section 5.8, "Handle huge key value pair", mean? Why can HugeKVFileExample.java and HugeKVSocketExample.java solve the huge key problem? Can you clarify the internals?