kumavis closed this issue 9 years ago
Never sure how to do a proper perf test, since compiler optimizations are so dependent on context, but it seems to take an additional ~18% to serialize the binary buffer:
var bin = Buffer('e01022dceb8c3202', 'hex') // bin.length => 8
var str = Buffer('haaywurl', 'utf8') // str.length => 8
bin.toString() //=> '�\u0010"���2\u0002'
str.toString() //=> 'haaywurl'
var iterations = 1000000
console.log( test(bin, iterations)[1]/iterations ) // ns per call (total run stays under 1s, so the seconds field is 0)
console.log( test(str, iterations)[1]/iterations )

function test(buf, iterations){
  var start = process.hrtime()
  for(var i=0; i<iterations; i++) {
    buf.toString()
  }
  return process.hrtime(start) // [seconds, nanoseconds] elapsed
}
╭─{ } kumavis in ~/Development/Node/merkle-patricia-tree on (master✱)
╰─± node buf-test.js
419.518481
356.506166
╭─{ } kumavis in ~/Development/Node/merkle-patricia-tree on (master✱)
╰─± node
> 419.518481 / 356.506166
1.1767495796973115
Hmm, I think keyEncoding is a levelup thing, so I'm not sure why I filed this here.
If keys are buffers, they are coerced to utf8 strings. utf8 strings can't represent arbitrary binary data (e.g. hashes): invalid utf8 byte sequences are replaced with the replacement character (�, U+FFFD), potentially causing unintended key collisions. I'd also imagine the subsequent throw-catch ruins performance.
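A minimal sketch of the collision (the byte values here are made up for illustration, not taken from the tree code): two distinct binary keys can coerce to the same utf8 string once their invalid lead bytes are replaced.

```javascript
// 0xe0 and 0xff are both invalid as standalone utf8 lead bytes,
// so each decodes to a single replacement character (U+FFFD).
// (Buffer.from is the non-deprecated spelling of the Buffer('…','hex') calls above.)
var a = Buffer.from('e01022', 'hex')
var b = Buffer.from('ff1022', 'hex')

var aStr = a.toString('utf8') // '\ufffd\u0010"'
var bStr = b.toString('utf8') // '\ufffd\u0010"'

// Different buffers, identical string keys -> collision
console.log(a.equals(b))   // false
console.log(aStr === bStr) // true
```

Any store that coerces buffer keys to utf8 would treat `a` and `b` as the same key here.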