Closed shunjikonishi closed 2 years ago
This happens on the latest v14. On v14.2, one particular function hit 2 million runs via Benchmark.js; on the latest v14 (v14.11), the same function runs only 198k times. What happened? Not only that: replacing a deeply nested value within a collection also seems 8 to 12 times slower. Is this intended?
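For context, ops/sec comparisons like the above can be reproduced with a tiny self-contained harness; the workload below is hypothetical (not the reporter's actual function), and `opsPerSec` is just an illustrative stand-in for Benchmark.js:

```javascript
// Minimal ops/sec measurement, an illustrative stand-in for Benchmark.js.
function opsPerSec(fn, ms = 200) {
  const end = Date.now() + ms;
  let ops = 0;
  while (Date.now() < end) { fn(); ops++; }
  return Math.round(ops / (ms / 1000));
}

// Hypothetical workload: replace a deeply nested value in a collection.
const data = { a: { b: { c: [0, 1, 2] } } };
const rate = opsPerSec(() => { data.a.b.c[1] = Math.random(); });
console.log(rate > 0); // sanity check; compare the rate across Node versions
```

Running this under each Node version and comparing the reported rates is enough to see a regression of this magnitude.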
Any update on this issue? I updated the Node.js version on my Swarm cluster and it scaled up to 8 times as many containers; I thought I was being DDoS'ed, but it was just Node being slower.
This seems to be array-related. The difference between the cases appears to be in how many tokens are emitted in each batch.
Reduced testcase:
const total = 200000
const batch = 32768
const tokens = []
console.log(process.version)
console.time()
for (let t = 0; t < total; t += batch) {
for (let i = 0; i < batch; i++) tokens.push(0)
for (let i = 0; i < batch; i++) tokens.splice(0, 1)
}
console.timeEnd()
v10.22.1
default: 36.884ms
v12.18.4
default: 2163.878ms
v14.12.0
default: 2.470s
Looks like an upstream v8 issue.
Upd: reduced further.
~@ChALkeR the test case is not equivalent: with the original, v14 is faster than v12, but still considerably slower than v10. I wonder if we're dealing with two issues here.~ Actually, ignore that; I was running the original example without passing a file to it. Definitely a V8 bug.
Reproducible in Chromium 85.0.4183.102.
Testcase:
const total = 256000
for (let batch = 1000; batch < total; batch *= 2) {
const tokens = []
console.time(batch)
for (let t = 0; t < total; t += batch) {
for (let i = 0; i < batch; i++) tokens.push(0)
for (let i = 0; i < batch; i++) tokens.splice(0, 1)
}
console.timeEnd(batch)
}
Result in Chromium 85.0.4183.102:
1000: 27.48291015625 ms
2000: 27.8701171875 ms
4000: 28.880615234375 ms
8000: 52.298095703125 ms
16000: 111.510009765625 ms
32000: 737.64990234375 ms
64000: 5127.23095703125 ms
128000: 12488.43505859375 ms
Result in Node.js v14.12.0:
1000: 23.391ms
2000: 31.063ms
4000: 55.71ms
8000: 109.065ms
16000: 435.397ms
32000: 1.196s
64000: 1.200s
128000: 9.947s
Seems like the ArraySplice builtin is considerably slower for whatever code path we're hitting.
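As a possible mitigation while the upstream bug is open (a hypothetical workaround, not verified against this specific regression): for this pattern, `splice(0, 1)` can be replaced by `shift()`, which removes the first element through a different builtin and is semantically equivalent here:

```javascript
// splice(0, 1) and shift() both remove the first element; shift() may
// avoid the slow ArraySplice path (an assumption, not verified upstream).
const a = [1, 2, 3, 4];
const b = [1, 2, 3, 4];

const fromSplice = a.splice(0, 1)[0]; // splice returns an array of removed elements
const fromShift = b.shift();          // shift returns the removed element directly

console.log(fromSplice === fromShift);                // true
console.log(JSON.stringify(a) === JSON.stringify(b)); // true, both are [2, 3, 4]
```

Whether `shift()` is actually faster on the affected versions would need to be measured with the testcase above.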
v10
v12/v14 (will post only one but the flamegraphs are similar)
(interactive SVGs available at https://gist.github.com/mmarchini/5da624b8c62fe1ecfe4944726a3cf4c6 )
@ChALkeR do you want to open a bug upstream? Otherwise I can do it.
@mmarchini You can go for it if you want to. :smile_cat:
BTW I have a feeling there's already a bug for it :)
Welp, the V8 bug tracker is broken.
@nodejs/v8 this seems like a regression upstream present since at least 7.4. I tried to open an issue in V8 bug tracker but it just reloads the page (no errors) when I click "Submit issue". Can you take a look and create the issue upstream?
Thanks @mmarchini, I created a bug in V8: https://crbug.com/v8/10962. I'll get back to you when I have something.
Still open on the V8 side, but I'll close the issue here as not actionable.
Version: v12.13.0 v12.16.2 v14.0.0
Platform: Darwin konishishunjinoMacBook-Pro.local 19.4.0 Darwin Kernel Version 19.4.0: Wed Mar 4 22:28:40 PST 2020; root:xnu-6153.101.6~15/RELEASE_X86_64 x86_64
Subsystem: None
What steps will reproduce the bug?
I'm writing a simple file-handling program. It runs fast enough on v10.x, but it is sometimes more than 10 times slower on v12.x.
Strangely, this doesn't depend on file size: it can process a 2MB file in 200ms, while another 400KB file takes 3500ms.
I've run it with the
--prof
option. It says most of the time is spent in LazyCompile, so I suspect a bug in the JIT. A reproduction program and a detailed explanation are here: https://github.com/shunjikonishi/node-performance-problem
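For anyone wanting to reproduce the profiling step, the usual `--prof` workflow looks like this (the script below is an illustrative stand-in for the actual program, and file names are whatever V8 emits on your machine):

```shell
# Generate a small splice-heavy workload (illustrative, not the reporter's program).
cat > repro.js <<'EOF'
const tokens = [];
for (let i = 0; i < 20000; i++) tokens.push(0);
for (let i = 0; i < 20000; i++) tokens.splice(0, 1);
EOF

# Run with the V8 sampling profiler; this writes an isolate-*-v8.log file.
node --prof repro.js

# Post-process the tick log into a human-readable summary
# (LazyCompile entries show up under the [JavaScript] section).
node --prof-process isolate-*-v8.log > profile.txt
head -n 5 profile.txt
```

If most ticks land in builtins or LazyCompile frames rather than in your own functions, that points at the runtime rather than the application code.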
How often does it reproduce? Is there a required condition?
I've also stored my test data in the above repo. It reproduces 100% in my environment.
What is the expected behavior?
Hopefully the same performance as v10.x.
Performance on v10.x
Performance on v12.x