telosnetwork / telosevm-translator

Inconsistent data after a while on 2.0 runs #77

Closed · guilledk closed this 4 months ago

guilledk commented 5 months ago

Jesse and Amir ran into a data inconsistency/integrity problem on locally generated 2.0 data:

Amir Pasha, [6/8/24 6:39 PM] So here is what I found that could be a bug in the translator. These are two consecutive blocks, 180728985 and 180728984:

Some(ExecutionPayloadV1 { parent_hash: 0x4f48afbe897471d316f39ca0066def23e61fbec1aa195fdce34116bb49665164, fee_recipient: 0x0000000000000000000000000000000000000000, state_root: 0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421, receipts_root: 0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421, logs_bloom: 0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000, prev_randao: 0x0000000000000000000000000000000000000000000000000000000000000000, block_number: 180728985, gas_limit: 2147483647, gas_used: 0, timestamp: 1635278401, extra_data: 0x0ac5b4bc1f3be2bf803fb42acc3804eae1a3a6ceb1b61e5dc795867319019007, base_fee_per_gas: 0x0000000000000000000000000000000000000000000000000000000000000007_U256, block_hash: 0xfd357ea10d34bb88b1b29154b7e51adea1b90f8c291ea60b308a42fe6ef85563, transactions: [] })

Some(ExecutionPayloadV1 { parent_hash: 0x4f48afbe897471d316f39ca0066def23e61fbec1aa195fdce34116bb49665164, fee_recipient: 0x0000000000000000000000000000000000000000, state_root: 0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421, receipts_root: 0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421, logs_bloom: 0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000, prev_randao: 0x0000000000000000000000000000000000000000000000000000000000000000, block_number: 180728984, gas_limit: 2147483647, gas_used: 0, timestamp: 1635278401, extra_data: 0x0ac5b4bc1f3be2bf803fb42acc3804eae1a3a6ceb1b61e5dc795867319019007, base_fee_per_gas: 0x0000000000000000000000000000000000000000000000000000000000000007_U256, block_hash: 0xfd357ea10d34bb88b1b29154b7e51adea1b90f8c291ea60b308a42fe6ef85563, transactions: [] })

Amir Pasha, [6/8/24 6:40 PM] They are exactly the same, but with different block_number values

Amir Pasha, [6/8/24 6:41 PM] And this is the exact point where reth stops syncing

Jesse | Telos, [6/8/24 6:41 PM] And that’s earlier than the block that’s wrong on the server

Jesse | Telos, [6/8/24 6:42 PM] So the block that breaks for you locally is earlier than the one on the server, correct?

Jesse | Telos, [6/8/24 6:42 PM] Which suggests this is a race condition, or another intermittent issue

Amir Pasha, [6/8/24 6:42 PM] Yes, I started everything from scratch: built the translator, generated the arrow data, used my own nodeos and ...

Jesse | Telos, [6/8/24 6:49 PM] So something like this, would you agree?

Jesse | Telos, [6/8/24 6:50 PM] My least favorite kind of bug to track down

Amir Pasha, [6/8/24 6:50 PM] Yes, I agree. It seems to happen on a random basis, so it could be related to a race condition
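For reference, the inconsistency shown above is mechanical to detect: in a valid sequence of payloads, each parent_hash must equal the previous block_hash, block numbers must increase by one, and block hashes must be unique. Below is a minimal sketch of such a check in Rust; the `Payload` struct and `first_inconsistency` function are illustrative stand-ins based on the ExecutionPayloadV1 fields pasted above, not the translator's or reth's actual types.

```rust
use std::collections::HashSet;

// Illustrative stand-in for the payload fields relevant to the check
// (not the actual reth ExecutionPayloadV1 type).
#[derive(Debug, Clone, PartialEq, Eq, Hash)]
struct Hash32([u8; 32]);

struct Payload {
    parent_hash: Hash32,
    block_hash: Hash32,
    block_number: u64,
}

/// Returns the block_number of the first payload that breaks the chain:
/// a parent_hash that does not match the previous block_hash, a block
/// number that does not increment by one, or a repeated block_hash
/// (the situation in the two payloads pasted above).
fn first_inconsistency(payloads: &[Payload]) -> Option<u64> {
    let mut seen: HashSet<Hash32> = HashSet::new();
    for window in payloads.windows(2) {
        let (prev, next) = (&window[0], &window[1]);
        seen.insert(prev.block_hash.clone());

        let broken_link = next.parent_hash != prev.block_hash;
        let broken_number = next.block_number != prev.block_number + 1;
        let duplicate_hash = seen.contains(&next.block_hash);

        if broken_link || broken_number || duplicate_hash {
            return Some(next.block_number);
        }
    }
    None
}
```

Run over the two payloads above (in either order), a check like this would flag the second one immediately: the block_hash repeats and the parent_hash does not match the previous block_hash, which is consistent with reth refusing to sync past that point.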

guilledk commented 4 months ago

Couldn't reproduce; likely related to the arrowbatch Rust reader.