DutchRonin closed this issue 4 years ago.
Going to run it again later with the debug option to get more information.
Hi, thanks for reporting this issue.
There is no file size limit for ChainStorage, but I can imagine what happened here. Right now many different client implementations are signaling different proposals, and I assume the blockparser rejected some blocks, so it sees the longest chain only up to block height 476592. This will be fixed in the near future by making the parser SegWit compatible, see: https://github.com/gcarq/rusty-blockparser/issues/13
Can you give me the latest processed block hash, or if possible the whole chain.json?
cheers, Michael
With debug active, the parser finds the magic number at block 476592. I've attached screenshots of this event and of the end of the initial scan. [pictures in reverse order] The compressed chain.json is 16 MB; I'm finding a way to share it.
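For context (this is not the parser's actual code): Bitcoin Core stores blocks in blk*.dat files, where each block is prefixed by the 4-byte network magic (0xF9 0xBE 0xB4 0xD9 on mainnet) and a 4-byte little-endian length. A minimal sketch of how a parser might locate block boundaries by scanning for that marker, with a hypothetical `find_magic_offsets` helper:

```rust
// Mainnet magic bytes as they appear on disk.
const MAGIC: [u8; 4] = [0xF9, 0xBE, 0xB4, 0xD9];

/// Return the byte offsets of every magic-marker occurrence in `data`.
/// A leftover block that bitcoin-qt did not fully overwrite would show
/// up here as an unexpected extra offset.
fn find_magic_offsets(data: &[u8]) -> Vec<usize> {
    data.windows(4)
        .enumerate()
        .filter(|(_, w)| *w == MAGIC)
        .map(|(i, _)| i)
        .collect()
}

fn main() {
    // Two fake "blocks": magic + 4-byte little-endian length + payload.
    let mut buf = Vec::new();
    buf.extend_from_slice(&MAGIC);
    buf.extend_from_slice(&3u32.to_le_bytes()); // payload length
    buf.extend_from_slice(&[1, 2, 3]);          // payload
    buf.extend_from_slice(&MAGIC);
    buf.extend_from_slice(&1u32.to_le_bytes());
    buf.push(42);

    let offsets = find_magic_offsets(&buf);
    // First block at offset 0; second at 4 (magic) + 4 (length) + 3 (payload) = 11.
    assert_eq!(offsets, vec![0, 11]);
    println!("{:?}", offsets);
}
```

This is why stray magic bytes left behind by a partial rewrite of a blk*.dat file can confuse a scanner: they look like the start of a new block.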
Not a fan of Dropbox and the like. I managed to upload the compressed chain.json to GitHub at DutchRonin/DRGeneralFileSharing. No idea how to get the latest processed block hash. (By the way, before the last scan I used bitcoin-qt to reindex the whole chain.) Hope this helps. Cheers, Robert
Thanks, that helps a lot; I will take a look into it. The parsed blockchain up to height 476591 looks valid, but block 476592 is ignored somehow. FYI: you can see the latest processed block hash in the debug output above:
DEBUG - chain: Longest chain:
height: 476591
newest_block: ...
genesis_block: ...
Since no one else confirmed having the same issue, I figured the problem could still be on my end. So I decided to delete block dat-file 0940, let bitcoin-qt rebuild it, and try again. The problem then moved to file 0939. After also deleting that and some previous block files for good measure and rebuilding, the parser couldn't finish anymore because the result differed from what was expected. So I may have to re-download the whole chain and try again. The problem may be caused by bitcoin-qt writing magic numbers and not removing them during block updates. I'll get back as soon as everything's done, which may take days.
Thanks for debugging this, did you figure out the root cause?
I guess it was a corrupt blockchain. Re-downloading the whole chain took forever, but after that it parsed completely without problems. I plan to run it again next week and try some statistics on the data. I don't have bitcoin-qt running all the time, so I'll have to see if it parses again.
Should be fixed with the latest version.
This tool is great and worked without issues until a couple of weeks ago. When I use the unspentcsvdump option, I've noticed it doesn't parse beyond block 476592. In the initial blockchain scan it does go through the whole chain up to the current block (479648 at the moment), but when it starts parsing it only goes up to block 476592. I've tested this on a VirtualBox VM and on a physical machine running Ubuntu, built the current version of blockparser, and tried the force re-index and resume options. What I see is that the temporary ChainStorage file chain.json doesn't seem to grow beyond 50 MB in size. Could this be the cause? And if so, is this easy to fix in the code? I don't speak Rust, and I looked through most of the src files but couldn't really find a size limit anywhere. Thanks for any help, Robert