ZuInnoTe / hadoopcryptoledger

Hadoop Crypto Ledger - Analyzing CryptoLedgers, such as Bitcoin Blockchain, on Big Data platforms, such as Hadoop/Spark/Flink/Hive
Apache License 2.0

Reading blocks with sizes between 0x8000 and 0xFFFF is failing #40

Closed liorregev closed 6 years ago

liorregev commented 6 years ago

The problem resides in EthereumUtil.java:294, where a 2-byte number is assumed to be signed. The first affected block is 403419 (attached as 403419.zip), whose list header is F98B1FF9021AA008741F.
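To illustrate the bug: the RLP list header starts with F9 8B 1F, where 0xF9 signals a long list with a 2-byte length following. That length, 0x8B1F, exceeds 0x7FFF, so reading it as a signed Java `short` yields a negative value. A minimal standalone sketch (not the library's actual code) contrasting the signed read with an unsigned decode:

```java
import java.nio.ByteBuffer;

public class RlpLengthDemo {

    // Buggy variant: interprets the 2-byte length as a signed short,
    // so any length >= 0x8000 comes out negative.
    static int readLengthSigned(byte[] header) {
        return ByteBuffer.wrap(header, 1, 2).getShort();
    }

    // Fixed variant: masks each byte so the 2-byte value stays unsigned.
    static int readLengthUnsigned(byte[] header) {
        return ((header[1] & 0xFF) << 8) | (header[2] & 0xFF);
    }

    public static void main(String[] args) {
        // First three bytes of the list header from block 403419: F9 8B 1F
        byte[] header = {(byte) 0xF9, (byte) 0x8B, (byte) 0x1F};
        System.out.println(readLengthSigned(header));   // prints -29921 (bug)
        System.out.println(readLengthUnsigned(header)); // prints 35615 (correct)
    }
}
```

Any block whose list payload length falls between 0x8000 and 0xFFFF, as in the issue title, triggers the signed misread.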

liorregev commented 6 years ago

Let me clarify: I would very much like to solve this myself and contribute the solution, but I am simply not fluent enough in the Ethereum structure. If you are willing, I would love to get some guidance from you and commit a fix for this issue.

jornfranke commented 6 years ago

Hi,

Thanks for reporting.

Sure, I will give you more detail later today and also check whether this is indeed an issue. The first step is to create a unit test that verifies the correct behavior (see EthereumUtilTest for inspiration); of course it is expected to fail while the bug exists. In these unit tests I usually use a real block from the Ethereum blockchain. Then the bug is fixed and the unit tests are run again to confirm that the test case no longer fails (and, of course, that none of the others fail either, so no regression is introduced). I got most of the information about the Ethereum block structure from the Ethereum yellow paper.
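The fail-first workflow described above can be sketched as a tiny standalone regression test. The helper name `decodeTwoByteLength` is illustrative, not the library's real API; a real contribution would go into EthereumUtilTest and parse the attached block 403419 instead:

```java
public class EthereumListHeaderTest {

    // Hypothetical helper mirroring the corrected decode in EthereumUtil;
    // name and signature are illustrative only.
    static int decodeTwoByteLength(byte hi, byte lo) {
        return ((hi & 0xFF) << 8) | (lo & 0xFF);
    }

    public static void main(String[] args) {
        // Length bytes taken from the real block 403419 header: F9 8B 1F
        int len = decodeTwoByteLength((byte) 0x8B, (byte) 0x1F);

        // With the signed-short bug this assertion fails (length is negative);
        // after the fix it passes, guarding against regression.
        if (len != 0x8B1F) {
            throw new AssertionError("expected 35615, got " + len);
        }
        if (len < 0x8000 || len > 0xFFFF) {
            throw new AssertionError("length outside the failing range");
        }
        System.out.println("test passed: length=" + len);
    }
}
```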

Best regards


jornfranke commented 6 years ago

The fix will be added in version 1.1.2.

jornfranke commented 6 years ago

The fix was included in version 1.1.2.