ZuInnoTe / hadoopcryptoledger

Hadoop Crypto Ledger - Analyzing CryptoLedgers, such as Bitcoin Blockchain, on Big Data platforms, such as Hadoop/Spark/Flink/Hive
Apache License 2.0

Is there a way around the currently supported maximum Ethereum block size? #52

Closed. skattoju-zz closed this issue 6 years ago

skattoju-zz commented 6 years ago

We are running into exceptions when attempting to do graph analysis with GraphFrames after creating a DataFrame using spark-hadoopcryptoledger-ds. It seems there are block sizes bigger than what can fit in an int?

java.lang.InterruptedException: org.zuinnote.hadoop.ethereum.format.exception.EthereumBlockReadException: Error: This block size cannot be handled currently (larger then largest number in positive signed int)
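For context, a minimal sketch of how such a DataFrame is typically created with spark-hadoopcryptoledger-ds; the input path and the enrich setting are placeholders, and the data source name follows the project documentation:

```scala
// Minimal sketch (assumed setup, placeholder path): read raw Ethereum block files
// into a DataFrame with the spark-hadoopcryptoledger-ds data source.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("EthereumGraphAnalysis").getOrCreate()

val blocksDF = spark.read
  .format("org.zuinnote.spark.ethereum.block")   // data source name per the project docs
  .option("enrich", "false")                     // "true" would add computed fields such as tx hashes
  .load("hdfs:///user/ethereum/blocks")          // placeholder input path

blocksDF.printSchema()
```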

jornfranke commented 6 years ago

Hmm, this seems to be something different. Valid Ethereum blocks that I have seen are never larger than 2 GB. Do you have the number of the block at which this occurs? It looks more like an invalid or incomplete block (or some recent change to the block structure in Ethereum).

jornfranke commented 6 years ago

Do you have more context, e.g. the code you are running?

jornfranke commented 6 years ago

It could also have been caused by changes somebody else made to the function EthereumUtil.convertIndicatorToRLPSize.
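For background, RLP encodes the size of a long list in a variable-length prefix. The following is an illustrative sketch of that rule only, not the library's actual implementation of EthereumUtil.convertIndicatorToRLPSize; it shows how a corrupted or non-Ethereum file can decode to a size above Int.MaxValue and trip exactly this exception:

```scala
// Illustrative sketch only (not the library's code): decoding an RLP "long list"
// size indicator. A file that is not Ethereum data can yield an arbitrary size,
// including one above Int.MaxValue, which the reader rejects.
import java.nio.ByteBuffer

def readLongListSize(buf: ByteBuffer): Long = {
  val marker = buf.get() & 0xFF        // 0xf8..0xff marks a list whose payload is > 55 bytes
  val lengthOfLength = marker - 0xf7   // how many following bytes encode the payload size
  var size = 0L
  for (_ <- 0 until lengthOfLength) {
    size = (size << 8) | (buf.get() & 0xFF)
  }
  size  // values > Int.MaxValue cannot be handled and lead to EthereumBlockReadException
}
```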

jornfranke commented 6 years ago

Maybe there is some other file in your folder that is not related to Ethereum data?
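One quick way to check (a sketch with a placeholder path) is to list everything in the input folder and look for files that are not Ethereum block exports:

```scala
// Sanity-check sketch (placeholder path): list everything in the input folder so
// that stray files which are not Ethereum block exports stand out.
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

val fs = FileSystem.get(new Configuration())   // picks up core-site.xml from the classpath
fs.listStatus(new Path("hdfs:///user/ethereum/blocks"))
  .foreach(s => println(s"${s.getPath.getName}\t${s.getLen} bytes"))
```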

skattoju-zz commented 6 years ago

That was it! There was a rogue file in my folder. Sorry about the false alarm!