ZuInnoTe / hadoopcryptoledger

Hadoop Crypto Ledger - Analyzing CryptoLedgers, such as Bitcoin Blockchain, on Big Data platforms, such as Hadoop/Spark/Flink/Hive
Apache License 2.0

Scala-Spark Datasource Bitcoin Block - java.lang.NoSuchMethodError #61

Closed · ghost closed this issue 5 years ago

ghost commented 6 years ago

Hello,

I am trying to run the scala-spark-datasource-bitcoin example on Spark 2.1.1, Scala 2.11.8, and sbt 0.13.17, but I get this error:


```
    at org.zuinnote.spark.bitcoin.block.BitcoinBlockRelation.schema(BitcoinBlockRelation.scala:56)
    at org.apache.spark.sql.execution.datasources.LogicalRelation.<init>(LogicalRelation.scala:40)
    at org.apache.spark.sql.SparkSession.baseRelationToDataFrame(SparkSession.scala:389)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:146)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:135)
    at org.zuinnote.spark.bitcoin.example.SparkScalaBitcoinBlockDataSource$.sumTransactionOutputsJob(SparkScalaBitcoinBlockDataSource.scala:55)
    at org.zuinnote.spark.bitcoin.example.SparkScalaBitcoinBlockDataSource$.main(SparkScalaBitcoinBlockDataSource.scala:46)
    at org.zuinnote.spark.bitcoin.example.SparkScalaBitcoinBlockDataSource.main(SparkScalaBitcoinBlockDataSource.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:743)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
```
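
For context, the failing call path in the trace is the example's datasource load. A minimal sketch of that invocation (the format name matches the package in the trace; the input path and the `magic` option value are illustrative assumptions, not taken from the original report):

```scala
import org.apache.spark.sql.SparkSession

// Minimal sketch of the load that fails at BitcoinBlockRelation.schema in the trace above.
val spark = SparkSession.builder().appName("BitcoinBlockExample").getOrCreate()
val df = spark.read
  .format("org.zuinnote.spark.bitcoin.block") // datasource package from the stack trace
  .option("magic", "F9BEB4D9")                // assumed: Bitcoin mainnet magic bytes
  .load("hdfs:///user/input/bitcoin")         // hypothetical input folder of blk*.dat files
df.printSchema()                              // schema resolution is where the error surfaces
```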
jornfranke commented 6 years ago

Do you have more details? Are you using the original application? What is your build.sbt? What is the original blockchain data you are using (did you make sure that no rev*.dat files are in the input folder)? What is the result of `sbt +it:test`? One quick way to check the rev*.dat point is sketched below.
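
A sketch of such a check using the Hadoop FileSystem API (the input path is hypothetical):

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

// Sketch: warn about Bitcoin undo files (rev*.dat) in the input folder,
// which should contain only blk*.dat block files.
val fs = FileSystem.get(new Configuration())
fs.listStatus(new Path("hdfs:///user/input/bitcoin")) // hypothetical input folder
  .map(_.getPath.getName)
  .filter(_.startsWith("rev"))
  .foreach(name => println(s"WARNING: remove undo file $name from the input folder"))
```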


ghost commented 6 years ago

Yes, I am using the original application, just compiled it with sbt 0.13.17. The result of `sbt +it:test` was successful. I managed to make it work by using hadoopcryptoledger-ds 1.1.2. This is my build.sbt now:


```scala
import Keys._
import scala._

lazy val root = (project in file("."))
  .settings(
    name := "example-hcl-spark-scala-ds-bitcoinblock",
    version := "0.1"
  )
  .configs(IntegrationTest)
  .settings(Defaults.itSettings: _*)
  .enablePlugins(JacocoItPlugin)

crossScalaVersions := Seq("2.10.5", "2.11.7")

scalacOptions += "-target:jvm-1.7"

resolvers += Resolver.mavenLocal

fork := true

assemblyJarName in assembly := "example-hcl-spark-scala-ds-bitcoinblock.jar"

assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}

libraryDependencies += "com.github.zuinnote" %% "spark-hadoopcryptoledger-ds" % "1.1.2" % "compile"

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.1" % "provided"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.0.1" % "provided"

libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.7.0" % "provided"

libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test,it"

libraryDependencies += "javax.servlet" % "javax.servlet-api" % "3.0.1" % "it"

libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.7.0" % "it" classifier "" classifier "tests"

libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % "2.7.0" % "it" classifier "" classifier "tests"

libraryDependencies += "org.apache.hadoop" % "hadoop-minicluster" % "2.7.0" % "it"
```
jornfranke commented 6 years ago

Thanks for the feedback. We always recommend the latest version, due to fixes and changes in the blockchain. You should use at least sbt 1.1.1.
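
For reference, the sbt version a project builds with is typically pinned in project/build.properties (a sketch):

```
sbt.version=1.1.1
```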

ghost commented 6 years ago

The problem is that on the cluster we have Java 7, and with sbt 1.1.1 I get `Error during sbt execution: java.lang.UnsupportedClassVersionError: scala/Option : Unsupported major.minor version 52.0` (class file version 52.0 is Java 8 bytecode, so sbt 1.x itself requires Java 8).

That is why I used 0.13.17.

jornfranke commented 6 years ago

You then need to enforce JDK 7 bytecode with `scalacOptions += "-target:jvm-1.7"`. Also check your development environment: its Java version should match the one on the cluster.
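
A sketch of enforcing Java 7 bytecode for both Scala and Java sources in build.sbt (these are standard sbt 0.13 keys):

```scala
// Emit Java 7-compatible bytecode from the Scala compiler.
scalacOptions += "-target:jvm-1.7"

// Do the same for any Java sources compiled by sbt.
javacOptions ++= Seq("-source", "1.7", "-target", "1.7")
```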
