ZuInnoTe / hadoopcryptoledger

Hadoop Crypto Ledger - Analyzing CryptoLedgers, such as Bitcoin Blockchain, on Big Data platforms, such as Hadoop/Spark/Flink/Hive

Bug while executing example "Analyzing the Ethereum Blockchain with Apache Flink" #58

Closed christopheblp closed 5 years ago

christopheblp commented 6 years ago

Hi everyone!

I followed the steps written in the wiki but I have an error when I execute the jar with Flink.

  1. I downloaded part of the Ethereum blockchain with geth and converted it to .bin files.
  2. I put the files on HDFS (Sandbox HDP 2.6).
  3. I built the example using the following command: sbt +clean +assembly +it:test. The jar file appears in the target directory.
  4. I downloaded the Flink tar.gz file here -> http://www.apache.org/dyn/closer.lua/flink/flink-1.5.0/flink-1.5.0-bin-scala_2.11.tgz (I chose the version without bundled Hadoop).
  5. I ran the jar with the following command: bin/flink run example-hcl-flink-scala-ethereumblock.jar --input hdfs://localhost:8020/user/ethereum/input --output hdfs://localhost:8020/user/ethereum/output

And I get this output :

Setting HADOOP_CONF_DIR=/etc/hadoop/conf because no HADOOP_CONF_DIR was set.
Starting execution of program


The program finished with the following exception:

org.apache.flink.client.program.ProgramInvocationException: The main method caused an error.
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:545)
    at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:420)
    at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:404)
    at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:781)
    at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:275)
    at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
    at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
    at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
    at org.apache.flink.runtime.security.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:30)
    at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
Caused by: java.lang.RuntimeException: Cannot create Hadoop WritableTypeInfo.
    at org.apache.flink.api.java.typeutils.TypeExtractor.createHadoopWritableTypeInfo(TypeExtractor.java:2155)
    at org.zuinnote.flink.ethereum.example.FlinkScalaEthereumBlockCounter$.countTotalTransactions(FlinkScalaEthereumBlockCounter.scala:41)
    at org.zuinnote.flink.ethereum.example.FlinkScalaEthereumBlockCounter$.main(FlinkScalaEthereumBlockCounter.scala:34)
    at org.zuinnote.flink.ethereum.example.FlinkScalaEthereumBlockCounter.main(FlinkScalaEthereumBlockCounter.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:528)
    ... 9 more
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/io/Writable
    at org.apache.flink.api.java.typeutils.WritableTypeInfo.<init>(WritableTypeInfo.java:54)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.flink.api.java.typeutils.TypeExtractor.createHadoopWritableTypeInfo(TypeExtractor.java:2148)
    ... 17 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.Writable
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 23 more

Am I missing something?

jornfranke commented 6 years ago

We have not tested it with 1.5.0, but it seems you are missing the Hadoop compatibility library for Flink 1.5: https://github.com/ZuInnoTe/hadoopcryptoledger/wiki/Analyzing-the-Ethereum-Blockchain-with-Apache-Flink
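For example, for a Flink 1.5.0 / Scala 2.11 distribution the jar can be fetched from Maven Central and dropped into Flink's lib folder (a sketch; the URL follows the standard Maven repository layout and the target path is an assumption, adapt both to your setup):

# place the Hadoop compatibility jar next to the other Flink libraries
wget https://repo1.maven.org/maven2/org/apache/flink/flink-hadoop-compatibility_2.11/1.5.0/flink-hadoop-compatibility_2.11-1.5.0.jar
mv flink-hadoop-compatibility_2.11-1.5.0.jar flink-1.5.0/lib/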


christopheblp commented 6 years ago

Yes, I forgot to add the jar, thanks. I also switched to Flink 1.2.0, but I still get an error:

java.lang.NoSuchMethodError: org.apache.flink.api.java.hadoop.mapred.utils.HadoopUtils.getHadoopConfiguration()Lorg/apache/hadoop/conf/Configuration;

Do you think I need to have HDP 2.5 instead of 2.6?

jornfranke commented 6 years ago

It should work as you do it now. Do you use a specific Hadoop distribution? It should work with any of them; the library is not bound to a specific distribution.

Do you use one of the examples provided by the library?

It should work with 1.5.0, but it would be best if you can test with 1.3.0 or 1.2.0.


christopheblp commented 6 years ago

I have HDP 2.6 with Hadoop 2.7.3.2.6.4.0-91 installed by default.

Yes, the example named scala-flink-ethereumblock.

jornfranke commented 6 years ago

It should work without problems - let me investigate a little bit...


jornfranke commented 6 years ago

Most likely you downloaded the Hadoop compatibility jar in a version that does not match your Flink distribution. E.g. if you download Flink 1.5, you need the Hadoop compatibility 1.5 jar in the lib folder; if you download Flink 1.2, you need the Hadoop compatibility 1.2 jar in the lib folder. However, I recommend a newer Flink distribution (at least 1.4) and adapting the Flink dependency in the build.sbt accordingly.
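For example, the Flink entries in the example's build.sbt would then look like this (a sketch using 1.5.0; substitute whatever distribution version you actually run):

// keep these versions in sync with the Flink distribution on the cluster
libraryDependencies += "org.apache.flink" %% "flink-scala" % "1.5.0" % "provided"
libraryDependencies += "org.apache.flink" %% "flink-hadoop-compatibility" % "1.5.0" % "compile"
libraryDependencies += "org.apache.flink" %% "flink-clients" % "1.5.0" % "it"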

christopheblp commented 6 years ago

I downloaded the Hadoop compatibility 1.5 and Flink 1.5. But I get an error when I rebuild the project with sbt; I just changed the version of everything related to Flink from 1.2 to 1.5 in the build.sbt file.

[warn] :: org.apache.flink#flink-hadoop-compatibility_2.10;1.5.0: not found
[warn] :: org.apache.flink#flink-scala_2.10;1.5.0: not found
[warn] :: org.apache.flink#flink-clients_2.10;1.5.0: not found

I didn't find what to specify in order to get these dependencies.

(I attached the build file with a .txt extension: build.txt)

EDIT: I finally managed to generate the new jar (I deleted the reference to Scala version 2.10 in the build file), but I still get an error when I run the Flink example:

Setting HADOOP_CONF_DIR=/etc/hadoop/conf because no HADOOP_CONF_DIR was set.
Starting execution of program


The program finished with the following exception:

org.apache.flink.client.program.ProgramInvocationException: The main method caused an error.
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:545)
    at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:420)
    at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:404)
    at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:781)
    at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:275)
    at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
    at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
    at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
    at org.apache.flink.runtime.security.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:30)
    at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
Caused by: java.lang.RuntimeException: Cannot create Hadoop WritableTypeInfo.
    at org.apache.flink.api.java.typeutils.TypeExtractor.createHadoopWritableTypeInfo(TypeExtractor.java:2155)
    at org.apache.flink.api.java.typeutils.TypeExtractor.privateGetForClass(TypeExtractor.java:1759)
    at org.apache.flink.api.java.typeutils.TypeExtractor.privateGetForClass(TypeExtractor.java:1701)
    at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfoWithTypeHierarchy(TypeExtractor.java:956)
    at org.apache.flink.api.java.typeutils.TypeExtractor.privateCreateTypeInfo(TypeExtractor.java:817)
    at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfo(TypeExtractor.java:771)
    at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfo(TypeExtractor.java:767)
    at org.zuinnote.flink.ethereum.example.FlinkScalaEthereumBlockCounter$.countTotalTransactions(FlinkScalaEthereumBlockCounter.scala:41)
    at org.zuinnote.flink.ethereum.example.FlinkScalaEthereumBlockCounter$.main(FlinkScalaEthereumBlockCounter.scala:34)
    at org.zuinnote.flink.ethereum.example.FlinkScalaEthereumBlockCounter.main(FlinkScalaEthereumBlockCounter.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:528)
    ... 9 more
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/io/Writable
    at org.apache.flink.api.java.typeutils.WritableTypeInfo.<init>(WritableTypeInfo.java:54)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.flink.api.java.typeutils.TypeExtractor.createHadoopWritableTypeInfo(TypeExtractor.java:2148)
    ... 23 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.Writable
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 29 more

jornfranke commented 6 years ago

Flink 1.5 does not support Scala 2.10 anymore; you need to remove Scala 2.10 from the cross-compile part of the build.sbt.
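A minimal sketch of that change (keeping 2.11.7 is an assumption; any Scala 2.11.x supported by your build works):

// build.sbt: Flink 1.5 only publishes Scala 2.11 artifacts, so drop 2.10 from the cross-build
crossScalaVersions := Seq("2.11.7")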


christopheblp commented 6 years ago

I did that (see my edit), but I'm still encountering problems (I updated the error output in my comment above).

jornfranke commented 6 years ago

Just to be clear - sorry for asking - does the last error happen when you run the sbt command, more specifically sbt +it:test?

If so, you may need to match the test dependencies to the Hadoop dependencies supported by Flink 1.5, i.e. change them from 2.2 to a newer version such as 2.7; I have to experiment (see the sketch at the end of this comment).

Or are you already trying to run it on your Flink cluster using the flink command?
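For reference, that change in the example's build.sbt would look like this (a sketch; 2.7.3 is an assumption chosen to match the Hadoop version of HDP 2.6, any 2.7.x should do):

// integration-test Hadoop dependencies, raised from 2.2.0 to match the cluster
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.7.3" % "it" classifier "" classifier "tests"
libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % "2.7.3" % "it" classifier "" classifier "tests"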


jornfranke commented 6 years ago

I checked, and I was able to do sbt +clean +it:test with the following build.sbt:

import sbt._
import Keys._
import scala._

lazy val root = (project in file("."))
  .settings(
    name := "example-hcl-flink-scala-ethereumblock",
    version := "0.1"
  )
  .configs( IntegrationTest )
  .settings( Defaults.itSettings : _*)
  .enablePlugins(JacocoItPlugin)

crossScalaVersions := Seq("2.11.7")

resolvers += Resolver.mavenLocal

scalacOptions += "-target:jvm-1.8"

assemblyJarName in assembly := "example-hcl-flink-scala-ethereumblock.jar"

fork := true

libraryDependencies += "com.github.zuinnote" % "hadoopcryptoledger-flinkdatasource" % "1.2.0" % "compile"

libraryDependencies += "com.github.zuinnote" % "hadoopcryptoledger-fileformat" % "1.2.0" % "compile"

// needed for certain functionality related to Ethereum in EthereumUtil
libraryDependencies += "org.bouncycastle" % "bcprov-ext-jdk15on" % "1.58" % "compile"

libraryDependencies += "org.apache.flink" %% "flink-scala" % "1.5.0" % "provided"

libraryDependencies += "org.apache.flink" % "flink-shaded-hadoop2" % "1.5.0" % "provided"

// needed for writable serializer
libraryDependencies += "org.apache.flink" %% "flink-hadoop-compatibility" % "1.5.0" % "compile"

libraryDependencies += "org.apache.flink" %% "flink-clients" % "1.5.0" % "it"

libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test,it"

libraryDependencies += "javax.servlet" % "javax.servlet-api" % "3.0.1" % "it"

libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.2.0" % "it" classifier "" classifier "tests"

libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % "2.2.0" % "it" classifier "" classifier "tests"

I will try later on a Flink 1.5 cluster

jornfranke commented 5 years ago

Besides a cluster, I tried with the above build.sbt and HDP 2.6.5 (newer than yours, but it does not matter). I downloaded the Flink 1.5 distribution with Hadoop 2.7. I added the Hadoop compatibility jar for 1.5 to the lib folder of the Flink distribution. I started the local Flink cluster using the script start-cluster.sh in the bin folder of the Flink distribution. I ran the following command (you need to adapt the paths according to the folders that you have):

./flink-1.5.0/bin/flink run example-hcl-flink-scala-ethereumblock.jar --input hdfs://sandbox-hdp.hortonworks.com:8020/home/input/ethereum/ethgenesis --output hdfs://sandbox-hdp.hortonworks.com:8020/home/output/ethereum

I find the correct output in /home/output/ethereum
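For reference, the setup steps condensed into shell commands (a sketch; the archive URL and jar file name are assumptions following the Apache and Maven naming conventions, so adapt versions and paths to your environment):

# Flink 1.5.0 binary with bundled Hadoop 2.7 (Scala 2.11)
wget https://archive.apache.org/dist/flink/flink-1.5.0/flink-1.5.0-bin-hadoop27-scala_2.11.tgz
tar xzf flink-1.5.0-bin-hadoop27-scala_2.11.tgz
# matching Hadoop compatibility jar goes into the lib folder (see earlier in this thread)
cp flink-hadoop-compatibility_2.11-1.5.0.jar flink-1.5.0/lib/
# start the local cluster, then submit the example jar with the flink command above
./flink-1.5.0/bin/start-cluster.sh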

christopheblp commented 5 years ago

I was missing the Flink distribution with Hadoop, thanks. I am now able to generate the jar with the modifications. But I get a new error when I execute the jar with Flink using this command:

bin/flink run example-hcl-flink-scala-ethereumblock.jar --input hdfs://localhost:8020/user/ethereum/input --output hdfs://localhost:8020/user/ethereum/output

I don't understand where it comes from; the HDFS directory is OK, I have the Hadoop compatibility jar in the lib folder, and I started the cluster with the script start-cluster.sh. I don't think it is related to dependencies this time.

Setting HADOOP_CONF_DIR=/etc/hadoop/conf because no HADOOP_CONF_DIR was set.
Starting execution of program


The program finished with the following exception:

org.apache.flink.client.program.ProgramInvocationException: Could not retrieve the execution result.
    at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:258)
    at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:464)
    at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:452)
    at org.apache.flink.client.program.ContextEnvironment.execute(ContextEnvironment.java:62)
    at org.apache.flink.api.scala.ExecutionEnvironment.execute(ExecutionEnvironment.scala:540)
    at org.zuinnote.flink.ethereum.example.FlinkScalaEthereumBlockCounter$.main(FlinkScalaEthereumBlockCounter.scala:35)
    at org.zuinnote.flink.ethereum.example.FlinkScalaEthereumBlockCounter.main(FlinkScalaEthereumBlockCounter.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:528)
    at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:420)
    at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:404)
    at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:781)
    at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:275)
    at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
    at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
    at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1754)
    at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
    at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
Caused by: org.apache.flink.runtime.client.JobSubmissionException: Failed to submit JobGraph.
    at org.apache.flink.client.program.rest.RestClusterClient.lambda$submitJob$5(RestClusterClient.java:357)
    at java.util.concurrent.CompletableFuture.uniExceptionally(CompletableFuture.java:870)
    at java.util.concurrent.CompletableFuture$UniExceptionally.tryFire(CompletableFuture.java:852)
    at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:474)
    at java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:1977)
    at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$5(FutureUtils.java:214)
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:760)
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:736)
    at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:474)
    at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:561)
    at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:929)
    at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:442)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Exception is not retryable.
    at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
    at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
    at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:911)
    at java.util.concurrent.CompletableFuture$UniRelay.tryFire(CompletableFuture.java:899)
    ... 12 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Exception is not retryable.
    ... 10 more
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: [Internal server error.]
    at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
    at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
    at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:911)
    at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:953)
    at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:926)
    ... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: [Internal server error.]
    at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:225)
    at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:209)
    at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:952)
    ... 5 more

jornfranke commented 5 years ago

It should not be localhost if you work on an HDP VM. If you look in the log files, you will see that it cannot resolve it.

You have to use, instead of localhost, the DNS name I provided above. On older HDPs you had to replace localhost with sandbox.hortonworks.com.
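Applied to your command, that would be (hostname as on a current HDP sandbox; adapt it to your VM):

bin/flink run example-hcl-flink-scala-ethereumblock.jar --input hdfs://sandbox-hdp.hortonworks.com:8020/user/ethereum/input --output hdfs://sandbox-hdp.hortonworks.com:8020/user/ethereum/output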


christopheblp commented 5 years ago

Thanks a lot, it's working now! 👍

jornfranke commented 5 years ago

Great, thanks for the feedback.
