nhat416 opened 3 years ago
@nhat416 Sure, let me check it.
@rambabu-posa is that corrected?
@jgperrin Unfortunately I am not able to reproduce that problem at my end. See the output below:
```
Rambabus-MacBook-Pro:net.jgp.books.spark.ch03 ram$ sbt clean assembly
[info] Loading settings from idea.sbt ...
[info] Loading global plugins from /Users/ram/.sbt/1.0/plugins
[info] Updating {file:/Users/ram/.sbt/1.0/plugins/}global-plugins...
[info] Done updating.
[info] Loading settings from plugins.sbt ...
[info] Loading project definition from /Users/ram/SparkInAction2Ed/FinalCode/net.jgp.books.spark.ch03/project
[info] Updating {file:/Users/ram/SparkInAction2Ed/FinalCode/net.jgp.books.spark.ch03/project/}net-jgp-books-spark-ch03-build...
[info] Done updating.
[info] Loading settings from build.sbt ...
[info] Set current project to SparkInAction2-Chapter03 (in build file:/Users/ram/SparkInAction2Ed/FinalCode/net.jgp.books.spark.ch03/)
[success] Total time: 0 s, completed 07-Jun-2021 23:04:25
[info] Updating {file:/Users/ram/SparkInAction2Ed/FinalCode/net.jgp.books.spark.ch03/}net-jgp-books-spark-ch03...
[info] Done updating.
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn] io.netty:netty-all:4.1.47.Final is selected over 4.0.23.Final
[warn] 	+- org.apache.spark:spark-network-common_2.12:3.0.0 (depends on 4.1.47.Final)
[warn] 	+- org.apache.spark:spark-core_2.12:3.0.0 (depends on 4.1.47.Final)
[warn] 	+- org.apache.hadoop:hadoop-hdfs:2.7.4 (depends on 4.0.23.Final)
[warn] com.google.code.findbugs:jsr305:3.0.0 is selected over {3.0.2, 1.3.9}
[warn] 	+- org.apache.spark:spark-network-common_2.12:3.0.0 (depends on 3.0.0)
[warn] 	+- org.apache.spark:spark-unsafe_2.12:3.0.0 (depends on 3.0.0)
[warn] 	+- org.apache.spark:spark-core_2.12:3.0.0 (depends on 3.0.0)
[warn] 	+- org.apache.hadoop:hadoop-common:2.7.4 (depends on 3.0.0)
[warn] 	+- com.google.guava:guava:11.0.2 (depends on 1.3.9)
[warn] 	+- org.apache.arrow:arrow-memory:0.15.1 (depends on 3.0.2)
[warn] 	+- com.github.spotbugs:spotbugs-annotations:3.1.9 (depends on 3.0.2)
[warn] * com.google.guava:guava:11.0.2 is selected over 16.0.1
[warn] 	+- org.apache.hadoop:hadoop-yarn-server-nodemanager:2.7.4 (depends on 11.0.2)
[warn] 	+- org.apache.hadoop:hadoop-yarn-api:2.7.4 (depends on 11.0.2)
[warn] 	+- org.apache.hadoop:hadoop-yarn-common:2.7.4 (depends on 11.0.2)
[warn] 	+- org.apache.hadoop:hadoop-hdfs:2.7.4 (depends on 11.0.2)
[warn] 	+- org.apache.hadoop:hadoop-yarn-client:2.7.4 (depends on 11.0.2)
[warn] 	+- org.apache.hadoop:hadoop-yarn-server-common:2.7.4 (depends on 11.0.2)
[warn] 	+- org.apache.curator:curator-framework:2.7.1 (depends on 16.0.1)
[warn] 	+- org.apache.curator:curator-client:2.7.1 (depends on 16.0.1)
[warn] 	+- org.apache.hadoop:hadoop-common:2.7.4 (depends on 16.0.1)
[warn] 	+- org.apache.curator:curator-recipes:2.7.1 (depends on 16.0.1)
[warn] Run 'evicted' to see detailed eviction warnings
[info] Compiling 8 Scala sources and 9 Java sources to /Users/ram/SparkInAction2Ed/FinalCode/net.jgp.books.spark.ch03/target/scala-2.12/classes ...
[info] Done compiling.
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Strategy 'discard' was applied to 338 files (Run the task at debug level to see details)
[info] Strategy 'first' was applied to 30 files (Run the task at debug level to see details)
[info] Packaging /Users/ram/SparkInAction2Ed/FinalCode/net.jgp.books.spark.ch03/target/scala-2.12/SparkInAction2-Chapter03-assembly-1.0.0.jar ...
[info] Done packaging.
```
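For anyone wanting to dig into those version-conflict and multiple-main-class warnings, the log itself names the follow-up sbt tasks. A minimal sketch (assuming sbt is on the PATH and you are inside the chapter's project directory, e.g. `net.jgp.books.spark.ch03`; the guard simply skips the calls elsewhere):

```shell
# Sketch: inspect the warnings shown in the build log above.
# Assumes sbt is installed and a build.sbt is present in the
# current directory; otherwise this is a no-op.
if command -v sbt >/dev/null 2>&1 && [ -f build.sbt ]; then
  sbt evicted                        # detailed eviction report for each dependency conflict
  sbt 'show discoveredMainClasses'   # lists the classes behind the "multiple main classes" warning
fi
```

These only report information; the build succeeded despite the warnings, so no action is strictly required.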
Found this issue while working through the chapter this morning. The problem was resolved for me by making sure I was using Java 8, not 11, during the build.
Although the use of Java 8 is mentioned at the beginning of the book, it might be a good idea to reiterate it in the README files for all the labs: at the end of 2021 and the beginning of 2022, many people are likely to be switching between Java versions quite often.
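A quick pre-build check along these lines could even go in the READMEs. This is only a sketch: the macOS `java_home` helper path is illustrative, and the exact version string varies by JDK vendor.

```shell
# Sketch: confirm a Java 8 JDK is active before running `sbt clean assembly`.
# Guards make this a no-op on machines without java or java_home.
if command -v java >/dev/null 2>&1; then
  java -version 2>&1 | head -n 1     # expect a "1.8.0_..." line for Java 8
fi
if [ -x /usr/libexec/java_home ]; then
  # macOS helper: point JAVA_HOME at an installed JDK 8, if any
  export JAVA_HOME="$(/usr/libexec/java_home -v 1.8)"
fi
```

sbt picks up the JVM from `JAVA_HOME` (or the `java` on the PATH), so setting it before the build is usually enough to avoid the Java 11 failure described above.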
hi @rambabu-posa I am getting the following error after I cd into the directory and run `sbt clean assembly`: