Stratio / deep-spark

Connecting Apache Spark with different data stores [DEPRECATED]
http://stratio.github.io/deep-spark
Apache License 2.0
197 stars · 42 forks

Deep integration with Spark version 1.4 #22

Closed · austindsouza closed this issue 8 years ago

austindsouza commented 9 years ago

Hi,

I was trying to integrate Stratio Deep with Spark 1.4.1. Stratio Deep itself compiles properly against Spark 1.4.1, but while creating the distribution I get the following error:

```
[INFO] ------------------------------------------------------------------------
[INFO] Building Spark Project Networking 1.3.1
[INFO] ------------------------------------------------------------------------
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ spark-network-common_2.10 ---
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-versions) @ spark-network-common_2.10 ---
[INFO] --- scala-maven-plugin:3.2.0:add-source (eclipse-add-source) @ spark-network-common_2.10 ---
[INFO] Add Source directory: /tmp/stratio-deep-distribution/stratiospark/network/common/src/main/scala
[INFO] Add Test Source directory: /tmp/stratio-deep-distribution/stratiospark/network/common/src/test/scala
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-scala-sources) @ spark-network-common_2.10 ---
[INFO] Source directory: /tmp/stratio-deep-distribution/stratiospark/network/common/src/main/scala added.
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ spark-network-common_2.10 ---
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ spark-network-common_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /tmp/stratio-deep-distribution/stratiospark/network/common/src/main/resources
[INFO] Copying 3 resources
[INFO] --- scala-maven-plugin:3.2.0:compile (scala-compile-first) @ spark-network-common_2.10 ---
[INFO] Using zinc server for incremental compilation
[INFO] compiler plugin: BasicArtifact(org.scalamacros,paradise_2.10.4,2.0.1,null)
[info] Compiling 43 Java sources to /tmp/stratio-deep-distribution/stratiospark/network/common/target/scala-2.10/classes...
[info] Error occurred during initialization of VM
[info] java.lang.Error: Properties init: Could not determine current working directory.
[info]     at java.lang.System.initProperties(Native Method)
[info]     at java.lang.System.initializeSystemClass(System.java:1119)
[error] Compile failed at Aug 24, 2015 1:10:20 PM [0.056s]
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [  5.302 s]
[INFO] Spark Project Networking ........................... FAILURE [  0.783 s]
[INFO] Spark Project Shuffle Streaming Service ............ SKIPPED
[INFO] Spark Project Core ................................. SKIPPED
[INFO] Spark Project Bagel ................................ SKIPPED
[INFO] Spark Project GraphX ............................... SKIPPED
[INFO] Spark Project Streaming ............................ SKIPPED
[INFO] Spark Project Catalyst ............................. SKIPPED
[INFO] Spark Project SQL .................................. SKIPPED
[INFO] Spark Project ML Library ........................... SKIPPED
[INFO] Spark Project Tools ................................ SKIPPED
[INFO] Spark Project Hive ................................. SKIPPED
[INFO] Spark Project REPL ................................. SKIPPED
[INFO] Spark Project YARN ................................. SKIPPED
[INFO] Spark Project Assembly ............................. SKIPPED
[INFO] Spark Project External Twitter ..................... SKIPPED
[INFO] Spark Project External Flume Sink .................. SKIPPED
[INFO] Spark Project External Flume ....................... SKIPPED
[INFO] Spark Project External MQTT ........................ SKIPPED
[INFO] Spark Project External ZeroMQ ...................... SKIPPED
[INFO] Spark Project External Kafka ....................... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] Spark Project YARN Shuffle Service ................. SKIPPED
[INFO] Spark Project External Kafka Assembly .............. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 7.636 s
[INFO] Finished at: 2015-08-24T13:10:20+05:30
[INFO] Final Memory: 47M/318M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile (scala-compile-first) on project spark-network-common_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile failed. CompileFailed -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn -rf :spark-network-common_2.10
Cannot make Spark distribution
```
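For context: the JVM error `Properties init: Could not determine current working directory` is not a compile error in the Stratio Deep or Spark sources; it usually means a new JVM (here, one launched by the zinc incremental-compile server) was started from a directory that no longer exists, e.g. because the distribution script deleted and recreated its working directory mid-build. A minimal sketch of the situation and the usual fix, using a hypothetical path (`/tmp/deep-build-demo`, not a real Stratio path):

```shell
# Simulate the failure mode: the shell's working directory is removed
# out from under it while it is still cd'ed inside.
mkdir -p /tmp/deep-build-demo
cd /tmp/deep-build-demo
rmdir /tmp/deep-build-demo   # cwd now points at a deleted directory;
                             # any JVM launched from here may fail to
                             # initialize its system properties.

# The fix is simply to cd back into a directory that exists before
# re-running the build (and, if zinc is in use, restarting its server
# so its child JVMs inherit a valid cwd).
cd /tmp && mkdir -p deep-build-demo && cd deep-build-demo
echo "cwd restored: $(pwd)"
```

This is a sketch of the general failure mode, not of the Stratio distribution script itself; the practical takeaway is to re-run `make-distribution-deep.sh`/`mvn` from a fresh shell in an existing directory and restart any running zinc server first.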

Meanwhile, I saw that Spark 1.5 has been released, so when can we expect integration with it?

Thanks & regards.

mafernandez-stratio commented 8 years ago

Hi Austin,

Currently, the development of Stratio Deep is deprecated. However, Stratio Crossdata has inherited part of the features of Stratio Deep. Please visit https://stratio.atlassian.net/wiki/display/CROSSDATA1x0/Home for more information.

Regards

austindsouza commented 8 years ago

Hello Miguel,

Thanks for your reply. Does the new Crossdata also integrate with Elasticsearch?

Regards


mafernandez-stratio commented 8 years ago

Hi Austin,

Crossdata can be used with any of Spark's data sources, and it optimises access to Cassandra, MongoDB and Elasticsearch. More information about Crossdata connectors here.

Regards