HCADatalab / powderkeg

Live-coding the cluster!
Eclipse Public License 1.0

JDK required #5

Open ghost opened 7 years ago

ghost commented 7 years ago

```
nREPL server started on port 43876 on host 127.0.0.1 - nrepl://127.0.0.1:43876
REPL-y 0.3.7, nREPL 0.2.12
Clojure 1.8.0
OpenJDK 64-Bit Server VM 1.8.0_111-8u111-b14-2ubuntu0.16.04.2-b14
    Docs: (doc function-name-here)
          (find-doc "part-of-name-here")
  Source: (source function-name-here)
 Javadoc: (javadoc java-object-or-class-here)
    Exit: Control+D or (exit) or (quit)
 Results: Stored in vars *1, *2, *3, an exception in *e

user=> (require '[powderkeg.core :as keg])
Preparing for self instrumentation.

CompilerException java.lang.ClassNotFoundException: com.sun.tools.attach.VirtualMachine, compiling:(powderkeg/ouroboros.clj:150:1)
user=>
```
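For context, the missing class is the JDK-only attach API that powderkeg's ouroboros relies on for self-instrumentation. Below is a minimal, illustrative sketch of that API (not powderkeg's actual code), assuming JDK 8 with tools.jar already on the classpath:

```clojure
;; Illustrative sketch of the JDK-only attach API named in the error above.
;; Assumes JDK 8 with tools.jar on the classpath; not powderkeg's actual code.
(require 'clojure.string)
(import '(com.sun.tools.attach VirtualMachine)          ; absent on a plain JRE
        '(java.lang.management ManagementFactory))

(let [;; the runtime MXBean name is conventionally "pid@hostname"
      pid (first (clojure.string/split
                   (.getName (ManagementFactory/getRuntimeMXBean)) #"@"))
      vm  (VirtualMachine/attach pid)]
  ;; an instrumentation agent could be loaded here via (.loadAgent vm "agent.jar")
  (.detach vm))
```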
ghost commented 7 years ago

On Ubuntu you must additionally install openjdk-8-jdk-headless.

ghost commented 7 years ago

In retrospect, requiring a full JDK install is highly undesirable. Is there some way to pull in tools.jar by specifying a dependency?

cgrand commented 7 years ago

tools.jar is part of the JDK only and is not shipped with the JRE. I don't know of an official way to get it, so I assume you are stuck with a JRE on your nodes. One possibility would be to keep locally the tools.jar of the JDK matching the nodes' JRE and pass --jars nodes-tools.jar to spark-submit (and modify how powderkeg locates tools.jar). Would that be enough for you?
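For reference, a hedged sketch of the kind of lookup being discussed, i.e. finding the local JDK's tools.jar relative to java.home (which on JDK 8 points at the bundled JRE). This is only an illustration, not powderkeg's actual locating code:

```clojure
;; Illustration only: conventional tools.jar location on a JDK 8 install,
;; where java.home points at the jre/ directory inside the JDK.
(require '[clojure.java.io :as io])

(defn local-tools-jar
  "Returns the canonical path of the local JDK's tools.jar, or nil on a plain JRE."
  []
  (let [f (io/file (System/getProperty "java.home") ".." "lib" "tools.jar")]
    (when (.exists f)
      (.getCanonicalPath f))))
```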

ghost commented 7 years ago

Ideally we could use tools.jar as a Maven dependency. That way we are not dictating infrastructure.


cgrand commented 7 years ago

Well, tools.jar is tightly coupled to the JDK/JRE (version and platform). If you can run Maven you have a JDK (otherwise no javac), so it would be akin to cross-compilation (fetching the tools.jar matching the JRE deployed on the cluster), and I don't know of a repo where tools.jar is available. Do you?
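A common workaround (hedged sketch below; it does not solve the "no repository" problem, since it still reads tools.jar from a locally installed JDK) is to put tools.jar on the project classpath via Leiningen's unquote support in project.clj. The powderkeg coordinates and versions here are placeholders:

```clojure
;; project.clj — hedged sketch: adds the *local* JDK's tools.jar to the classpath
;; via java.home, so a JDK must still be installed; nothing is fetched from a repo.
;; Coordinates and versions below are placeholders.
(defproject my-keg-app "0.1.0-SNAPSHOT"
  :dependencies [[org.clojure/clojure "1.8.0"]
                 [hcadatalab/powderkeg "0.5.0"]]
  ;; Leiningen evaluates ~-unquoted forms when it reads the project map.
  :resource-paths [~(str (System/getProperty "java.home") "/../lib/tools.jar")])
```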


ghost commented 7 years ago

I see what you mean. While it seems theoretically possible to detect the appropriate version and add it to the dependencies dynamically, I don't see any existing attempts to do that.

cgrand commented 7 years ago

Also note that tools.jar is required only on the driver node, not on the workers.


shzhng commented 7 years ago

Does this issue only apply when you need a REPL environment for using powderkeg? Correct me if I'm wrong in assuming it should be a non-issue when deploying an AOT-compiled job that simply uses powderkeg as a Spark Clojure API.

cgrand commented 7 years ago

@shzhng you are correct: powderkeg could be evolved to not need ouroboros (the part that requires tools.jar) when AOT-compiled. However, at the moment ouroboros is started in all cases.
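A hypothetical sketch of that evolution (not powderkeg's current behaviour): gate the ouroboros load on the attach API actually being present, so an AOT-compiled job running on a plain JRE skips self-instrumentation entirely:

```clojure
;; Hypothetical sketch — not powderkeg's current behaviour.
;; Load ouroboros (the part needing tools.jar) only when the attach API exists.
(defn attach-api-available? []
  (try
    (Class/forName "com.sun.tools.attach.VirtualMachine")
    true
    (catch ClassNotFoundException _ false)))

(if (attach-api-available?)
  (require 'powderkeg.ouroboros)  ; REPL-on-JDK case: instrument as today
  (println "attach API not found; skipping self-instrumentation"))
```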

shzhng commented 7 years ago

@cgrand that would probably be the ideal case. I need to crawl the code a bit more to understand it.