Open ghost opened 7 years ago
On Ubuntu you must additionally install openjdk-8-jdk-headless.
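For reference, a typical way to pull that package in on Ubuntu (the `-y` flag is just for non-interactive installs; requires root/network, so treat this as a sketch):

```shell
# Install the headless JDK, which ships lib/tools.jar (a JRE does not)
sudo apt-get update
sudo apt-get install -y openjdk-8-jdk-headless
```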
In retrospect this is highly undesirable. Is there some way to download tools.jar by specifying a dependency?
tools.jar is part of the JDK only and is not shipped with the JRE. I don't know an official way to get it, so I assume that you are stuck with a JRE on your nodes. A possibility would be to have locally a tools.jar from a JDK matching the nodes' JRE and pass `--jars nodes-tools.jar` to spark-submit (and modify how powderkeg locates tools.jar). Would it be enough for you?
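A hedged sketch of what that invocation could look like (the jar path, main class, and application jar name below are placeholders I made up, not taken from powderkeg's docs; spark-submit's flag for shipping extra jars is `--jars`):

```shell
# Ship a tools.jar matching the cluster's JRE alongside the application jar
spark-submit \
  --jars /path/to/nodes-tools.jar \
  --class my.app.core \
  my-app-standalone.jar
```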
Ideally we could use tools.jar as a maven dependency. That way we are not dictating infrastructure.
Well, tools.jar is tightly coupled to the JDK/JRE (version and platform). If you can run Maven you have a JDK (else no javac), so it would be akin to doing a cross-compilation (fetching the tools.jar matching the JRE deployed on the cluster), and I don't know of a repo where tools.jar is available. Do you?
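To illustrate the coupling: on a JDK 8 layout, the `java.home` system property points at the JRE directory embedded inside the JDK, so tools.jar sits one level up under `lib/`; on a JRE-only install that file simply doesn't exist. A minimal sketch, assuming a JDK 8 directory layout (the class name is mine, not from powderkeg):

```java
import java.io.File;

public class ToolsJarLocator {
    // Resolve where tools.jar would live for the running JVM (JDK 8 layout:
    // java.home is the embedded JRE, so tools.jar is at ../lib/tools.jar).
    public static File locate() {
        File javaHome = new File(System.getProperty("java.home"));
        return new File(javaHome.getParentFile(), "lib" + File.separator + "tools.jar");
    }

    public static void main(String[] args) {
        File toolsJar = locate();
        // Prints the candidate path; exists() is false on a bare JRE
        System.out.println(toolsJar.getPath() + " exists=" + toolsJar.exists());
    }
}
```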
I see what you mean. While it seems theoretically possible to detect the appropriate version and to dynamically add it to the dependencies, I see no attempts to do that out there.
Also note that tools.jar is required only on the driver node, not on the workers.
Does this issue only apply when you need a REPL environment for using powderkeg? Correct me if I'm wrong in assuming it should be a non-issue when deploying an AOT-compiled application that simply uses powderkeg as a Spark Clojure API.
@shzhng you are correct; powderkeg could be evolved to not need ouroboros (which is the component requiring tools.jar) when AOT compiled. However, at the moment it's started in all cases.
@cgrand that would probably be the ideal case. I need to crawl the code a bit more to understand it.