Closed jrabary closed 8 years ago
Interesting use case.
There shouldn't be any difference. Are you sure lmdbjni-linux64 is on the classpath? You can try enabling -verbose:class
when the JVM is started to verify that the jar is actually loaded.
Here is what I get with -verbose:class
(modulo full path to jar file) :
[Loaded org.fusesource.lmdbjni.NativeObject from file:/*.jar]
[Loaded org.fusesource.lmdbjni.Env from file:/.jar]
[Loaded org.fusesource.lmdbjni.LMDBException from file:/.jar]
[Loaded org.fusesource.lmdbjni.JNI from file:/.jar]
[Loaded org.fusesource.hawtjni.runtime.Library from file:/.jar]
I don't see lmdbjni-linux64, but in my build.sbt I do have it.
I was able to make it work by loading lmdbjni explicitly in my code with System.loadLibrary("lmdbjni"). What does that mean?
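For later readers, the workaround above looks roughly like this in plain Java (class name is hypothetical; on a machine without liblmdbjni.so on java.library.path the load fails, which also surfaces the error early):

```java
public class ExplicitLoad {
    public static void main(String[] args) {
        try {
            // Force the native library to load eagerly, instead of relying
            // on hawtjni's automatic extraction when JNI.init() first runs.
            System.loadLibrary("lmdbjni");
            System.out.println("lmdbjni loaded");
        } catch (UnsatisfiedLinkError e) {
            // Without the library on java.library.path this throws,
            // surfacing the problem at startup rather than mid-job.
            System.out.println("load failed: " + e.getMessage());
        }
    }
}
```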
It probably means that sbt screwed up your classpath somehow.
You could try debug-step through org.fusesource.hawtjni.runtime.Library.doLoad()
and see how the mechanism for loading the library works. Usually it should just find the .so file under META-INF in the linux64 jar.
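One cheap check along those lines (the resource path below follows hawtjni's META-INF/native layout and is an assumption on my part, not taken from the thread):

```java
public class CheckNative {
    public static void main(String[] args) {
        // hawtjni extracts the native library from a resource under
        // META-INF/native in the platform jar; if this lookup returns null,
        // the linux64 jar's native payload never made it onto the classpath.
        String resource = "META-INF/native/linux64/liblmdbjni.so";
        java.net.URL url = CheckNative.class.getClassLoader().getResource(resource);
        System.out.println(url != null ? "found: " + url : "missing: " + resource);
    }
}
```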
It would be interesting to hear why it won't load automatically.
In my build.sbt I have the following line :
assemblyMergeStrategy in assembly := {
  case PathList("javax", "servlet", xs @ _*) => MergeStrategy.last
  case PathList("org", "apache", xs @ _*) => MergeStrategy.last
  case PathList("org", "jboss", xs @ _*) => MergeStrategy.last
  case m if m.startsWith("META-INF") => MergeStrategy.discard
  case _ => MergeStrategy.first
}
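Note for later readers: discarding all of META-INF also throws away the native .so that hawtjni looks for there. One untested sketch to avoid that (the exact PathList pattern is my assumption) is to special-case META-INF/native before the general discard:

```scala
assemblyMergeStrategy in assembly := {
  // Keep the native libraries that hawtjni loads from META-INF/native.
  case PathList("META-INF", "native", xs @ _*) => MergeStrategy.first
  // Discard the rest of META-INF (signatures, manifests) as before.
  case m if m.startsWith("META-INF") => MergeStrategy.discard
  case _ => MergeStrategy.first
}
```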
I don't really understand how it works, but META-INF is discarded. Maybe that makes sbt screw up the classpath?
Yes, probably. I haven't used sbt, but maybe it's removed by the deduplication mechanism?
Did you manage to find the problem?
No, still searching for a solution.
Are you still searching? Otherwise I would like to close this issue.
Closing. Inactivity.
For the benefit of people who find this issue while searching (like I did):
The problem is that Spark (or maybe Hadoop) has a dependency on org.fusesource.leveldbjni:leveldbjni-all:1.8. This JAR contains an older version of the class org.fusesource.hawtjni.runtime.JNI, which does not contain the method getArchSpecifcResourcePath() and so does not try to load this variant of the LMDB library (i.e. it will try linux but not linux64).
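A quick way to confirm which jar a class is actually loaded from (shown here probing the diagnostic class itself so the example is self-contained; in the real scenario you would probe org.fusesource.hawtjni.runtime.Library to see whether leveldbjni-all's older copy wins):

```java
public class WhichJar {
    public static void main(String[] args) {
        // Resolve the .class resource of a class to see which jar (or
        // directory) the classloader picked it up from. Substitute the
        // hawtjni Library class to diagnose the shadowing described above.
        Class<?> probe = WhichJar.class;
        java.net.URL location =
            probe.getResource("/" + probe.getName().replace('.', '/') + ".class");
        System.out.println(location);
    }
}
```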
The solution is to shade these classes when performing the assembly stage. Add the following to your build.sbt, changing the UNIQUE part as necessary. The standard convention (I think?) is to use your company prefix, e.g. UNIQUE becomes com.mycompany. Or you can just leave it as-is.
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("org.fusesource.hawtjni.runtime.**" -> "shade.UNIQUE.org.fusesource.hawtjni.runtime.@1").inAll
)
Hi there,
I'm trying to use lmdbjni with Spark, Scala and sbt. My code works well when I run it inside IDEA. The problem comes when I build a jar of my code with "sbt assembly". At execution time I get the following error: java.lang.UnsatisfiedLinkError: org.fusesource.lmdbjni.JNI.init(). I do have lmdbjni and lmdbjni-linux64 installed.