deephacks / lmdbjni

LMDB for Java
Apache License 2.0

java.lang.UnsatisfiedLinkError: org.fusesource.lmdbjni.JNI.init() #38

Closed: jrabary closed this issue 8 years ago

jrabary commented 9 years ago

Hi there,

I'm trying to use lmdbjni with Spark, Scala, and sbt. My code works fine when I run it inside IDEA. The problem comes when I build a jar of my code with "sbt assembly": at execution time I get java.lang.UnsatisfiedLinkError: org.fusesource.lmdbjni.JNI.init(). I do have both lmdbjni and lmdbjni-linux64 installed.
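For reference, having both artifacts in build.sbt would look roughly like this (the coordinates follow the project README; the version shown is only an example, not necessarily the one used here):

libraryDependencies ++= Seq(
  "org.deephacks.lmdbjni" % "lmdbjni" % "0.4.6",          // Java API, pulls in the hawtjni runtime
  "org.deephacks.lmdbjni" % "lmdbjni-linux64" % "0.4.6"   // native .so packaged inside the jar
)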

krisskross commented 9 years ago

Interesting use case.

There shouldn't be any difference. Are you sure lmdbjni-linux64 is on the classpath? You can try enabling -verbose:class when the JVM is started to verify that the jar is actually loaded.
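If you run through sbt, one way to pass the flag (a sketch, using sbt 0.13-style settings like the rest of this thread) is to fork the JVM and add it to javaOptions, since javaOptions only take effect in a forked JVM:

fork in run := true
javaOptions in run += "-verbose:class"

If the job is launched with spark-submit instead, the same flag can be passed to the driver JVM via --driver-java-options.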

jrabary commented 9 years ago

Here is what I get with -verbose:class (full paths to the jar elided):

[Loaded org.fusesource.lmdbjni.NativeObject from file:/*.jar]
[Loaded org.fusesource.lmdbjni.Env from file:/*.jar]
[Loaded org.fusesource.lmdbjni.LMDBException from file:/*.jar]
[Loaded org.fusesource.lmdbjni.JNI from file:/*.jar]
[Loaded org.fusesource.hawtjni.runtime.Library from file:/*.jar]

I don't see lmdbjni-linux64, but I do have it in my build.sbt.

jrabary commented 9 years ago

I was able to make it work by loading lmdbjni explicitly in my code with System.loadLibrary("lmdbjni"), roughly as in the sketch below. What does that mean?
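Concretely, the workaround looks something like this (a sketch: the database path and usage are placeholders, and System.loadLibrary requires liblmdbjni.so to be on java.library.path):

import org.fusesource.lmdbjni.Env

object LmdbWorkaround {
  def main(args: Array[String]): Unit = {
    // Load the native library explicitly before any lmdbjni class triggers
    // hawtjni's automatic loader.
    System.loadLibrary("lmdbjni")

    val env = new Env("/tmp/lmdb-test") // hypothetical database directory
    try {
      // ... open databases and do work here
    } finally {
      env.close()
    }
  }
}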

krisskross commented 9 years ago

It probably means that sbt screwed up your classpath somehow.

You could try debug-stepping through org.fusesource.hawtjni.runtime.Library.doLoad() to see how the library-loading mechanism works. Usually it should just find the .so file under META-INF in the linux64 jar.

It would be interesting to hear why it won't load automatically.
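A quick way to check whether the native library survived assembly (a sketch; the resource path assumes hawtjni's META-INF/native/<platform> layout and the usual liblmdbjni.so file name, both of which are assumptions here):

object NativeCheck {
  def main(args: Array[String]): Unit = {
    // Assumed hawtjni convention: META-INF/native/<platform>/<library file>
    val path = "META-INF/native/linux64/liblmdbjni.so"
    val url = getClass.getClassLoader.getResource(path)
    println(if (url == null) s"$path is missing from the classpath" else s"found: $url")
  }
}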

jrabary commented 9 years ago

In my build.sbt I have the following:

assemblyMergeStrategy in assembly := {
  case PathList("javax", "servlet", xs @ _*) => MergeStrategy.last
  case PathList("org", "apache", xs @ _*) => MergeStrategy.last
  case PathList("org", "jboss", xs @ _*) => MergeStrategy.last
  case m if m.startsWith("META-INF") => MergeStrategy.discard
  case _ => MergeStrategy.first
}

I don't really understand how this works, but META-INF is discarded. Maybe that makes sbt screw up the classpath?

krisskross commented 9 years ago

Yes, probably. I haven't used sbt, but maybe it's removed by the deduplication mechanism?
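If the discard rule is indeed dropping the native library, one way to keep META-INF/native while still discarding the rest might look like this (a sketch, untested in this thread; the native case must come before the discard case):

assemblyMergeStrategy in assembly := {
  // Keep hawtjni's native libraries in the assembled jar
  case PathList("META-INF", "native", xs @ _*) => MergeStrategy.first
  case m if m.startsWith("META-INF") => MergeStrategy.discard
  case _ => MergeStrategy.first
}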

krisskross commented 9 years ago

Did you manage to find the problem?

jrabary commented 9 years ago

No, still searching for a solution.

krisskross commented 8 years ago

Are you still searching? Otherwise I would like to close this issue.

krisskross commented 8 years ago

Closing due to inactivity.

Hitobat commented 7 years ago

For the benefit of people who find this issue while searching (like I did):

The problem is that Spark (or maybe Hadoop) has a dependency on org.fusesource.leveldbjni:leveldbjni-all:1.8. This jar contains an older version of the class org.fusesource.hawtjni.runtime.Library which does not contain the method getArchSpecifcResourcePath() and therefore never tries to load the arch-specific variant of the LMDB library (i.e. it will try linux but not linux64).

The solution is to shade these classes when performing the assembly stage. Add the following to your build.sbt, changing the UNIQUE part as necessary. The standard convention (I think?) is to use your company prefix, e.g. UNIQUE becomes com.mycompany, but you can also just leave it as-is.

assemblyShadeRules in assembly := Seq(
  // Rename the hawtjni runtime package so lmdbjni does not pick up the
  // older copy bundled in leveldbjni-all
  ShadeRule.rename("org.fusesource.hawtjni.runtime.**" -> "shade.UNIQUE.org.fusesource.hawtjni.runtime.@1").inAll
)