tjake / Jlama

Jlama is a modern LLM inference engine for Java
Apache License 2.0

Apple Silicon native compilation #40

Closed sydneypdx closed 2 months ago

sydneypdx commented 2 months ago

To make this work for me, I had to get the Apple Silicon build of jextract (Java 21) and add this to the bin/jextract script:

JLINK_VM_OPTIONS=--enable-native-access=org.openjdk.jextract

and make sure the resulting library was named "libjlama.dylib" and its directory was on the java.library.path when running the app. I also had to comment out all of the test code in the main pom.xml (though I'm working off an older snapshot, so maybe the JUnit tests pass now on Apple M2).
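The workaround above can be sketched as two pieces: a one-line change in the jextract launcher script, and the library path passed at run time. This is a sketch, not a verified recipe; the exact placement of the variable inside bin/jextract, the `/path/to/native/libs` directory, and the `jlama.jar` name are assumptions for illustration.

```shell
# 1. In bin/jextract (from the Java 21 jextract distribution), grant the
#    jextract module native access so it can generate the bindings:
JLINK_VM_OPTIONS=--enable-native-access=org.openjdk.jextract

# 2. When launching the app, ensure the native library is named
#    libjlama.dylib and that its directory is on java.library.path
#    (paths and jar name below are hypothetical):
java -Djava.library.path=/path/to/native/libs -jar jlama.jar
```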

tjake commented 2 months ago

You shouldn't run the jextract script. That's just for development.

The jlama-native module builds the native lib.
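Since jlama-native is the supported way to get the native library, the usual Maven approach would be to build just that module and its dependencies. A sketch, assuming a standard multi-module Maven layout with a wrapper script; the module id comes from the comment above, but the wrapper name and flags are assumptions:

```shell
# Build only the jlama-native module, plus the modules it depends on.
# -pl selects the module, -am ("also make") builds its dependencies;
# both are standard Maven flags.
./mvnw clean package -pl jlama-native -am
```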