Open tipame opened 1 year ago
Thanks for pointing out these good enhancements, including the previous issue #2331. Currently we don't have dynamic creation of JNI bindings yet; all functions need to be added statically. But it is straightforward. If you want, you can try to add these enhancements in a similar way as in #2332. Basically it consists of:
./gradlew cleanJNI; \
./gradlew :engines:pytorch:pytorch-jni:clean; \
./gradlew :engines:pytorch:pytorch-native:clean; \
./gradlew :engines:pytorch:pytorch-native:compileJNI; \
\
./gradlew :engines:pytorch:pytorch-jni:clean :engines:pytorch:pytorch-jni:build; \
./gradlew :engines:pytorch:pytorch-engine:clean :engines:pytorch:pytorch-engine:build;
(This may be a little slow. Consider Update 2 below instead.)
In the above gradle command lines, pytorch-jni:build and pytorch-engine:build are called. This also requires completing the javadocs, which is unnecessary during experiments. See Update 2 or Update 3.
Update: the above commands have been tested on Linux and macOS. Also, gradle is aliased to ./gradlew.
Update 2: the suggestion is to do the following instead:
./gradlew cleanJNI; \
./gradlew :engines:pytorch:pytorch-jni:clean; \
./gradlew :engines:pytorch:pytorch-native:clean; \
./gradlew :engines:pytorch:pytorch-native:compileJNI; \
\
./gradlew :engines:pytorch:pytorch-jni:clean :engines:pytorch:pytorch-jni:jar; \
./gradlew :engines:pytorch:pytorch-engine:clean :engines:pytorch:pytorch-engine:jar;
Then the next necessary step is to "Rebuild Module djl.engines.pytorch.pytorch-jni" and "Rebuild Module djl.engines.pytorch.pytorch-engine". This action can be found by right-clicking the corresponding module directory in the Project tab if you are using the IntelliJ IDE.
Update 3: sometimes the method in Update 2 doesn't work, in the sense that the updates to the JNI remain ineffective after the steps there. Here is another workaround to test the JNI without having to complete all the javadocs. You can create a clean git branch, update the JNI, and then run the commands above with pytorch-engine:build, i.e.
./gradlew cleanJNI; \
./gradlew :engines:pytorch:pytorch-jni:clean; \
./gradlew :engines:pytorch:pytorch-native:clean; \
./gradlew :engines:pytorch:pytorch-native:compileJNI; \
\
./gradlew :engines:pytorch:pytorch-jni:clean :engines:pytorch:pytorch-jni:build; \
./gradlew :engines:pytorch:pytorch-engine:clean :engines:pytorch:pytorch-engine:build;
Here you don't have to finish the javadocs. Then switch back to the original git branch, where the JNI updates are still effective.
Appendix: in case the JNI is supposed to run on GPU, the compilation needs to be the following:
./gradlew :engines:pytorch:pytorch-native:cleanJNI ;\
./gradlew :engines:pytorch:pytorch-native:compileJNI -Pcu11
Documentation of the API and unit tests.
Format and PR tests: run the following tasks:
./gradlew fJ fC checkstyleMain checkstyleTest pmdMain pmdTest ;\
./gradlew verifyJava ;\
./gradlew test
You can try this out first, and we will assist.
Hello, I'm trying to test a newly added function for PyTorch. I successfully compiled the native binaries (DLLs for Windows), but I can't find out how to package them into a jar and publish them to the local Maven repo. As I see it, it should be pytorch-native-cpu:1.13.1:win-x86_64 (in my case). How can I build it and publish it to the local repo for testing?
Found the cache dir with the binaries in the user home.
@tipame djl_torch.dll will be copied to the cache directory during the DJL unit tests. You need to run gradle clean for the pytorch-jni and pytorch-engine modules to avoid using an old version.
cd pytorch-jni
# publish to build/repo folder
gradlew clean publish
# publish to maven local (.m2 folder)
gradlew clean pTML
@tipame Sorry, I forgot to mention that the gradle commands are aliases for ./gradlew. This is now fixed. After successfully running the above commands, the JNI will be compiled and the updated local jar is also created. The Java API can then immediately call these JNI functions. On the other hand, if there is an issue, the Java API will probably run into a linking error, or keep using the old jar file (pytorch-jni) and not show the new feature.
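The linking failure mentioned above can be reproduced in isolation. The following is a minimal self-contained sketch (not DJL code; the method name is made up): calling a Java native method whose implementation was never loaded throws UnsatisfiedLinkError at call time, which is the same failure mode you see when the Java API is linked against a stale pytorch-jni jar that lacks a newly added JNI entry point.

```java
public class StaleJniDemo {
    // Native method with no loaded implementation; calling it fails at
    // link time, like calling a new JNI entry point the old jar lacks.
    private static native void newlyAddedOp();

    public static void main(String[] args) {
        try {
            newlyAddedOp();
            System.out.println("linked ok");
        } catch (UnsatisfiedLinkError e) {
            System.out.println("caught UnsatisfiedLinkError");
        }
    }
}
```

If you see this error after rebuilding, it usually means the old jar is still on the classpath and the clean steps above were skipped.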
Description
Hello, there are a lot of functions from the torch package that are not implemented (wrapped) in DJL, for example the batch matrix-matrix product operation torch.bmm. It looks like it should be wrapped in the NDArrays class.
Maybe there is some way I can dynamically create a JNI mapping for a required function?
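For readers unfamiliar with the op: torch.bmm multiplies two equally sized batches of matrices pairwise, (b, n, m) x (b, m, p) -> (b, n, p). A plain-Java sketch of the semantics being requested (illustrative only; this is neither DJL nor PyTorch code):

```java
public class BmmDemo {
    // bmm: out[i] = a[i] x b[i] for each matrix pair i in the batch.
    static double[][][] bmm(double[][][] a, double[][][] b) {
        int batch = a.length, n = a[0].length, m = a[0][0].length, p = b[0][0].length;
        double[][][] out = new double[batch][n][p];
        for (int i = 0; i < batch; i++)
            for (int r = 0; r < n; r++)
                for (int c = 0; c < p; c++)
                    for (int k = 0; k < m; k++)
                        out[i][r][c] += a[i][r][k] * b[i][k][c];
        return out;
    }

    public static void main(String[] args) {
        double[][][] a = {{{1, 2}, {3, 4}}}; // batch=1, 2x2 matrices
        double[][][] b = {{{5, 6}, {7, 8}}};
        double[][][] c = bmm(a, b);
        System.out.println(c[0][0][0] + " " + c[0][0][1]); // 19.0 22.0
        System.out.println(c[0][1][0] + " " + c[0][1][1]); // 43.0 50.0
    }
}
```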
References
https://pytorch.org/docs/stable/generated/torch.bmm.html#torch.bmm