deepjavalibrary / djl

An Engine-Agnostic Deep Learning Framework in Java
https://djl.ai
Apache License 2.0

Create a custom operator (in MXNet: CustomOp) that supports autograd in Java or C++ #140

Open tribbloid opened 4 years ago

tribbloid commented 4 years ago

Description

This feature should be similar to the following two examples, which are achieved using only the Python API [1][2].

The feature will introduce the capability of defining new differentiable operators over scalars and NDArrays in Java or C++. Many operators cannot be synthesised from the existing NDArray API (e.g. sine, cosine, the discrete Fourier transform), even though they are used frequently in production and their gradients are well known.
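To make the request concrete, here is a rough sketch of what a user-facing interface for such an operator could look like on the Java side. This is purely illustrative: the `CustomOp` interface and its `forward`/`backward` methods are hypothetical and not part of DJL; only the `NDArray`/`NDList` types and the methods called on them are existing API.

```java
import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDList;

/**
 * Hypothetical interface for a user-defined differentiable operator.
 * forward() computes the outputs; backward() receives the gradients of
 * the loss with respect to the outputs and returns the gradients with
 * respect to the inputs, so the engine's autograd can chain through it.
 */
interface CustomOp {

    /** Compute the operator's outputs from its inputs. */
    NDList forward(NDList inputs);

    /** Given dL/d(outputs), return dL/d(inputs). */
    NDList backward(NDList inputs, NDList outputs, NDList outputGradients);
}

/** Example: a sine operator whose input gradient is cos(x) * dL/dy. */
class SineOp implements CustomOp {

    @Override
    public NDList forward(NDList inputs) {
        NDArray x = inputs.singletonOrThrow();
        return new NDList(x.sin());
    }

    @Override
    public NDList backward(NDList inputs, NDList outputs, NDList outputGradients) {
        NDArray x = inputs.singletonOrThrow();
        NDArray dy = outputGradients.singletonOrThrow();
        return new NDList(x.cos().mul(dy));
    }
}
```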

Will this change the current API? How?

I have only read the MXNet backend. My impression at the moment is that the existing NDArray API prioritises being engine agnostic, and most of its operators are generated from C++ code. This seems to indicate that the jnarator framework should be exposed as a compiler-level plugin for advanced users, who would write C++ implementations and headers for new functions and dynamically inject them into a DJL abstraction that yields an NDArray as output.
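As a usage sketch of that injection idea, the snippet below shows how such an operator, once backed by user-supplied C++ kernels, might be registered and called from user code while participating in DJL's existing gradient collection. The `CustomOpRegistry` class and its `register`/`invoke` methods are invented for illustration and do not exist; `NDManager`, `GradientCollector`, `setRequiresGradient`, `backward`, and `getGradient` are real DJL API.

```java
import ai.djl.engine.Engine;
import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDManager;
import ai.djl.ndarray.types.Shape;
import ai.djl.training.GradientCollector;

public final class CustomOpUsage {

    public static void main(String[] args) {
        try (NDManager manager = NDManager.newBaseManager();
                GradientCollector collector = Engine.getInstance().newGradientCollector()) {

            NDArray x = manager.randomUniform(0f, 1f, new Shape(8, 128));
            x.setRequiresGradient(true);

            // Hypothetical: load and register the natively implemented operator
            // (forward + backward kernels compiled from user-supplied C++).
            // CustomOpRegistry.register("my_dft", "libmy_dft_ops.so");

            // Hypothetical: invoke it like any other NDArray-producing op.
            // NDArray y = CustomOpRegistry.invoke("my_dft", x);

            // The rest is existing DJL API: the output would take part in
            // backward(), and x.getGradient() would return dL/dx.
            // collector.backward(y.sum());
            // NDArray grad = x.getGradient();
        }
    }
}
```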

Who will benefit from this enhancement?

Research scientists who handcraft autograd kernels; ML engineers who frequently use DFT layers for feature extraction, data augmentation, or rotation invariance; and performance optimisation engineers who want to accelerate convolution layers with the Winograd algorithm.

References

[1] https://mxnet.apache.org/versions/1.6/api/python/docs/tutorials/extend/customop.html

[2] https://pytorch.org/docs/stable/notes/extending.html

[3] https://github.com/apache/incubator-mxnet/issues/12045

jameszow2 commented 4 years ago

Thank you.

frankfliu commented 4 years ago

@tribbloid We will consider prioritising this feature.

tribbloid commented 4 years ago

Thanks a lot! Also, thanks to @lanking520 for introducing this amazing work!