Hello, is there any plan to integrate dlprimitives into the TensorFlow PluggableDevice plugin interface? https://discuss.tensorflow.org/t/fpgas-for-plugable-device/2377/2
Good question.
In short: yes, it is on the roadmap. When will I get to it? I don't know yet.
Currently I'm focused on getting the PyTorch backend running: https://github.com/artyom-beilis/pytorch_dlprim
Details:
I'm aware of PluggableDevice, and I even built an example I found. However, at this point it is much easier for me to work on PyTorch integration:
The TF PluggableDevice interface is poorly documented and lacks useful examples (there is only a single one, which, by the way, didn't work with nightly TF builds).
The TF interface is C-based, which makes it much more complicated to wrap any particular function. For example, in PyTorch, adding forward and backward propagation for an activation function is a matter of 24 lines:
https://github.com/artyom-beilis/pytorch_dlprim/blob/master/op.cpp#L1001
plus 2 additional lines for registration: https://github.com/artyom-beilis/pytorch_dlprim/blob/master/op.cpp#L1063, with C++ exceptions/errors being propagated up to the Python level without writing a single line of code.
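For illustration, here is a minimal sketch of what such a registration looks like (this is not the actual op.cpp code; my_relu is a placeholder kernel, and PrivateUse1 is the dispatch key PyTorch reserves for out-of-tree devices):

```cpp
// Minimal sketch (not the actual op.cpp code) of how an out-of-tree
// PyTorch backend hooks an operator into the dispatcher.
#include <torch/library.h>
#include <ATen/ATen.h>

at::Tensor my_relu(const at::Tensor& self) {
  // ... launch the dlprimitives/OpenCL kernel on `self` here ...
  // Any C++ exception thrown here surfaces in Python automatically.
  return self.clone(); // placeholder result
}

// Registration: one line per operator under the PrivateUse1 key,
// which PyTorch reserves for out-of-tree devices.
TORCH_LIBRARY_IMPL(aten, PrivateUse1, m) {
  m.impl("relu", my_relu);
}
```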
TF, by contrast, requires about 90 lines for forward propagation alone: https://github.com/jzhoulon/community/blob/plugin_tutorial/rfcs/20200624-pluggable-device-for-tensorflow/sample/tensorflow_plugin/src/kernels/cpu/relu_op.cc
To do the same in TF, I would first need to develop infrastructure that makes writing kernels easy.
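For comparison, here is a heavily trimmed sketch of the same idea through the TF C kernel API from tensorflow/c/kernels.h; the device name "MY_DEVICE" and the function names are placeholders, and a real kernel still needs output allocation, shape checks, and status plumbing, which is where the ~90 lines come from:

```cpp
// Trimmed sketch of a TF C-API kernel; everything (errors, tensors,
// lifetimes) is managed manually through C calls.
#include "tensorflow/c/kernels.h"
#include "tensorflow/c/tf_status.h"
#include "tensorflow/c/tf_tensor.h"

static void ReluCompute(void* kernel, TF_OpKernelContext* ctx) {
  TF_Status* status = TF_NewStatus();
  TF_Tensor* input = nullptr;
  TF_GetInput(ctx, 0, &input, status);
  // ... allocate the output tensor, launch the device kernel,
  // and report failures through TF_Status by hand ...
  TF_DeleteTensor(input);
  TF_DeleteStatus(status);
}

void RegisterReluKernel() {
  TF_Status* status = TF_NewStatus();
  TF_KernelBuilder* builder = TF_NewKernelBuilder(
      "Relu", "MY_DEVICE", /*create_func=*/nullptr,
      &ReluCompute, /*delete_func=*/nullptr);
  TF_RegisterKernelBuilder("ReluMyDevice", builder, status);
  TF_DeleteStatus(status);
}
```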
In dlprimitives I still focus on the channels-first tensor format (NCHW) used by most frameworks like PyTorch, Caffe, and MXNet, whereas TF defaults to channels-last (NHWC). That is yet another thing I need to develop, besides larger operator coverage, float16/bfloat16 support, better optimizations for Intel/ARM GPUs, etc.
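For context, the two layouts differ only in how the linear memory offset is computed, but every kernel's access pattern, and therefore its optimal OpenCL code, depends on that choice; a minimal sketch:

```cpp
// Linear offset of element (n, c, h, w) in a tensor of shape
// N x C x H x W under each layout.
#include <cstddef>

inline size_t index_nchw(size_t n, size_t c, size_t h, size_t w,
                         size_t C, size_t H, size_t W) {
  return ((n * C + c) * H + h) * W + w; // channels-first (PyTorch, Caffe)
}

inline size_t index_nhwc(size_t n, size_t c, size_t h, size_t w,
                         size_t C, size_t H, size_t W) {
  return ((n * H + h) * W + w) * C + c; // channels-last (TF default)
}
```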
So yes, it is on the roadmap, but I'm not sure I'm going to get to it soon. I hope that once dlprimitives gains more popularity, I'll have more contributors and it will be easier to create a TF PluggableDevice based on dlprimitives.