nvdla / sw


No support for tensorflow? #147

Open BigFatFlo opened 5 years ago

BigFatFlo commented 5 years ago

Hello,

I was happy to learn that an nvdla_small-compatible compiler version has been released, but I'm surprised that it still can't handle TensorFlow models as input. Is that feature still in the pipeline, or has it been abandoned? There are virtually no reliable converters from TensorFlow to Caffe (caffemodel + prototxt), so this is a real problem, especially since Caffe is no longer a popular framework.

Thank you.

fisherxue commented 5 years ago

Is it possible to use TensorRT for your application?

https://devtalk.nvidia.com/default/topic/1043020/jetson-agx-xavier/running-tensorflow-program-on-igpu-dla-solved-/post/5290874/#5290874

That thread indicates that TensorRT should allow you to run your TensorFlow models on the DLA.
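For reference, the TensorRT-on-DLA route usually means exporting the TensorFlow model to a format TensorRT can parse (e.g. ONNX via tf2onnx) and then building the engine with DLA enabled. A rough sketch, assuming a Jetson-class device with TensorRT's `trtexec` tool installed; the model paths and the DLA core index are placeholders:

```shell
# Export a TensorFlow SavedModel to ONNX (tf2onnx is a separate pip package).
python -m tf2onnx.convert --saved-model ./my_saved_model --output model.onnx

# Build and run a TensorRT engine on DLA core 0, falling back to the GPU
# for any layers the DLA does not support.
trtexec --onnx=model.onnx --useDLACore=0 --allowGPUFallback
```

Note that this runs the model through TensorRT's runtime on NVIDIA's integrated DLA; it does not produce an NVDLA loadable for the open-source nvdla/sw compiler, which is the distinction BigFatFlo raises below.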

BigFatFlo commented 5 years ago

Unfortunately, no, I can't: we're developing a custom embedded application. What I don't understand is this: if TensorRT can in fact run TensorFlow models on the DLA, then a compiler that accepts TensorFlow models as input must already exist, so why hasn't it been released?