BigFatFlo opened 5 years ago
Is it possible to use TensorRT for your application?
Its documentation indicates that TensorRT should allow you to run your TensorFlow models on the DLA.
Unfortunately, no, we can't: we're developing a custom embedded application. What I don't understand is this: if TensorRT actually allows TensorFlow models to run on the DLA, then a compiler that takes TensorFlow models as input must already exist, so why isn't it released?
Hello,
I was happy to learn that an nvdla_small-compatible compiler version has been released, but I'm surprised that it still can't handle TensorFlow models as input. Is that feature still in the pipeline, or has it been abandoned? There are virtually no converters that reliably go from TensorFlow to Caffe (caffemodel + prototxt), so this is a real problem, especially since Caffe is no longer a popular framework.
Thank you.