Open antiagainst opened 3 years ago
Bumped into this issue trying to import a TF ResNet50 model. This is very frustrating and discouraging for new users like me, who know PyTorch but aren't familiar with TF, yet want to evaluate IREE's full performance (which isn't yet possible with PyTorch models). IMHO the documented workflow (https://iree-org.github.io/iree/getting-started/tensorflow/) is too brittle and doesn't offer any hints on how to debug such problems.
Thanks for the feedback, I feel your frustration too :(
This sample (mentioned on this issue also) shows a bit of how to debug issues importing TF programs: https://colab.research.google.com/github/iree-org/iree/blob/main/samples/colab/tensorflow_hub_import.ipynb, and I agree that the entire workflow is brittle. Some of that can be improved with documentation and smarter tools.
The TensorFlow Lite path is more stable: https://iree-org.github.io/iree/getting-started/tflite/, and I'd recommend it for common model architectures like ResNet50.
Discovered a few sharp corners when refreshing IREE docs; listing them here so @silvasean can do magic. :)
1) Right now `iree-tf-import` requires manually specifying the SavedModel version. If the wrong version is given, an empty `module` op is generated without any hints, which is not very user friendly. We should probably deduce the version automatically, or at the very least warn the user when the output is just an empty `module` op.
2) SavedModels may or may not contain serving signatures / entry points for the IREE compilation flow. It would be nice to have a way to dump all available serving signatures without requiring the user to load the SavedModel in Python. If there are no serving signatures, giving the user links on how to re-export the model could save quite some head-scratching time.
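To illustrate point 2), here's a minimal sketch (the `Adder` module is a hypothetical toy model, not from this thread) of exporting a SavedModel with an explicit serving signature and then listing the signatures it contains. Forgetting the `signatures=` argument when saving is one common way to end up with a SavedModel that has no entry points for IREE to import.

```python
import tempfile
import tensorflow as tf

# Hypothetical toy module standing in for a real model.
class Adder(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([4], tf.float32)])
    def add_one(self, x):
        return x + 1.0

m = Adder()
export_dir = tempfile.mkdtemp()

# Export with an explicit serving signature; without `signatures=`,
# the resulting SavedModel may have no serving signatures at all.
tf.saved_model.save(m, export_dir,
                    signatures={"serving_default": m.add_one})

# Inspect which serving signatures the SavedModel actually exposes.
loaded = tf.saved_model.load(export_dir)
print(sorted(loaded.signatures.keys()))  # -> ['serving_default']
```

For the no-Python route, the `saved_model_cli show --dir <path> --all` tool that ships with TensorFlow dumps the same signature information from the command line.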