After I figured out how to compile TFRT on Windows, the next question is how to use it with real pretrained models. I have several of them in TF.js and TFLite formats.
The Bazel build generated two executables: tfrt_translate and bef_executor.
The first requires an MLIR script; the second, a BEF file. So I cannot run inference on my models directly.
But how can I convert my models to MLIR? Googling didn't turn up proper answers to this.
And without that first step, I cannot use bef_executor...
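For reference, the only workflow I have found documented is the hello-world one, which goes MLIR script → BEF file → executor. A sketch of what I mean (paths and flags follow the TFRT README's example; your build output locations may differ):

```shell
# Translate a hand-written MLIR script into BEF, then run it.
# hello.mlir is a placeholder name, not one of my models.
bazel-bin/tools/tfrt_translate -mlir-to-bef hello.mlir > hello.bef
bazel-bin/tools/bef_executor hello.bef
```

What I am missing is the step that produces the .mlir input from a saved model in the first place.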
Any suggestions please?