ghost opened this issue 1 year ago
Thanks for the query. Since there's no one on the other side of this issue, we're closing it for now. Feel free to open a new issue if you'd like to bring it up again.
We can leave this open for now; I am trying to connect it to someone internally.
(But as mentioned on Discord, the IREE team is not planning to work on this in 2023.)
Sorry, I was using the wrong account, so I closed that one. This account is the POC for this FR.
Request description
My company is interested in experimenting with, and potentially adopting, OpenXLA/IREE in our model inference stack. One blocker is that all of our major models contain at least one TF custom op, so we can't easily experiment with IREE. I hope a TensorFlowOpModule can be formally supported, so that a TF model with custom ops can be compiled and executed with IREE out of the box. Below I list out all the impacts of this project from my perspective:
What component(s) does this issue relate to?
No response
Additional context
No response