khlee369 opened this issue 2 years ago
I referred to the official tutorial, but I can't find an automatic conversion from a PyTorch module to an AIT module, like the ONNX -> TensorRT path.
https://facebookincubator.github.io/AITemplate/tutorial/how_to_infer_pt.html#define-a-pytorch-module
Is there any parser or converter that builds an AIT module directly from a PyTorch module?
It is not open sourced in this release. There is an FX2AIT project that will probably be open sourced in the future.
-- Bing Xu
Strongly agree with @khlee369 on this feature. Right now the conversion example for ResNet seems incredibly tedious, so I have low confidence I could make something work for a custom architecture (even though I have experience converting PyTorch to other frameworks like OpenVINO, CoreML, and ONNX). Until there's a more generic parser, I think it's going to be tough to convince people to switch.
Overall the pitch of the project (particularly avoiding TensorRT) sounds quite appealing, so I hope this progresses!
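For reference, the manual path the linked tutorial describes looks roughly like this: you rewrite the model by hand with AITemplate's frontend ops and then map the PyTorch weights over by name. This is only a sketch following the README-style example; the small PTNet/AITNet classes, shapes, and the "simple_net" name are illustrative, and the exact weight-setting API may differ between releases:
```python
from collections import OrderedDict

import torch
from aitemplate.compiler import compile_model
from aitemplate.frontend import nn, Tensor
from aitemplate.testing import detect_target

HIDDEN, BATCH = 512, 1024

class PTNet(torch.nn.Module):          # the existing PyTorch model
    def __init__(self):
        super().__init__()
        self.dense1 = torch.nn.Linear(HIDDEN, 4 * HIDDEN)
        self.dense2 = torch.nn.Linear(4 * HIDDEN, HIDDEN)

    def forward(self, x):
        return self.dense2(self.dense1(x))

class AITNet(nn.Module):               # the same graph, rewritten by hand with AIT ops
    def __init__(self):
        super().__init__()
        self.dense1 = nn.Linear(HIDDEN, 4 * HIDDEN)
        self.dense2 = nn.Linear(4 * HIDDEN, HIDDEN)

    def forward(self, x):
        return self.dense2(self.dense1(x))

def map_pt_params(ait_model, pt_model):
    # AIT parameter names use "_" where PyTorch uses "."; the mapping is manual
    ait_model.name_parameter_tensor()
    pt_params = dict(pt_model.named_parameters())
    return OrderedDict(
        (name.replace(".", "_"), pt_params[name])
        for name, _ in ait_model.named_parameters()
    )

pt_model = PTNet().cuda().half()
ait_model = AITNet()

# symbolic input/output tensors for the AIT graph
X = Tensor(shape=[BATCH, HIDDEN], name="X", dtype="float16", is_input=True)
Y = ait_model(X)
Y._attrs["is_output"] = True
Y._attrs["name"] = "Y"

# codegen + build, then copy the PyTorch weights in by name and run
module = compile_model(Y, detect_target(), "./tmp", "simple_net")
for name, weight in map_pt_params(ait_model, pt_model).items():
    module.set_constant_with_tensor(name, weight)

x = torch.randn(BATCH, HIDDEN).cuda().half()
y = torch.empty(BATCH, HIDDEN).cuda().half()
module.run_with_tensors([x], [y])
```
Every layer has to be re-declared and every parameter matched by hand, which is what makes the ResNet example feel tedious compared with a one-call ONNX -> TensorRT style export.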
I really want this feature. Also, do you have plans to support parsing frameworks other than PyTorch (e.g. ONNX) into an AITemplate model?
Now that PyTorch 2.0's TorchDynamo generates more accurate FX graphs, maybe an open-sourced FX2AIT would be even more helpful?
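To illustrate that point: the graph TorchDynamo captures (torch.fx.symbolic_trace is used below just for brevity) is a flat list of FX nodes, which is exactly what an FX-to-AIT lowering pass would walk and translate op by op. This is a minimal sketch and assumes nothing about FX2AIT's actual API:
```python
import torch
from torch.fx import symbolic_trace

class Net(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = torch.nn.Linear(16, 8)

    def forward(self, x):
        return torch.relu(self.fc(x))

# symbolic_trace yields the same kind of GraphModule that TorchDynamo produces
gm = symbolic_trace(Net())

for node in gm.graph.nodes:
    # each node carries (op, target, args); a PT->AIT converter would
    # pattern-match these and emit AITemplate frontend ops instead
    print(node.op, node.target)
# prints roughly: placeholder x / call_module fc / call_function relu / output output
```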