bisnu-sarkar-inverseai opened 1 month ago
Please see https://github.com/google-ai-edge/ai-edge-torch/blob/main/docs/pytorch_converter/README.md#debugging--reporting-errors for some tips for debugging conversion errors.
In this case you appear to be encountering an error during torch.export, and the model source needs to be modified.
Thanks for your reply. I am getting this issue during torch.export.export().
Hi, just to follow up here. Are you certain you need to inherit from TransformerEncoderLayer? That is, do you intend to use NestedTensors? If not, you might be able to accomplish your goal with a more "manual" approach, inheriting from nn.Module instead.
In terms of creating an encoder, please see examples/t5/t5.py for an example of authoring and converting a transformer-based encoder. It has T5-specific pieces, obviously, but you should be able to strip it down to something that is essentially a BERT encoder.
Description of the bug:
I'm experiencing an error when trying to convert my PyTorch model to TensorFlow Lite using the ai-edge-torch library. The error occurs when _sa_block is called from torch.nn.TransformerEncoderLayer. Below is the portion of my model causing the issue.
Actual vs expected behavior:
When calling _sa_block from torch.nn.TransformerEncoderLayer, I get the following error:
torch._dynamo.exc.Unsupported: call_method NNModuleVariable() _sa_block [TensorVariable(), LazyVariableTracker(), LazyVariableTracker()] {}
Any other information you'd like to share?
Thank you for your attention to this matter. I look forward to your response and any guidance you can provide.