dungng27 opened 1 year ago
It turns out that I can rewrite these ops using torch functions rather than Tensor methods, and then register those. The problem is solved!
This is just a temporary fix, though, and it isn't applicable to other ops. I wonder if there is a better workaround.
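A dummy example of the kind of rewrite described above (the module and the split sizes here are illustrative, not taken from the original model; only the Tensor-method to torch-function swap is the point):

```python
import torch

class WithTensorMethod(torch.nn.Module):
    def forward(self, x):
        # Tensor-method form: traces to aten::split_with_sizes
        return x.split([2, 2], dim=-1)

class WithTorchFunction(torch.nn.Module):
    def forward(self, x):
        # torch-function form of the same computation; this is the
        # variant that can then be registered
        return torch.split(x, [2, 2], dim=-1)

x = torch.arange(8.0).reshape(2, 4)
out_method = WithTensorMethod()(x)
out_function = WithTorchFunction()(x)

# Both forms produce identical outputs
assert all(torch.equal(a, b) for a, b in zip(out_method, out_function))
```

The two modules compute the same thing; only the surface form of the call changes, which is what lets the op be picked up for registration.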
@dungng27
> It turns out that I can modify these ops by using torch functions rather than tensor's functions and register these. The problem is solved!
Can you elaborate a little more on this with a dummy example? I'd really appreciate it.
I'm also stumped on this, with the specific operators:
aten::meshgrid
aten::split_with_sizes
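For these two ops, a sketch of torch-level equivalents (the tensors and shapes are made up; the `indexing` keyword assumes torch >= 1.10, and none of this has been verified against Vitis-AI):

```python
import torch

# aten::split_with_sizes: the Tensor-method form vs. the torch-function form
x = torch.arange(6.0)
parts_method = x.split_with_sizes([2, 4])
parts_function = torch.split(x, [2, 4])
assert all(torch.equal(a, b) for a, b in zip(parts_method, parts_function))

# aten::meshgrid only exists as a torch-level function; one way to avoid
# the op entirely is to express the same grids via broadcasting with expand
a, b = torch.arange(3.0), torch.arange(4.0)
grid_a, grid_b = torch.meshgrid(a, b, indexing="ij")
exp_a = a.unsqueeze(1).expand(3, 4)  # rows repeat a[i]
exp_b = b.unsqueeze(0).expand(3, 4)  # columns repeat b[j]
assert torch.equal(grid_a, exp_a) and torch.equal(grid_b, exp_b)
```

Whether the compiler accepts the `expand`-based rewrite in place of `aten::meshgrid` would still need to be checked against the target toolchain.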
Did you manage to add validation for a quantized model from MMEngine? Could you share it, please?
Hi, I'm trying to quantize and compile a PyTorch model that uses some ATen operations not yet supported by Vitis-AI. Specifically, while deploying the model (quantizing it in test mode), some errors occurred:
Here is the script I ran to quantize and deploy the model:
with the command:
I'm using Vitis-AI 2.5 for stability. I read the docs on Register Custom Operation, but I don't know how to apply that workflow to register these custom ATen ops. Could someone show me how? Many thanks.