Closed: XA23i closed this issue 4 months ago
Thanks for your interest and question. Please check our supported model list here.
Thank you for the quick reply. By the way, what can I do to quantize models that are not on the supported list?
Yes, you can quantize them, as long as the model is a standard PyTorch-based model. @XA23i
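For example, something along these lines should work for a custom model. This is just a minimal sketch based on the public ModelOpt examples, not this thread: the `mtq.INT8_DEFAULT_CFG` config and the calibration `forward_loop` are assumptions you would adapt to your own model and data.

```python
import torch
import torch.nn as nn
import modelopt.torch.quantization as mtq

# Any standard PyTorch nn.Module, not necessarily on the supported model list.
model = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 10))

# Calibration batches used to collect activation statistics (placeholder data).
calib_data = [torch.randn(32, 128) for _ in range(8)]

def forward_loop(m):
    # Run a few forward passes so the inserted quantizers can calibrate.
    with torch.no_grad():
        for batch in calib_data:
            m(batch)

# INT8_DEFAULT_CFG is one of the predefined quantization configs shipped with
# ModelOpt (assumed here for illustration; pick the config that fits your target).
model = mtq.quantize(model, mtq.INT8_DEFAULT_CFG, forward_loop)
```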
That's great. I will give it a try.
@XA23i Can you share which models you'd like to quantize that are not on the support list?
Hi, I see that we can quantize our model with modelopt.torch.quantization.quantize(model, ...). I am wondering which models are supported. Does any PyTorch model work?