gorjanradevski / multimodal-distillation

Codebase for "Multimodal Distillation for Egocentric Action Recognition" (ICCV 2023)
MIT License

Performance after quantization? #8

Open beitong95 opened 1 month ago

beitong95 commented 1 month ago

I was wondering if you tried to quantize the model to int8, since you mention efficiency in the paper. I would like to run the model on an edge device where only an int8 accelerator is available.
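
For reference, this is roughly the kind of post-training static quantization I have in mind (a minimal sketch using PyTorch's eager-mode quantization API; the `TinyStudent` module and the random calibration clips are illustrative placeholders, not the actual architecture or data from this repo):

```python
import torch
import torch.nn as nn
import torch.ao.quantization as tq

# Illustrative stand-in for a distilled student network; NOT the repo's actual model.
class TinyStudent(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.quant = tq.QuantStub()      # entry point: float -> int8
        self.features = nn.Sequential(
            nn.Conv2d(3, 8, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(8, num_classes)
        self.dequant = tq.DeQuantStub()  # exit point: int8 -> float logits

    def forward(self, x):
        x = self.quant(x)
        x = self.features(x)
        x = self.head(torch.flatten(x, 1))
        return self.dequant(x)

model = TinyStudent().eval()
# "fbgemm" targets x86; "qnnpack" would be the choice for ARM edge devices.
model.qconfig = tq.get_default_qconfig("fbgemm")
prepared = tq.prepare(model)

# Calibration pass: observers collect activation ranges to pick int8 scales.
with torch.no_grad():
    for _ in range(8):
        prepared(torch.randn(1, 3, 224, 224))

model_int8 = tq.convert(prepared)  # weights and activations are now int8
print(model_int8)
```

Static (rather than dynamic) quantization seems necessary in my case, since the target accelerator needs int8 activations as well as int8 weights. Did you measure the accuracy drop under any such scheme?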