bytedance / MRECG


Is it possible to quantize the model with W1AX settings (e.g., W1A1, W1A2, W1A4, W1A6)? #12

Open padeirocarlos opened 2 months ago

padeirocarlos commented 2 months ago

I would like to know whether the method you propose can handle low bit-width quantization such as W1A1, W1A2, W1A4, W1A6, etc.

I have tried changing the settings to W1A2, W1A4, and W1A6, but none of them worked. Have you tried these configurations? If so, what were the results? If not, what results would you expect theoretically?
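
For context, here is a minimal sketch of what a WxAy setting means in terms of uniform fake quantization, i.e., 1-bit weights combined with 2-bit activations for W1A2. This is only an illustration of the bit-width notation, not the quantizer actually used in this repository; the function name `fake_quantize` and the tensor shapes are made up for the example.

```python
import torch

def fake_quantize(x: torch.Tensor, n_bits: int) -> torch.Tensor:
    """Uniform min-max fake quantization to n_bits, kept in float for simulation."""
    qmin, qmax = 0, 2 ** n_bits - 1
    x_min, x_max = x.min(), x.max()
    scale = (x_max - x_min).clamp(min=1e-8) / (qmax - qmin)
    zero_point = qmin - torch.round(x_min / scale)
    q = torch.clamp(torch.round(x / scale) + zero_point, qmin, qmax)
    return (q - zero_point) * scale

# W1A2: 1-bit weights (2 levels) and 2-bit activations (4 levels).
w = torch.randn(64, 64)
a = torch.relu(torch.randn(8, 64))
w_q = fake_quantize(w, n_bits=1)   # at most 2 distinct values remain
a_q = fake_quantize(a, n_bits=2)   # at most 4 distinct values remain
print(w_q.unique().numel(), a_q.unique().numel())
```

With only two weight levels, almost all of the weight distribution's precision is lost, which is why 1-bit weight settings are usually much harder for post-training quantization than the W2/W4 settings reported in the paper.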