For the Vitis backend, if the current layer is a conv, a fake (quantized) convolution has already been applied to it. Why is an act quantization node still added after the current layer? Isn't that redundant? What is the point?
QConv generally only performs fake quant on the weights; fake quant of the input activations requires inserting an extra node.
Could you explain in more detail? I don't quite understand.
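A minimal PyTorch sketch of the idea (the `FakeQuantize` and `QConv2d` classes below are illustrative stand-ins with a fixed hypothetical scale, not the actual library implementation): QConv fake-quantizes only its weight tensor, so its float output is unconstrained, and the separately inserted act quantization node is what simulates the quantized activation tensor.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FakeQuantize(nn.Module):
    """Quantize-then-dequantize: keeps the tensor in float but snaps
    its values onto an int8 grid (hypothetical fixed scale)."""
    def __init__(self, scale=0.1):
        super().__init__()
        self.scale = scale

    def forward(self, x):
        q = torch.clamp(torch.round(x / self.scale), -128, 127)
        return q * self.scale

class QConv2d(nn.Module):
    """A QConv-style layer: only the WEIGHTS are fake-quantized."""
    def __init__(self, in_ch, out_ch, k):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, k)
        self.weight_fq = FakeQuantize()

    def forward(self, x):
        w = self.weight_fq(self.conv.weight)   # weight fake quant only
        return F.conv2d(x, w, self.conv.bias)  # output is still arbitrary float

# The conv output can take arbitrary float values, so an extra node is
# inserted after the layer to fake-quantize the ACTIVATIONS as well:
model = nn.Sequential(
    QConv2d(3, 8, 3),
    FakeQuantize(),  # the "act quantization" node from the question
)

x = torch.randn(1, 3, 16, 16)
y = model(x)  # y now lies on the int8 grid, as the deployed int8 tensor would
```

The weight quantizer and the activation quantizer simulate two different tensors, so nothing is duplicated; the inserted node also serves as the input fake quant for whatever layer consumes the conv's output, which is why it lives in the graph as a separate node rather than inside QConv.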