666DZY666 / micronet

micronet, a model compression and deployment library. Compression: 1. quantization: quantization-aware training (QAT), high-bit (>2b) (DoReFa / "Quantization and Training of Neural Networks for Efficient Integer-Arithmetic-Only Inference"), low-bit (≤2b) / ternary and binary (TWN/BNN/XNOR-Net); post-training quantization (PTQ), 8-bit (TensorRT); 2. pruning: normal, regular, and group convolutional channel pruning; 3. group convolution structure; 4. batch-normalization fuse for quantization. Deploy: TensorRT, fp32/fp16/int8 (PTQ calibration), op-adapt (upsample), dynamic_shape
MIT License

ModuleNotFoundError: No module named 'util_wt_bab' #24

Open nce3xin opened 4 years ago

nce3xin commented 4 years ago

Hello! First of all, thanks for sharing. When using the binary quantization model, I get an error that util_wt_bab cannot be found. It does not seem to be a third-party package, and I could not find the file in the directory structure either. How should this be resolved? Thanks!

nce3xin commented 4 years ago

I missed it just now: the file does exist, but following the method in the usage instructions still raises ModuleNotFoundError: No module named 'util_wt_bab'.

ghost commented 4 years ago

I ran into a similar problem when doing 8-bit quantization in quantization/WqAq/dorefa: ModuleNotFoundError: No module named 'util_wqaq'. Have you solved it?

yulei1234 commented 4 years ago

from models.util_wqaq import Conv2d_Q
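
For context, a minimal sketch of the layout under which that import resolves; the directory and file names below are assumptions inferred from the paths mentioned in this thread, and only the import line itself comes from the comment above.

```python
# Assumed layout (not confirmed in this thread):
#   quantization/WqAq/dorefa/
#   |-- main.py            <- training script, run from this directory
#   `-- models/
#       `-- util_wqaq.py   <- defines Conv2d_Q
#
# With util_wqaq.py inside the models/ package, the package-qualified import
# works, while the bare `from util_wqaq import Conv2d_Q` raises
# ModuleNotFoundError because the models/ directory itself is not on
# sys.path, only its parent directory is.
from models.util_wqaq import Conv2d_Q
```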

nce3xin commented 4 years ago

> I ran into a similar problem when doing 8-bit quantization in quantization/WqAq/dorefa: ModuleNotFoundError: No module named 'util_wqaq'. Have you solved it?

Not yet…

ghost commented 4 years ago

I've already solved it: I copied the entire referenced module into the model's code, and then it ran fine.
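
Copying the module body in works, but a lighter alternative is to put the directory that actually contains util_wqaq.py (or util_wt_bab.py) on sys.path before importing it. A sketch, assuming the file lives in a models/ subdirectory next to the script being run:

```python
import os
import sys

# Hypothetical snippet for the top of the training script: make the directory
# containing util_wqaq.py importable instead of duplicating its code.
_THIS_DIR = os.path.dirname(os.path.abspath(__file__))
sys.path.insert(0, os.path.join(_THIS_DIR, "models"))

# After that, the bare import used by the original scripts resolves as well.
from util_wqaq import Conv2d_Q  # noqa: E402
```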

666DZY666 commented 4 years ago

This has been fixed; please try again.