afonso-sousa opened 4 years ago
Hello. First of all, congratulations on your work. I would like to reproduce your work, but I am facing a strange problem. When trying to use your DSQConv layer, I get the following error: "torch.nn.modules.module.ModuleAttributeError: 'DSQConv' object has no attribute 'running_lw'". Replacing register_buffer with a simple attribute assignment did not fix it. Can you reproduce the problem or help me in any way?
I would also like to ask how I can store the models in an encoded way, so that I can compare the storage savings of lower bit-width solutions.
Thank you in advance.
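To clarify what I mean by comparing storage savings: something like packing the k-bit quantized codes into bytes and comparing that against float32 storage. A rough stdlib-only sketch of the idea (the function names are my own, not from the repo):

```python
import struct

def pack_2bit(codes):
    """Pack a list of 2-bit integer codes (0..3) into bytes, 4 codes per byte."""
    out = bytearray()
    for i in range(0, len(codes), 4):
        b = 0
        for j, c in enumerate(codes[i:i + 4]):
            # Place each 2-bit code at its offset within the byte.
            b |= (c & 0b11) << (2 * j)
        out.append(b)
    return bytes(out)

def float32_size(n):
    """Size in bytes if the same n weights were stored as float32."""
    return n * struct.calcsize("f")

codes = [3, 0, 1, 2] * 256          # 1024 fake 2-bit weight codes
packed = pack_2bit(codes)           # 256 bytes
baseline = float32_size(len(codes)) # 4096 bytes, i.e. a 16x reduction
```

Is this roughly how you measured the storage numbers, or did you use a different encoding?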
Did you figure out a solution? I'm having the same issue.
Hi, I had the same problem as you, and I fixed it. It seems that the versioning of the "PyTransformer" repository has some problems. If you clone the repository directly, the file "/PyTransformer/transformers/quantize.py" will have a length of 146 lines, but in the original repository it should have 217 lines. You can download "PyTransformer" in .zip format and unzip it to use instead.
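For anyone else hitting this: the error just means running_lw was never registered on the module, which is consistent with the truncated quantize.py never reaching the register_buffer call in __init__. A toy, PyTorch-free illustration of why a missing registration surfaces as this AttributeError (simplified; not the real torch.nn.Module code):

```python
class MiniModule:
    """Toy stand-in for torch.nn.Module's buffer mechanics (simplified)."""

    def __init__(self):
        self._buffers = {}

    def register_buffer(self, name, value):
        self._buffers[name] = value

    def __getattr__(self, name):
        # Python calls __getattr__ only after normal lookup fails,
        # so registered buffers are found here.
        buffers = self.__dict__.get("_buffers", {})
        if name in buffers:
            return buffers[name]
        raise AttributeError(
            f"'{type(self).__name__}' object has no attribute '{name}'")

class DSQConvLike(MiniModule):
    """Hypothetical layer: registers running_lw only when asked to."""

    def __init__(self, register=True):
        super().__init__()
        if register:
            # The real DSQConv registers this in __init__; with the
            # truncated file version, this line effectively never runs.
            self.register_buffer("running_lw", 0.0)
```

With register=False, accessing layer.running_lw raises exactly this kind of "object has no attribute 'running_lw'" error, so the stale file version explains the symptom.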