Closed: solitary-1 closed this issue 4 years ago.
Hello,
What the error is saying is, basically, that quantization is not enabled. Are you running training with the following command?
brevitas_bnn_pynq_train --network LFC_1W1A --experiments /path/to/experiments
Alessandro
I followed your steps, but running that command produced an error. Looking forward to your reply.
Fixed by #200.
We have successfully trained the model of "LFC_1W1A" in bnn_pynq example. When we tried to retrieve "int_weight" from the model, we encountered an exception which stopped us from doing so.
Here is the code we used to print out "int_weight":
def forward(self, x):
    for mod in self.features:
        if isinstance(mod, QuantLinear):
            print(mod.int_weight())  # where we encountered the exception
We want to know if there is a problem with our code or our method of getting the quantized weights. Thanks in advance.
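For reference, the access pattern in the question can be sketched in plain Python with stand-in classes. Note that QuantLinear and int_weight() below are hypothetical mocks, not the real brevitas API; the sketch only illustrates the general idea of guarding the integer-weight lookup so a layer whose quantization is not enabled produces a clear, recoverable error instead of stopping the loop:

```python
# Minimal sketch with stand-in classes (NOT the real brevitas.nn.QuantLinear).
# Integer weights are only meaningful when weight quantization is enabled.

class QuantLinear:
    """Hypothetical stand-in for a quantized linear layer."""
    def __init__(self, weights, quant_enabled):
        self.weights = weights
        self.quant_enabled = quant_enabled

    def int_weight(self):
        # Mimics the failure mode from the thread: no quantization, no int weights.
        if not self.quant_enabled:
            raise RuntimeError("quantization is not enabled for this layer")
        return [int(round(w)) for w in self.weights]


def collect_int_weights(modules):
    """Collect integer weights per layer, recording None where quantization is off."""
    result = []
    for mod in modules:
        if isinstance(mod, QuantLinear):
            try:
                result.append(mod.int_weight())
            except RuntimeError:
                result.append(None)  # quantization not enabled for this layer
    return result


features = [QuantLinear([0.6, -1.2], True), QuantLinear([0.3], False)]
print(collect_int_weights(features))  # → [[1, -1], None]
```

The try/except guard is the design point: it lets you see exactly which layers lack quantization instead of the whole forward pass aborting on the first one.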