Closed jmitrevs closed 1 year ago
Before I pushed 7186346 the test results were:
test_binary_cnn.py::test_model2[io_parallel-Vivado] PASSED [ 16%]
test_binary_cnn.py::test_model2[io_stream-Quartus] FAILED [ 33%]
test_binary_cnn.py::test_model2[io_stream-Vivado] FAILED [ 50%]
test_binary_cnn.py::test_model2[io_stream-Vitis] FAILED [ 66%]
test_binary_cnn.py::test_model2[io_parallel-Quartus] FAILED [ 83%]
test_binary_cnn.py::test_model2[io_parallel-Vitis] PASSED [100%]
After, they all fail. So it was not a successful fix.
This is not complete, and the final test has been disabled, but I think the fixes we have here should go in before we make the 0.7.0 RC. The other changes can be added after the RC is built, or later.
The remaining to-dos:
If this is accepted, those to-dos should be moved to an issue.
This fixes a few things:

- `cast` is called with the correct mult_config in CNN cases, in all cases. Previously it was correct in many cases, but not all.
- `cast` was fixed in the Quartus case, since you cannot cast `ac_fixed` to `ac_int` without calling `.to_ac_int()`. One may need to add more templates here, depending on whether the input types are `ac_fixed` or `ac_int`, to cover all the options. (I understand that `ac_int` to `ac_fixed` casting is allowed.) It's worth looking at this in more detail, since we probably do not cover all cases.
- `n_scale_bias` is now used, which is not the same as `n_in` when `n_filt != -1`. The templates have been updated to match the regular batchnorm. (This was also copied to Quartus, where it was missing everywhere.)
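For illustration, here is a minimal self-contained sketch of the conversion pattern the Quartus fix relies on. These are hypothetical stand-in types (`FakeAcFixed`, `FakeAcInt`), not the real AC datatypes; the point is that the fixed-to-integer conversion goes through an explicit `.to_ac_int()`-style call rather than an implicit cast.

```cpp
// Hypothetical stand-in types; the real ones are ac_fixed / ac_int from the
// AC datatypes library used by the Quartus backend.
struct FakeAcInt {
    long long v;  // integer value after conversion
};

struct FakeAcFixed {
    double v;  // stand-in for the fixed-point value

    // Explicit conversion, analogous to ac_fixed::to_ac_int():
    // drops the fractional part (truncation toward zero here).
    FakeAcInt to_ac_int() const {
        return FakeAcInt{static_cast<long long>(v)};
    }
};
```

There is deliberately no implicit conversion from `FakeAcFixed` to `FakeAcInt`, mirroring the compile error one gets when assigning an `ac_fixed` value directly to an `ac_int`; `FakeAcFixed{3.75}.to_ac_int()` yields an integer value of 3.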
Description
As shown by #740, CNNs with binary quantizers don't currently work properly. This PR attempts to fix it.
Type of change
Tests
A pytest is added. However, it still shows errors in streaming.
Checklist
I have run `pre-commit` on the files I edited or added.