Closed SAN4EZ7714 closed 3 years ago
Hello!
I am currently developing my own quantization algorithm and testing it on my private models. I will add a model quantization example soon.
I've seen new commits about quantization. How about a small example?
I have added quantization examples (test_003 and test_010).
Hello!
Is working with INT8 models currently supported?
In the files perf.sh, test.sh, and check.sh, the quantization checks are disabled, as are test_003i and test_009i. If I try to enable them manually, they start requiring the file quant.xml, which is missing from the project.
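For context, the basic arithmetic behind INT8 quantization can be sketched as below. This is a generic illustration of symmetric per-tensor quantization, not this repository's actual pipeline; the function names are my own, and the real implementation (and the missing quant.xml) may differ.

```python
# Minimal sketch of symmetric per-tensor INT8 quantization.
# Hypothetical helper names; not this project's API.

def quantize_int8(values):
    """Map floats to int8 codes using one symmetric scale."""
    max_abs = max(abs(v) for v in values)
    scale = max_abs / 127.0 if max_abs else 1.0
    # Round to nearest integer and clamp into the int8 range.
    codes = [max(-128, min(127, round(v / scale))) for v in values]
    return codes, scale

def dequantize_int8(codes, scale):
    """Recover approximate float values from int8 codes."""
    return [c * scale for c in codes]

weights = [0.5, -1.2, 0.03, 1.27]
codes, scale = quantize_int8(weights)
restored = dequantize_int8(codes, scale)
```

Each dequantized value differs from the original by at most half a quantization step (scale / 2), which is the usual round-trip error bound for this scheme.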