Blair-Johnson opened 4 years ago
My model started functioning closer to its expected behavior after calling img.pix_to_ai()
before passing the image into my model. This isn't explicitly stated in the docs, but I suspect it performs normalization or some other preprocessing behind the scenes. Either way, it seems to be absolutely necessary. My models still perform worse than they do in the tflite interpreter, but calling that image method made a huge improvement.
img.pix_to_ai()
copies the image content into the KPU input buffer. If the image comes from sensor.snapshot(),
this has already been done. If not (e.g. the image was read from a file, or produced by image ops like copy or resize), you have to call pix_to_ai()
before running the KPU.
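For example, a minimal sketch of both cases on MaixPy (the model path, image file, and frame size below are placeholders, and the image size must match what the kmodel expects):

```python
import sensor, image, KPU as kpu

task = kpu.load("/sd/model.kmodel")          # placeholder path

# Case 1: frame comes from sensor.snapshot(); the KPU (AI) buffer is already filled.
sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)            # placeholder; must match the model input size
sensor.run(1)
img = sensor.snapshot()
fmap = kpu.forward(task, img)

# Case 2: image read from a file (or made by copy/resize); refresh the AI buffer first.
img2 = image.Image("/sd/test.jpg")           # placeholder file
img2.pix_to_ai()
fmap2 = kpu.forward(task, img2)

print(fmap2[:])                              # raw output scores as a list
kpu.deinit(task)
```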
Hi, I have a number of TFLite models that I have trained for a specific image classification task, and they run well on the Maix Dock board. I have used ncc to compile them into kmodel files for use with the K210 on my board. My issue is that the models seem to have lost almost all of their classification performance when compiled to kmodels.

The Sipeed devs have made a number of functioning demos, so I'm curious what the best practices are for successfully converting and running custom models on the K210. I'm having difficulty debugging these performance problems, so I was wondering if anyone knows of easy mistakes I may have made during the process.
Overall, the behavior is very similar to when I accidentally pass an image into my model without normalizing it: the model seems heavily biased toward one class, or rapidly flashes back and forth between two classes with almost no change to the input image.
My models are in the kmodel4 format and I'm running the latest firmware on my board.
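One way to narrow down where the accuracy disappears (conversion vs. on-device preprocessing) is to run the same test image through the original tflite model on a PC and compare the scores against what kpu.forward() prints on the board. A rough sketch, assuming a float tflite classifier with a 224x224 RGB input and 0..1 normalization (all of these are assumptions, not from the thread):

```python
import numpy as np
import tensorflow as tf
from PIL import Image

# Load the original (pre-conversion) tflite model on the desktop.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Preprocess exactly the way the model was trained (size and normalization here are guesses).
img = Image.open("test.jpg").resize((224, 224))
x = np.asarray(img, dtype=np.float32)[None, ...] / 255.0

interpreter.set_tensor(inp["index"], x)
interpreter.invoke()
print(interpreter.get_tensor(out["index"]))
# Compare these scores with the list printed from fmap[:] on the K210 for the same image.
```

If the desktop scores look right but the on-device ones don't, the problem is more likely in the on-device preprocessing (or in the calibration images given to ncc for quantization) than in the model itself.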