gillmac13 opened this issue 4 years ago
Hi @gillmac13. It's a good solution and also matches the YOLO Nano design. Just note that the UpSampling2D is unnecessary, since the attention factor is automatically broadcast over the feature map when doing Multiply. I can merge this implementation into the code base. Many thanks for your suggestion~
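For illustration, a minimal check of that broadcasting behaviour (the spatial and channel sizes below are arbitrary examples, not values from the code base):

```python
from tensorflow.keras.layers import Input, Reshape, Multiply
from tensorflow.keras.models import Model

# An (H, W, C) feature map and a per-channel attention vector of shape (C,)
feature_map = Input(shape=(13, 13, 64))
attention = Input(shape=(64,))

# Reshape the attention vector to (1, 1, C); Multiply then broadcasts it
# over the spatial dimensions, so no UpSampling2D is needed.
output = Multiply()([feature_map, Reshape((1, 1, 64))(attention)])

model = Model([feature_map, attention], output)
print(model.output_shape)  # (None, 13, 13, 64)
```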
Hi @david8862,
Part of my experiments involves porting the models to tfjs, since my final application runs in JavaScript. I have found that the "pythonic" Lambda layers cannot be converted by tensorflowjs_converter, because they cannot be interpreted in JavaScript. This is a concern for the two ShuffleNet backbones and the YOLO Nano structure. In the latter case, I have tried to replace the Lambda layer in the FCA block (yolo3_nano.py) with pure Keras layers. This is my tentative FCA block, same as yours, but with an ugly hack (sketched below):
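A minimal sketch of what that looks like, assuming the usual GlobalAveragePooling2D → Dense → Dense squeeze/excite path (the reduction ratio, activations, and function name here are placeholders, not the exact code):

```python
from tensorflow.keras.layers import (GlobalAveragePooling2D, Dense, Reshape,
                                     UpSampling2D, Multiply)

def fca_block_pure_keras(feature_map, reduction_ratio=8):
    """FCA block built only from standard Keras layers (no Lambda),
    so that tensorflowjs_converter can handle it."""
    # Static shapes are assumed to be known (fixed input size)
    height, width, in_channels = (feature_map.shape[1],
                                  feature_map.shape[2],
                                  feature_map.shape[3])

    # Squeeze: global context vector followed by two fully connected layers
    x = GlobalAveragePooling2D()(feature_map)
    x = Dense(in_channels // reduction_ratio, activation='relu')(x)
    x = Dense(in_channels, activation='sigmoid')(x)

    # The "ugly hack": reshape to 1x1xC and upsample back to HxW so both
    # inputs of Multiply have exactly the same shape
    x = Reshape((1, 1, in_channels))(x)
    x = UpSampling2D(size=(height, width))(x)

    # Channel-wise attention
    return Multiply()([feature_map, x])
```

The UpSampling2D step is only there to make the attention tensor's spatial size match the feature map before the Multiply.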
It works well, the models are good, tensorflowjs_converter does the job, and I can load the models in JavaScript. But I'm not sure whether I have respected the purpose of the FCA block. Would you have an idea?
Gilles