majianjia / nnom

A higher-level Neural Network library for microcontrollers.
Apache License 2.0

Concatenate layer breaks my network #132

Open sifourquier opened 3 years ago

sifourquier commented 3 years ago

Hi, I built a convolutional network and trained it. Then I added a concatenate layer that joins the output of an intermediate layer with the last layer (ref https://github.com/majianjia/nnom/discussions/129).

If I convert the "normal" network it works, but once I add the concatenate layer it does not. I compared the two generated weights.h files and they are nearly identical, with the exception of the OUTPUT_RSHIFT and BIAS_LSHIFT values.

I attach the two headers: weights.h works but has no concatenate layer; weights_internal_layer.h has the concatenate layer and does not work.

If I copy the BIAS_LSHIFT and OUTPUT_RSHIFT values from weights.h into weights_internal_layer.h, it works fine.

[optimized_nnom.zip](https://github.com/majianjia/nnom/files/6580358/optimized_nnom.zip)

majianjia commented 3 years ago

I haven't had a chance to look in detail, but there is a constraint on the data fed into concat (as well as all the other merging layers: add, sub, mult): the inputs must have the same Q format. The conversion script tries to unify the Q formats by searching each branch for a layer that can change its Q format. This fails in some cases because a branch has no layer that can change it (no Conv/Dense).

Please see if this is the case.