Closed: kheimpel closed this issue 6 years ago
This is because the length of a single example in our version differs from the slim one: 496 frames vs. 96 (i.e., 4.96 s vs. 0.96 s). The different lengths result in different feature dimensions after the flatten operation. To fix this, change the parameters to `NUM_FRAMES = 96`, `EXAMPLE_WINDOW_SECONDS = 0.96`, and `EXAMPLE_HOP_SECONDS = 0.96`.
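A quick sanity check of the flatten dimensions, as a sketch. It assumes the standard VGGish topology (log-mel input of shape `(NUM_FRAMES, 64)`, four 2x2 max-pool layers, 512 channels before the flatten feeding the first 4096-unit FC layer); the helper `flatten_dim` is hypothetical, not part of the repo:

```python
# Why the FC-layer shapes disagree between the two example lengths.
# Assumes the standard VGGish topology: (NUM_FRAMES, 64) log-mel input,
# four 2x2 max-pool layers (total downsampling by 16 in each axis),
# and 512 channels in the last conv block.

def flatten_dim(num_frames, num_mel_bins=64, pools=4, channels=512):
    """Feature dimension after flattening the last conv block."""
    stride = 2 ** pools  # each 2x2 pool halves both time and frequency
    return (num_frames // stride) * (num_mel_bins // stride) * channels

print(flatten_dim(496))  # 63488 -> dimension for 4.96 s examples
print(flatten_dim(96))   # 12288 -> dimension for 0.96 s examples
```

So a graph built with 496-frame examples expects a `[63488, 4096]` FC kernel, while weights trained on 96-frame examples ship a `[12288, 4096]` one, which is why the assignment fails.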
Thanks a lot - works fine 👍
When using include_top=True, load_weights=True, and your weight file, I get the following error:
ValueError: Dimension 0 in both shapes must be equal, but are 63488 and 12288. Shapes are [63488,4096] and [12288,4096]. for 'Assign_163' (op: 'Assign') with input shapes: [63488,4096], [12288,4096].
Loading the weights without the top layers seems to work fine.
Any ideas?