facebookresearch / VMZ

VMZ: Model Zoo for Video Modeling
Apache License 2.0

extract_features.py produces features that are identical for all videos #31

Closed wenjie710 closed 4 years ago

wenjie710 commented 6 years ago

I am using my own dataset and I built the database as described. However, when I try to extract features with extract_features.py, the results are confusing. Printing the features shows that the softmax layer outputs the same value for every class and every clip:

```
(Pdb) activations
array([[0.01111111, 0.01111111, 0.01111111, ..., 0.01111111],
       [0.01111111, 0.01111111, 0.01111111, ..., 0.01111111],
       [0.01111111, 0.01111111, 0.01111111, ..., 0.01111111],
       [0.01111111, 0.01111111, 0.01111111, ..., 0.01111111]], dtype=float32)
```

Since I use 90 labels, every entry is exactly 1/90 = 0.01111111, so the output looks like an initialization rather than real predictions. I am confused. Could you give me some advice?
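For what it's worth, an all-identical softmax is easy to diagnose: with 90 labels a uniform distribution is exactly 1/90 ≈ 0.01111111, which typically means the final layer still carries its initial weights, i.e. the pretrained model was never actually loaded. A minimal sketch of that check (`looks_uninitialized` is a hypothetical helper, not part of VMZ):

```python
import numpy as np

def looks_uninitialized(activations, num_classes=90, tol=1e-6):
    # A softmax that is (numerically) uniform over num_classes is a strong
    # hint that the classifier weights are still at their initial values.
    return np.allclose(activations, 1.0 / num_classes, atol=tol)

# Example with the values reported above: four clips, 90 classes.
acts = np.full((4, 90), 0.01111111, dtype=np.float32)
print(looks_uninitialized(acts))  # True -> weights likely never loaded
```

If this returns True, the first thing to verify is that the path given to the script's model-loading argument actually points at a valid pretrained checkpoint, and that no "blob not found" warnings were silently skipped during loading.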

dutran commented 5 years ago

@wenjie710 can you try again?