tomrunia / TF_FeatureExtraction

Convenient wrapper for TensorFlow feature extraction from pre-trained models using tf.contrib.slim

Why are the logits the feature? #8

Open Amitayus opened 6 years ago

Amitayus commented 6 years ago

Thank you for sharing this. In your Usage introduction, the Logits layer of Inception-v4 is extracted as the feature using example_feat_extract.py. I wonder why the logits are the feature being extracted?

alexandercameronh commented 6 years ago

@Amitayus I believe the feature vector you choose to extract depends on what you ultimately want to use it for. In this case, the Logits feature vector is effectively a probability distribution over the 1000 possible object classes that can be assigned to the image.

@tomrunia How would one extract a feature vector from somewhere earlier in the network? For example, the final convolutional layer?
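
On the point above that the logits are "effectively a probability distribution": strictly, the logits are unnormalized scores that only become a distribution over the (roughly 1000) ImageNet classes after a softmax. A minimal illustration in plain NumPy, unrelated to this repository's code:

```python
import numpy as np

def softmax(logits):
    """Turn raw logits into a probability distribution over classes."""
    shifted = logits - np.max(logits)  # subtract the max for numerical stability
    exp = np.exp(shifted)
    return exp / np.sum(exp)

# Toy logits for a 5-class problem; Inception-v4 would produce ~1000 values.
logits = np.array([2.0, 1.0, 0.1, -1.0, 0.5])
probs = softmax(logits)
print(probs, probs.sum())  # probabilities that sum to 1.0
```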

tomrunia commented 6 years ago

@alexandercameronh You can just set the command-line parameter --layer_names, with the layer names comma-separated. Of course, the layer names need to correspond to your network architecture. You can use the feature extractor's print_network_summary() method to find out the layer names.
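
For reference, here is a rough sketch of where those layer names come from, using tf.contrib.slim directly rather than this repository's wrapper (it assumes TF 1.x and that the slim model definitions from tensorflow/models are importable as nets.inception). Every key of the end_points dict returned by the network function is a valid layer name, e.g. Mixed_7d (the last convolutional block of Inception-v4) or PreLogitsFlatten, and those are the names passed comma-separated to --layer_names:

```python
import tensorflow as tf
from nets import inception  # slim model definitions from tensorflow/models

# Build Inception-v4 once just to inspect its end points (layer names).
images = tf.placeholder(tf.float32, [None, 299, 299, 3])
with tf.contrib.slim.arg_scope(inception.inception_v4_arg_scope()):
    _, end_points = inception.inception_v4(images, num_classes=1001,
                                           is_training=False)

# Each key is a name you could request as a feature layer.
for name, tensor in end_points.items():
    print(name, tensor.shape)
```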

Amitayus commented 6 years ago

@alexandercameronh To my understanding, if the Logits output is regarded as the feature vector, then there is no need for the classifier to exist at all. A feature vector should be independent of the specific classification task.

bkwwangjie commented 6 years ago

@tomrunia Hello, I want to use the image's feature vector in conjunction with text. Which layers should I extract?

alexandercameronh commented 6 years ago

@bkwwangjie did you figure out the answer to this? If so, which layer(s) did you extract?

Amitayus commented 6 years ago

@alexandercameronh I remove the fully connected layer (regarding it as the classifier) and extract the vector end_points['PreLogitsFlatten'] as the feature of the input.
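
A minimal sketch of that kind of extraction with tf.contrib.slim (not this repository's exact code; it assumes TF 1.x, the slim model definitions from tensorflow/models, and a downloaded inception_v4.ckpt checkpoint). With example_feat_extract.py this presumably corresponds to requesting PreLogitsFlatten via --layer_names:

```python
import numpy as np
import tensorflow as tf
from nets import inception  # slim model definitions from tensorflow/models

images = tf.placeholder(tf.float32, [None, 299, 299, 3])
with tf.contrib.slim.arg_scope(inception.inception_v4_arg_scope()):
    _, end_points = inception.inception_v4(images, num_classes=1001,
                                           is_training=False)

# 1536-dimensional activation taken before the fully connected classifier.
features = end_points['PreLogitsFlatten']

saver = tf.train.Saver()
with tf.Session() as sess:
    saver.restore(sess, 'inception_v4.ckpt')  # checkpoint path is an assumption
    batch = np.zeros([1, 299, 299, 3], dtype=np.float32)  # replace with preprocessed images
    feats = sess.run(features, feed_dict={images: batch})
    print(feats.shape)  # (1, 1536)
```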

zengsn commented 6 years ago

@Amitayus Could you please show the code?