omoindrot / tensorflow-triplet-loss

Implementation of triplet loss in TensorFlow
https://omoindrot.github.io/triplet-loss
MIT License

What does 'embedding_mean_norm' mean? #46

Closed ykk648 closed 5 years ago

ykk648 commented 5 years ago

Could you give me some clues or point me to relevant papers? Thanks.

omoindrot commented 5 years ago

If you are referring to these lines:

    with tf.variable_scope('model'):
        # Compute the embeddings with the model
        embeddings = build_model(is_training, images, params)

    # Average L2 norm over the batch; embeddings has shape [batch_size, embedding_dim]
    embedding_mean_norm = tf.reduce_mean(tf.norm(embeddings, axis=1))
    tf.summary.scalar("embedding_mean_norm", embedding_mean_norm)

The mean embedding norm is simply the average L2 norm of the embeddings: a measure of the "size" of the embeddings (how big they are on average).

I just use this to monitor the size of the embeddings during training, to make sure they don't collapse to 0 or grow too large.
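To make the computation concrete, here is a minimal NumPy sketch of the same quantity (the function name `embedding_mean_norm_np` is my own; the repository computes this with `tf.norm` and `tf.reduce_mean` as shown above):

```python
import numpy as np

def embedding_mean_norm_np(embeddings):
    """Average L2 norm over a batch of embeddings of shape (batch_size, embedding_dim)."""
    # np.linalg.norm(..., axis=1) gives one L2 norm per row, like tf.norm(embeddings, axis=1)
    return float(np.mean(np.linalg.norm(embeddings, axis=1)))

# Toy batch of two 3-d embeddings with norms 5.0 and 2.0
batch = np.array([[3.0, 4.0, 0.0],
                  [0.0, 0.0, 2.0]])
print(embedding_mean_norm_np(batch))  # → 3.5
```

If this value drifts toward 0 during training, the embeddings are collapsing; if it keeps growing, the network may be inflating norms instead of learning useful directions.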

ykk648 commented 5 years ago

Thanks. It seems embedding_mean_norm reflects the degree of ambiguity of a human face, and some people have used it for image quality evaluation.