idiap / attention-sampling

This Python package enables the training and inference of deep learning models for very large data, such as megapixel images, using attention sampling.

What's the softmax temperature? #23

Open AlbertoSinigaglia opened 1 year ago

AlbertoSinigaglia commented 1 year ago

I'm building this from scratch to avoid the additional C++ code, and it seems to be working. However, when I compute the expected feature

$$\bar{f} = \sum_i a_i \, f_i$$

and use that as logits for the softmax, the attention weights "a" on large images tend to be very small, and "f" is normalized, so the scalar product gives logits very close to 0, and since the softmax is not scale invariant, I get close-to-uniform predictions.
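For what it's worth, the effect is easy to reproduce with a toy sketch (the shapes and numbers below are illustrative assumptions, not taken from the library):

```python
import tensorflow as tf

# With many patches the attention weights a_i are tiny, and with
# L2-normalized features f_i the pooled vector sum_i a_i * f_i has a very
# small norm, so using it directly as logits yields a near-uniform softmax.
num_patches, dim = 10_000, 8
a = tf.nn.softmax(tf.random.normal((num_patches,)))           # attention weights, sum to 1
f = tf.math.l2_normalize(tf.random.normal((num_patches, dim)), axis=-1)
pooled = tf.reduce_sum(a[:, None] * f, axis=0)                # expected feature
print(tf.norm(pooled).numpy())                                # typically << 1
print(tf.nn.softmax(pooled).numpy())                          # ~uniform over the 8 classes
```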

Thus, I think you are using some form of temperature, but neither in the code nor in the paper do I see any reference to it. Can I get some clarification?

At the moment, the best I was able to do (to avoid hand-picking the temperature) is to make it trainable:

```python
import tensorflow as tf

class SoftmaxWithTemperature(tf.keras.layers.Layer):
    """Softmax whose logits are rescaled by a learnable temperature."""
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        # Scalar temperature, initialized to 1 so training starts from a plain softmax.
        self.t = self.add_weight("temperature", (1,), initializer=tf.initializers.ones())

    def call(self, inputs):
        return tf.nn.softmax(self.t * inputs)

classification_network = tf.keras.models.Sequential([
    SoftmaxWithTemperature()
])
classification_network(tf.reshape(features * probabilities, ...))
```
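(Since the temperature multiplies the logits, a learned value above 1 sharpens the output distribution, while initializing it at 1 recovers the plain softmax.)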

While training, I see the temperature slowly increase to 1.2, which is still not enough (at least in my case).

AlbertoSinigaglia commented 1 year ago

I ended up normalizing the expected feature before applying the softmax.

With this normalization, no temperature is needed, and it seems to achieve the expected performance.
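A minimal sketch of one way to do this, assuming the normalization is an L2 rescaling of the pooled feature $\sum_i a_i f_i$ (the layer name and the exact normalization are assumptions, not from the original comment):

```python
import tensorflow as tf

class NormalizedExpectedFeature(tf.keras.layers.Layer):
    """Attention-weighted pooling followed by L2 normalization, so the
    pooled feature (and hence the logits) keeps unit scale no matter how
    many patches the attention is spread over."""
    def call(self, features, attention):
        # features: (batch, patches, dim); attention: (batch, patches), rows summing to 1
        pooled = tf.reduce_sum(attention[..., None] * features, axis=1)
        return tf.math.l2_normalize(pooled, axis=-1)
```

With the pooled feature at unit norm, the softmax sees logits of a consistent magnitude regardless of image size, which would explain why no extra temperature is needed afterwards.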