The temperature in the CAT-VAE model is set in this way:
    if batch_idx % self.anneal_interval == 0 and self.training:
        self.temp = np.maximum(self.temp * np.exp(-self.anneal_rate * batch_idx),
                               self.min_temp)
'self.temp' is initialized to 0.5, and 'self.min_temp' is also initialized to 0.5. Since 'np.exp(-self.anneal_rate * batch_idx)' is at most 1 for any non-negative 'batch_idx', the product 'self.temp * np.exp(-self.anneal_rate * batch_idx)' is ALWAYS no larger than 'self.min_temp'. So 'np.maximum' always returns 'self.min_temp', and 'self.temp' ALWAYS stays equal to 'self.min_temp', namely 0.5.
Is something wrong here?
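A minimal script reproducing the update in isolation, with the initial values from the question ('self.anneal_rate' is not given, so the value below is an assumption):

```python
import numpy as np

# Values stated in the question:
temp = 0.5
min_temp = 0.5
# Assumed anneal rate (not given in the question; any positive value behaves the same):
anneal_rate = 3e-5

# Apply the annealing update repeatedly, as the training loop would.
for batch_idx in range(0, 10000, 100):
    # exp(-anneal_rate * batch_idx) <= 1, so temp * exp(...) <= min_temp,
    # and np.maximum always picks min_temp.
    temp = np.maximum(temp * np.exp(-anneal_rate * batch_idx), min_temp)

print(temp)  # stays at 0.5 for every iteration
```

With `temp` initialized equal to `min_temp`, the decay factor can only shrink the product, so the `np.maximum` clamp makes the whole update a no-op.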