ozancaglayan opened this issue 2 months ago

Hello,

I'm experimenting with the set_random_seed() method of CTranslate2 (through faster-whisper). Without temperature-based decoding, the results from the Whisper ASR models are always the same for a given file on both CPU and GPU, i.e. I don't have any other source of stochasticity.

Previously, I was calling set_random_seed() only at model creation. I have now changed the logic to set it to the same value before every generate() call, but I still can't make the sampling deterministic. I tried on both CPU and GPU, and the results differ between runs in both cases. Any idea what I'm missing? Thanks!
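Here is a minimal sketch of the kind of loop I mean (the model size, audio file and decoding options below are placeholders, not my exact setup):

```python
# Re-seed CTranslate2 before every decode and compare the outputs across runs.
import ctranslate2
from faster_whisper import WhisperModel

model = WhisperModel("small", device="cpu", compute_type="int8")

for run in range(3):
    ctranslate2.set_random_seed(42)  # same seed before every call
    segments, _ = model.transcribe("audio.wav", temperature=0.4, beam_size=1)
    text = " ".join(segment.text for segment in segments)
    print(run, text)  # I would expect identical text on every run, but it varies
```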
Hello,

With the temperature set, you are changing the probability distribution by making it either sharper or flatter: the logits are divided by the temperature before the softmax. Even with the same seed, if the distribution is very flat, small floating-point precision differences can be enough to change which token is sampled, which leads to variation in the result. Try setting a temperature < 1 and check whether the result becomes deterministic.
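As a toy illustration (made-up logits, plain NumPy, not CTranslate2's actual sampling code), this is how the temperature reshapes the distribution:

```python
import numpy as np

def softmax_with_temperature(logits, temperature):
    scaled = logits / temperature      # divide the logits by T before the softmax
    scaled = scaled - scaled.max()     # subtract the max for numerical stability
    exp = np.exp(scaled)
    return exp / exp.sum()

logits = np.array([2.0, 1.0, 0.5, 0.2])
for t in (0.2, 1.0, 1.5):
    print(f"T={t}:", np.round(softmax_with_temperature(logits, t), 4))

# T=0.2 puts almost all the mass on the top token, so sampling is effectively
# deterministic; at higher temperatures the probabilities are much closer
# together, and a tiny numerical difference in the logits can be enough for the
# same random draw to select a different token.
```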
Hi,

Thanks! I was actually trying with temperatures of 0.2, 0.4, 0.8, etc.