meta-llama / llama-models

Utilities intended for use with Llama models.

Non Deterministic response generated for same set of config #176

Open aabbhishekksr opened 1 month ago

aabbhishekksr commented 1 month ago

```python
model.config.temperature = 0.0
model.config.do_sample = False
model.config.use_bfloat16 = True
output = model.generate(**inputs, max_new_tokens=2048)
```

I'm setting only these params when instantiating the model, but I still get different results on every iteration. What could be the issue?
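For reference, what usually makes `generate()` repeatable with a Hugging Face `transformers`-style model (a sketch, not a confirmed fix for this issue): pass `do_sample=False` directly to `generate()` rather than only mutating `model.config` (explicit call arguments take precedence over stale config state), seed PyTorch, and note that `temperature` has no effect once sampling is off. Also, bfloat16 on GPU can still vary run-to-run because some CUDA kernels are non-deterministic, so fp32 on CPU is the cleanest baseline. In the snippet below, a tiny randomly initialised GPT-2 is a hypothetical stand-in for the Llama checkpoint so it runs without a download:

```python
import torch
from transformers import GPT2Config, GPT2LMHeadModel

# Tiny randomly initialised model as a stand-in (no checkpoint download).
torch.manual_seed(0)
config = GPT2Config(n_layer=2, n_head=2, n_embd=64, vocab_size=100)
model = GPT2LMHeadModel(config).eval()

inputs = torch.tensor([[1, 2, 3, 4]])  # dummy token ids

def generate_once():
    # Pass decoding flags directly to generate(); with do_sample=False
    # decoding is greedy and temperature is ignored entirely.
    with torch.no_grad():
        return model.generate(
            inputs,
            max_new_tokens=16,
            do_sample=False,
            pad_token_id=0,
        )

out1 = generate_once()
out2 = generate_once()
print(torch.equal(out1, out2))  # greedy fp32 decoding on CPU repeats exactly
```

If outputs still differ with sampling genuinely disabled, the usual suspects are a chat template or tokenizer injecting varying content, or non-deterministic GPU kernels under reduced precision.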