meta-llama / codellama

Inference code for CodeLlama models

Achieving Deterministic Output #228

Open antonkratz opened 2 months ago

antonkratz commented 2 months ago

(I posted a very similar question in the ollama repo. This question is about whether, and how, this can be achieved with the inference code that ships with CodeLlama.)

For a research project, I am interested in exploring the effect of different prompts. The problem is that when I change the prompt even slightly and get a different result, I cannot tell how much of the change in the output is caused by the changed prompt and how much is caused by pseudo-random sampling effects such as top-k, top-p, and temperature.
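To make concrete where the randomness enters, here is a minimal sketch of temperature plus top-p (nucleus) sampling. It is not the repo's exact code, but it mirrors the usual structure; the final `torch.multinomial` draw is the step that consumes pseudo-random state:

```python
import torch

def sample_top_p(logits: torch.Tensor, temperature: float, top_p: float) -> torch.Tensor:
    """Illustrative nucleus sampling; assumes logits of shape (batch, vocab) and temperature > 0."""
    probs = torch.softmax(logits / temperature, dim=-1)
    sorted_probs, sorted_idx = torch.sort(probs, dim=-1, descending=True)
    cumulative = torch.cumsum(sorted_probs, dim=-1)
    # Zero out tokens outside the nucleus (cumulative mass beyond top_p).
    mask = cumulative - sorted_probs > top_p
    sorted_probs[mask] = 0.0
    sorted_probs.div_(sorted_probs.sum(dim=-1, keepdim=True))
    # This draw is the pseudo-random step; its result depends on torch's RNG state.
    next_token = torch.multinomial(sorted_probs, num_samples=1)
    return torch.gather(sorted_idx, -1, next_token)
```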

Is it possible, in principle, to get deterministic output? And is it possible in practice with the inference code provided with CodeLlama?
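For instance, if the generation code falls back to greedy argmax decoding when the temperature is 0 (as the example scripts in this repo appear to), something like the following should be fully deterministic. This is a sketch following `example_completion.py`; the checkpoint paths and parameter values are placeholders, and the script is normally launched via torchrun:

```python
from llama import Llama

# Placeholder paths: adjust to your local checkpoints.
generator = Llama.build(
    ckpt_dir="CodeLlama-7b/",
    tokenizer_path="CodeLlama-7b/tokenizer.model",
    max_seq_len=512,
    max_batch_size=1,
)
results = generator.text_completion(
    ["def fibonacci(n):"],
    max_gen_len=128,
    temperature=0.0,  # temperature == 0 should take the argmax branch: no sampling at all
    top_p=0.95,       # ignored when no sampling happens
)
print(results[0]["generation"])
```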

Basically, I want the same prompt to always generate the same output, at any temperature. There can and should be pseudo-randomness, but it must be possible for me to fix the seed, so that the only changes in the output are those caused by the prompt. Is that possible with CodeLlama?
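If sampling at temperature > 0 is required, my understanding is that re-seeding torch's global RNG immediately before each call should make the draws reproducible. A sketch under that assumption, reusing the `generator` from the sketch above:

```python
import torch

# Re-seed immediately before every generation call so the RNG state is
# identical each time; assumes all sampling randomness comes from torch.
torch.manual_seed(42)
torch.cuda.manual_seed_all(42)

results = generator.text_completion(
    ["def fibonacci(n):"],
    max_gen_len=128,
    temperature=0.7,
    top_p=0.95,
)
# Caveat: bitwise reproducibility can still be broken by non-deterministic
# CUDA kernels; torch.use_deterministic_algorithms(True) narrows that down
# at some performance cost.
```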