Open ayoubachak opened 4 months ago
I'm using v0.2.16, btw.
Hi @ayoubachak are you able to try again with 0.2.23? Available on https://lmstudio.ai
Hello, thanks a lot for the quick response. I just upgraded to the latest version (0.2.23), and now both tests give different results XD:

using 1715852364:

using 42:

using 42:

Seed 42 gave me the same results before, but now it doesn't.
I'm working on a project where I need to track the seed used in each generation, so that I can reproduce the output when needed with the same config (and the same seed). However, I find that this isn't always the case.
I tried using the seed 42, and it gave me the exact same result each time with the same config. When I tried a larger number, 1715852364 (which I usually get from the epoch time), I found that it gives different results.

Output with seed 42 (exactly the same twice):

Output with seed 1715852364 (completely different):

Here is the code I used to reproduce this bug (which is part of my project):
lm_studio.py:

base.py:

configs/lm_studio.config.json:
I only know that LM Studio uses llama.cpp, but I'm not sure whether this has to do with the size of the seed. If so, what is the maximum integer below which the same seed will always give the same results?
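If the hunch is that the backend stores seeds in a 32-bit field (an assumption; the issue does not confirm how llama.cpp or LM Studio store the seed), it is easy to check whether a given seed fits in that range and to mask epoch-derived seeds down to it. Note that both 42 and 1715852364 already fit in an unsigned 32-bit integer, so a 32-bit limit alone would not distinguish the two cases here:

```python
UINT32_MAX = 0xFFFFFFFF  # 4294967295


def fits_in_uint32(seed: int) -> bool:
    """True if the seed fits in an unsigned 32-bit integer."""
    return 0 <= seed <= UINT32_MAX


def clamp_seed(seed: int) -> int:
    """Mask to the low 32 bits, matching a hypothetical uint32 seed field."""
    return seed & UINT32_MAX


assert fits_in_uint32(42)
assert fits_in_uint32(1715852364)  # the epoch seed from the issue fits too
assert clamp_seed(2**32 + 5) == 5  # wider seeds would silently wrap
```

Clamping seeds before sending them at least guarantees the value you log is the value a 32-bit backend would actually use, whatever the real cause of the nondeterminism turns out to be.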