Dranero opened this issue 3 years ago
The fault isn't in NumPy.
import numpy as np

np.random.seed(1)
for i in range(2000):
    print(np.random.randint(1, 10000))
This works correctly.
The pure set_random from NumPy gets influenced by something, as the seeds in the example above differ completely from the seeds in an actual training.
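Whether this is the cause here is unclear, but one common way the global NumPy RNG gets "influenced" is that any other code (a library, the environment, a helper) also draws from np.random between the seeded calls, which shifts the sequence. A minimal sketch of that effect, plus an isolated generator that avoids it (not code from this repo):

import numpy as np

# Global RNG: any other code drawing from np.random advances the shared state.
np.random.seed(1)
np.random.random()                   # e.g. some library call consuming the global state
print(np.random.randint(1, 10000))   # no longer the first value of the seeded sequence

# Isolated RNG: only code holding this Generator can advance it.
rng = np.random.default_rng(1)
print(rng.integers(1, 10000))        # reproducible regardless of other np.random usage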
What do you mean by set_random from NumPy? np.random.seed(seed) in set_random_seeds() from helper.py?
And what do you mean by "seeds in an actual training"? Isn't that example from an actual training on the workstation?
Are you sure the problem is caused by the seeds or could it be a problem with the memory environment/CTRNN/training itself?
I mean that the seeds generated by the code snippet are not the seeds used by the environment, so I couldn't map the seed generation strategies. I think there is another spot where seeds are generated, but I couldn't find it.
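One way to find that other spot is to temporarily wrap np.random.seed so that every call prints its seed and call site. This is only a debugging sketch, not part of the repo:

import traceback
import numpy as np

_original_seed = np.random.seed

def logging_seed(seed=None):
    # Print the seed and where the call came from, then delegate to the real function.
    print(f"np.random.seed({seed}) called from:")
    traceback.print_stack(limit=5)
    _original_seed(seed)

np.random.seed = logging_seed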
I implemented a seed option that uses the exact same seed in every generation to investigate this further, but the branch isn't tested enough yet, because I underestimated another change I made in that branch. So this will need some time, but I'm on it :)
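A minimal sketch of that idea, with hypothetical names (the actual branch may look different):

import numpy as np

def generation_seed(rng: np.random.Generator, fixed_seed=None):
    # Hypothetical helper: either reuse one fixed seed in every generation,
    # or draw a fresh seed from a dedicated RNG as before.
    if fixed_seed is not None:
        return fixed_seed                      # identical seed in every generation
    return int(rng.integers(0, 2**31 - 1))     # fresh seed per generation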
The seed repeats itself. A good example is https://raw.githubusercontent.com/neuroevolution-ai/CTRNN_Simulation_Results/master/evaluation/bamemory/ctrnn/2021-02-23_05-18-06/Log.txt
Every 36 generations the fitness repeats itself.
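A quick way to verify such a period, assuming the per-generation fitness values have already been extracted from Log.txt into a list (the extraction itself is not shown, since the exact log format is an assumption):

def find_period(values, max_period=100):
    # Returns the smallest shift p such that the sequence equals itself shifted by p.
    for p in range(1, min(max_period, len(values) // 2) + 1):
        if all(a == b for a, b in zip(values, values[p:])):
            return p
    return None

# print(find_period(fitnesses))  # expected to print 36 if the pattern really repeats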