codeplea / genann

simple neural network library in ANSI C
https://codeplea.com/genann
zlib License

Different results #59

Closed lazy-dude closed 8 months ago

lazy-dude commented 8 months ago

Hi,

I am using genann in a project. With the same training data, I get a different result after each run. Is there a way to get roughly the same result on each run?

ujh commented 8 months ago

How do you handle the seeding of the random number generator? I guess that would be the main determining factor.

lazy-dude commented 8 months ago

I call srand(time(NULL)) in my training function. There is no particular seeding while running the net.

ujh commented 8 months ago

That means the seed is different every time you run it, so the results will vary. If you just pick a fixed number and use the same one every time, you should get the same results.

ujh commented 8 months ago

Not sure I'd recommend that (and that PRNG in particular) as that really isn't very random, but if it does the job then that should be good enough.

lazy-dude commented 8 months ago

I tried a simple run of genann over a small number of samples. With a fresh compile I get repeatable results, so maybe the flaw is somewhere else.

codeplea commented 8 months ago

If you want the same results, do srand(0) at the beginning of your program.

By using srand(time(NULL)), you are basing the seed on the time at which you run the program. So you'll only get exactly the same results if that line happens to execute at exactly the same time.
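
For example, a minimal sketch with a fixed seed (the tiny XOR-style setup is only an illustration, not code from this thread):

    #include <stdio.h>
    #include <stdlib.h>
    #include "genann.h"

    int main(void) {
        /* Fixed seed: genann's default GENANN_RANDOM() is built on rand(),
         * so the initial weights, and therefore the training result,
         * are the same on every run. */
        srand(0);

        genann *ann = genann_init(2, 1, 2, 1);   /* illustrative topology */

        const double in[4][2] = {{0,0}, {0,1}, {1,0}, {1,1}};
        const double out[4] = {0, 1, 1, 0};
        for (int i = 0; i < 500; ++i)
            for (int j = 0; j < 4; ++j)
                genann_train(ann, in[j], out + j, 3.0);

        printf("%f\n", *genann_run(ann, in[1])); /* identical value on every run */
        genann_free(ann);
        return 0;
    }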

You can also change the randomization function here: https://github.com/codeplea/genann/blob/4f72209510c9792131bd8c4b0347272b088cfa80/genann.h#L39
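
As a rough sketch of plugging in your own deterministic generator (the xorshift64 function below is just an illustrative stand-in; since genann.c picks up GENANN_RANDOM from genann.h, the redefinition and a prototype have to be visible there, or be supplied as a compiler define when building genann.c):

    #include <stdint.h>

    /* Hypothetical replacement generator: xorshift64 with a fixed, nonzero
     * seed, so the initial weights are the same on every run.
     * Returns a double in [0, 1). */
    static uint64_t my_rng_state = 0x9E3779B97F4A7C15ULL;

    double my_uniform(void) {
        my_rng_state ^= my_rng_state << 13;
        my_rng_state ^= my_rng_state >> 7;
        my_rng_state ^= my_rng_state << 17;
        return (double)(my_rng_state >> 11) / (double)(1ULL << 53);
    }

    /* The default in genann.h is guarded with #ifndef GENANN_RANDOM, so it
     * can be overridden there (or via a -D flag when compiling genann.c)
     * with something like:
     *     #define GENANN_RANDOM() my_uniform()
     */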

Finally, you could set the initial weights manually to whatever values you want, and avoid any attempt at randomness altogether. See https://github.com/codeplea/genann/blob/4f72209510c9792131bd8c4b0347272b088cfa80/genann.c#L195
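
A sketch of that last option, writing fixed weights directly after genann_init() (the field names total_weights and weight come from the genann struct; the values themselves are arbitrary):

    #include "genann.h"

    /* Overwrite the randomized weights from genann_init() with fixed values,
     * removing randomness from initialization entirely. */
    void set_fixed_weights(genann *ann) {
        for (int i = 0; i < ann->total_weights; ++i) {
            /* any deterministic scheme will do; small alternating values here */
            ann->weight[i] = (i % 2 ? 1.0 : -1.0) * 0.1;
        }
    }

Called right after genann_init() and before training, this gives the same starting point on every run.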

codeplea commented 8 months ago

It wasn't totally clear to me whether you were talking about training or running. Loading a pre-trained net from a file and running it on the same input should always produce exactly the same output. If that's not happening, post a simple example that shows it; otherwise, I would assume you have a bug in your program.
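
For reference, a minimal sketch of the load-and-run path, which involves no randomness at all (the file name and inputs are made up, and the saved net is assumed to have two inputs):

    #include <stdio.h>
    #include "genann.h"

    int main(void) {
        /* Load a net previously saved with genann_write(). */
        FILE *f = fopen("net.txt", "r");
        if (!f) return 1;
        genann *ann = genann_read(f);
        fclose(f);

        /* Same net, same input: this prints the same value every time. */
        const double input[2] = {0.0, 1.0};
        printf("%f\n", *genann_run(ann, input));

        genann_free(ann);
        return 0;
    }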