steckdenis / nnetcpp

Flexible and fast neural network library (feed-forward, RNN, GRU, LSTM, etc)

tests assertion failures #2

Open lonnietc opened 8 years ago

lonnietc commented 8 years ago

Greetings,

I have come across your nnetcpp library and was investigating its use recently.

I was able to compile the code and then ran "./test all", which resulted in assertion failures:


. . . Final MSE: 0.273338

test_merge.cpp:62:Assertion
Test name: TestMerge::testSum
assertion failed

test_recurrent.cpp:110:Assertion
Test name: TestRecurrent::testLSTM
assertion failed

Failures !!!

Run: 8   Failure total: 2   Failures: 2   Errors: 0

Any ideas on why?

Cheers, Lonnie

steckdenis commented 8 years ago

Hi,

Unit testing a neural network library is quite difficult. What I do is train a small neural network (built from the different layers implemented in the library) and then check that its error falls below a threshold. I've been quite aggressive with those thresholds.
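For readers unfamiliar with this style of testing, here is a minimal standalone sketch of the pattern: train a tiny model, compute its final MSE, and assert that it falls below an aggressive threshold. It deliberately avoids nnetcpp's own classes; the single-weight linear model and the 1e-3 threshold are stand-ins chosen for illustration, not the actual test code.

```cpp
// Sketch of the "train a tiny model, then assert its error is below a
// threshold" test pattern. The model is a single weight w fitted so that
// w * x approximates 2 * x; it stands in for a small nnetcpp network.
#include <cassert>
#include <cstdio>
#include <random>

int main()
{
    std::mt19937 rng(42);                       // fixed seed for this sketch
    std::uniform_real_distribution<double> init(-0.1, 0.1);

    double w = init(rng);                       // random initial weight
    const double lr = 1e-2;                     // learning rate

    // Train w so that w * x approximates the target 2 * x.
    for (int epoch = 0; epoch < 1000; ++epoch) {
        for (double x = -1.0; x <= 1.0; x += 0.1) {
            double error = w * x - 2.0 * x;
            w -= lr * 2.0 * error * x;          // gradient of the squared error
        }
    }

    // Compute the final mean squared error over the same points.
    double mse = 0.0;
    int n = 0;
    for (double x = -1.0; x <= 1.0; x += 0.1, ++n) {
        double error = w * x - 2.0 * x;
        mse += error * error;
    }
    mse /= n;

    std::printf("Final MSE: %f\n", mse);
    assert(mse < 1e-3);                         // aggressive threshold, as in the tests
    return 0;
}
```

With a different random initial weight (or fewer training iterations), the final MSE can land above the threshold even though nothing is broken, which is exactly the kind of failure described in this issue.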

I've run the tests on my computer, and they pass when run separately ("tests recurrent|merge|perceptron"). If I run "tests all", some tests fail (always the same ones, but a different set than yours). This is probably because the solution found by the neural network is non-deterministic and depends on the libstdc++ random number generator (for weight initialization, for instance). Your RNG may be different from mine, and the randomness of one test can influence the others. The library should still be completely functional.
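To illustrate why results can differ between machines even with identical code, here is a small standalone example: the raw std::mt19937 stream is reproducible for a fixed seed, but the distributions layered on top of it are not specified by the standard, so two standard library implementations can still produce different initial weights. This is not nnetcpp's initialization code, just the general mechanism.

```cpp
// Illustration of reproducible vs non-reproducible weight initialization.
#include <cstdio>
#include <random>

// Helper for this sketch: draw a few "initial weights" from a given engine.
void print_weights(std::mt19937 &rng, const char *label)
{
    // Note: the engine's raw output is identical everywhere for a given seed,
    // but uniform_real_distribution's algorithm is implementation-defined,
    // so the printed values may differ between standard libraries.
    std::uniform_real_distribution<double> dist(-0.1, 0.1);

    std::printf("%s:", label);
    for (int i = 0; i < 4; ++i) {
        std::printf(" % .4f", dist(rng));
    }
    std::printf("\n");
}

int main()
{
    std::mt19937 fixed_a(12345), fixed_b(12345);   // same seed -> same sequence
    print_weights(fixed_a, "fixed seed, run A");
    print_weights(fixed_b, "fixed seed, run B");

    std::mt19937 random_seeded(std::random_device{}());  // differs every run
    print_weights(random_seeded, "random seed      ");

    return 0;
}
```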

If you use this library and have a network that cannot learn, note that nnetcpp is quite sensitive to its parameters: a learning rate that is too high (it should be around 1e-4 or 1e-5), too many or too few neurons, or the wrong activation functions (tanh works very well). Usually, decreasing the learning rate and giving the network time is enough for it to learn (by "time" I mean hundreds of thousands of iterations in some cases).
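As a toy illustration of the learning-rate point (not based on nnetcpp itself), the sketch below runs plain gradient descent on f(w) = (w - 3)^2: with a step size that is too large the iterates blow up, while a step size of 1e-4 converges but needs many iterations, in the spirit of the "hundreds of thousands of iterations" remark above.

```cpp
// Toy demonstration of learning-rate sensitivity in gradient descent.
#include <cstdio>

// Run gradient descent on f(w) = (w - 3)^2 and return the final w.
double train(double lr, int iterations)
{
    double w = 0.0;
    for (int i = 0; i < iterations; ++i) {
        double grad = 2.0 * (w - 3.0);   // f'(w)
        w -= lr * grad;
    }
    return w;
}

int main()
{
    // Too large: each step overshoots the minimum and the iterates diverge.
    std::printf("lr = 1.5,  100 iterations:    w = %g\n", train(1.5, 100));

    // Small enough: converges to the optimum w = 3, but needs many iterations.
    std::printf("lr = 1e-4, 100000 iterations: w = %g\n", train(1e-4, 100000));

    return 0;
}
```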

Best regards, Denis

hailiang-wang commented 6 years ago

I ran into the same error too.