CompPhysVienna / n2p2

n2p2 - A Neural Network Potential Package
https://compphysvienna.github.io/n2p2/
GNU General Public License v3.0
214 stars 81 forks

Training and test results are too large at 0th epoch #156

Closed otayfuroglu closed 2 years ago

otayfuroglu commented 2 years ago

Hi, I just started using n2p2. As you can see below, the initial prediction errors (train and test) in the output are quite large:


energy   ep  E_count  E_train      E_test       E_pt
ENERGY    0        0  2.68504E+03  2.63123E+03   0.0
ENERGY    1      282  1.61192E+00  1.62651E+00  16.6

force    ep  F_count  F_train      F_test       F_pt
FORCE     0        0  3.03385E+03  3.05488E+03   0.0
FORCE     1     1171  3.59105E+00  2.96717E+00  83.4

timing   ep  count  train  error  other   epoch   total
TIMING    0      0    0.0   91.9    8.1    4.32    4.32
TIMING    1   1453   99.5    0.4    0.1  313.37  317.69

...

Even though I have scaled the symmetry functions and normalized the related data, I don't understand why the errors are so large at the 0th epoch. Do you think this is normal?

I have attached the relevant files: learning-curve.out.txt and input.nn.txt

Thanks in advance. Best regards, Omer

Kyvala commented 2 years ago

Hi Omer,

The absolute value of the error depends on your units. Large errors at the 0th epoch are expected, because at that point the weights are still randomly initialized and the network's predictions are essentially random. You should only look at the errors after training, not those at the 0th epoch.
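To illustrate the point (this is only a toy sketch with NumPy, not the n2p2 architecture or its actual training code): a model with randomly initialized weights produces an RMSE on the order of the spread of its random outputs, and even a single fitting step brings it down dramatically.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy targets: reference values on an arbitrary scale
# (stand-ins for reference energies).
y_ref = rng.normal(loc=0.0, scale=1.0, size=200)

# Hypothetical linear "network": random descriptors (stand-ins for
# symmetry-function vectors) and randomly initialized weights.
x = rng.normal(size=(200, 30))
w_random = rng.normal(size=30)

def rmse(a, b):
    """Root-mean-square error between two arrays."""
    return float(np.sqrt(np.mean((a - b) ** 2)))

# "Epoch 0": predictions from the untrained (random) weights.
rmse_epoch0 = rmse(x @ w_random, y_ref)

# One least-squares fit plays the role of training here.
w_fit, *_ = np.linalg.lstsq(x, y_ref, rcond=None)
rmse_trained = rmse(x @ w_fit, y_ref)

print(f"RMSE at epoch 0 (random weights): {rmse_epoch0:.3f}")
print(f"RMSE after fitting:              {rmse_trained:.3f}")
```

The random-weight RMSE is large simply because the predictions are uncorrelated with the targets, which is exactly the situation in the 0th-epoch line of learning-curve.out.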

All the best, Lukas

otayfuroglu commented 2 years ago

Hi Lukas, thank you for the kind reply and guidance.

Best regards, Omer