Closed AmitMY closed 7 years ago
This is a general NN question, I think.
That's simple: if the expected output is 1 and the actual output is 0, the error (before the loss function) will be 1. If the actual output is 0 and the expected output is 10, the error will be 10.
For multiple outputs, the mean (average) of the errors is taken, and the same goes for multiple runs. So if you expect `[0, 10]` as the output and receive `[5, 5]`, the error will be 5.
The only thing I might be missing is whether the output is the raw error or the loss. If it is the loss, then a loss function (RMSE or LogLoss, depending on the use case) is applied. Still, the error represents an error, nothing more and nothing less; it is not a percentage or anything like that.
Ok, thanks! So it is not a percentage of failures as I thought, but a different evaluation.
As I understand it, the `trainer.test` method returns an error, which represents the respective error of the network. So, for example, if I see 0.004998819355993572, I assume a 0.4% error.
However, I just trained a network and got an error of 2.5, which by the same logic would be a 250% error.
So what does the number represent, if not a rate?