Neuroglycerin / neukrill-net-work

NDSB competition repository for scripting, note taking and writing submissions.
MIT License

Which is our log loss estimate? #28

Closed gngdb closed 9 years ago

gngdb commented 9 years ago

We weren't sure the other day when looking at the monitoring logs from training a convnet, but y_objective and y_nll looked pretty similar. The NLL should be equal to the log loss, but then what is the objective? Training the alexnet_based.yaml network with simple 48x48 resizing for preprocessing gave the following final monitoring logs:

```
valid_objective : 2.19527244568
valid_term_0 : 1.9964928627
valid_term_1_weight_decay : 0.197501763701
valid_y_col_norms_max : 1.93658196926
valid_y_col_norms_mean : 1.88180267811
valid_y_col_norms_min : 1.71785604954
valid_y_max_max_class : 0.998774349689
valid_y_mean_max_class : 0.735651731491
valid_y_min_max_class : 0.182999402285
valid_y_misclass : 0.339673906565
valid_y_nll : 1.33173024654
valid_y_row_norms_max : 0.885183215141
valid_y_row_norms_mean : 0.643358230591
valid_y_row_norms_min : 0.481415838003
```

In this run, the objective and the NLL diverged. That could be down to this network using a different cost function from the other one.
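A divergence like this is consistent with the objective being the NLL plus extra cost terms, such as the valid_term_1_weight_decay entry in the logs above. A minimal numpy sketch of that composition (not the actual pylearn2 code; the weight matrix and decay coefficient here are made up):

```python
import numpy as np

# Sketch: the training objective is the NLL plus additional cost terms
# such as L2 weight decay, so it sits above the bare NLL whenever those
# terms are nonzero.
rng = np.random.default_rng(0)
W = rng.normal(size=(10, 5))   # hypothetical softmax weight matrix
nll = 1.33                     # stand-in for a valid_y_nll value
decay_coeff = 0.0005           # hypothetical L2 coefficient

weight_decay = decay_coeff * np.sum(W ** 2)
objective = nll + weight_decay
print(objective > nll)         # True for any positive decay term
```

So matching objective against the leaderboard's log loss would always overshoot by whatever the decay term contributes; the NLL is the comparable number.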

They say you should not be afraid of asking stupid questions. I hope they're right, because this is going to look pretty stupid in retrospect.

gngdb commented 9 years ago

The NLL of a neural network is the same as the log loss; that's definitely right. However, the score reported on the validation set above differs from the one calculated by our scripts, whether they're run on the validation or the holdout set. The script scores match the leaderboard, so I don't know what's going wrong. Maybe they're using different logs?
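For the record, the equality claim is easy to check directly, and one possible (unconfirmed here) source of small mismatches is that Kaggle-style log loss clips predicted probabilities away from 0 and 1 before taking logs, while a raw NLL does not. A minimal numpy sketch with made-up probabilities:

```python
import numpy as np

# Hypothetical predicted probabilities for 3 samples over 4 classes.
probs = np.array([
    [0.70, 0.10, 0.10, 0.10],
    [0.20, 0.50, 0.20, 0.10],
    [0.25, 0.25, 0.25, 0.25],
])
labels = np.array([0, 1, 3])  # true class indices

# Multiclass NLL: mean -log p(true class).
nll = -np.mean(np.log(probs[np.arange(len(labels)), labels]))

# Kaggle-style log loss: identical, except probabilities are clipped
# away from 0 and 1 and each row renormalised before taking logs.
eps = 1e-15
clipped = np.clip(probs, eps, 1 - eps)
clipped = clipped / clipped.sum(axis=1, keepdims=True)
log_loss = -np.mean(np.log(clipped[np.arange(len(labels)), labels]))

print(nll, log_loss)  # agree to floating-point precision here
```

When no predicted probability is exactly 0 or 1, the two scores agree; clipping only matters for extreme predictions, so it wouldn't explain a large gap like the one above.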

fmaguire commented 9 years ago

A bit redundant with #43