rohitrango / objects-that-sound

Unofficial implementation of Google DeepMind's paper `Objects that Sound`
82 stars 16 forks

Cross entropy #1

Closed by Relja 6 years ago

Relja commented 6 years ago

Hi,

I just randomly bumped into this - nice work.

I had a brief look at the code and have one quick suggestion: during training, it's typically a good idea to use functions that merge the softmax and the cross-entropy loss into a single operation, for numerical stability (softmax involves an 'exp', and cross entropy then a 'log'). I'm not familiar with PyTorch, but this seems relevant: https://pytorch.org/docs/master/nn.html#torch.nn.BCEWithLogitsLoss
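To illustrate the instability, here is a minimal PyTorch sketch (the logit and target values are made up for demonstration, not taken from this repo's code):

```python
import torch
import torch.nn as nn

# Illustrative values: a confidently wrong prediction (large positive
# logit, target 0). In float32, sigmoid(20.) rounds to exactly 1.0,
# so BCELoss sees log(1 - 1) = log(0) and falls back to its internal
# clamp, producing a bogus loss value.
logits = torch.tensor([20.0])
target = torch.tensor([0.0])

# Naive two-step path: sigmoid first, then binary cross entropy.
naive = nn.BCELoss()(torch.sigmoid(logits), target)

# Fused path: works on the raw logits via the log-sum-exp trick.
fused = nn.BCEWithLogitsLoss()(logits, target)

print(naive.item())  # BCELoss's clamp value, not the true loss
print(fused.item())  # ~20.0, the correct loss
```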

Regards, Relja

rohitrango commented 6 years ago

Hi,

Thank you for the suggestion. I was aware of this issue in TensorFlow (I think the function was softmax_cross_entropy_with_logits) but wasn't sure about PyTorch. Anyway, I'll definitely implement it.
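For reference, the PyTorch analogue of that fused TensorFlow op is F.cross_entropy (log_softmax and NLL in one call). A quick sketch with deliberately extreme toy logits to show the difference:

```python
import torch
import torch.nn.functional as F

# Toy values chosen to break the naive path: with these logits,
# softmax underflows the non-maximal probabilities to exactly 0.0,
# and log(0) then yields -inf.
logits = torch.tensor([[1000.0, 0.0, -1000.0]])
target = torch.tensor([1])

# Naive two-step path: softmax, then log, then NLL.
probs = torch.softmax(logits, dim=1)
naive = F.nll_loss(torch.log(probs), target)   # inf

# Fused path: log_softmax + NLL computed jointly (log-sum-exp).
fused = F.cross_entropy(logits, target)        # 1000.0, the exact loss
```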

Also, it's really motivating to get valuable feedback from the authors of the paper. The paper is very clear, and I didn't run into any major difficulties understanding it. :smile: If there's more you wanted to ask or say, please feel free to.

Sincerely, Rohit

Relja commented 6 years ago

No problem. If you have any further questions, feel free to email me. It's always nice to see my work reimplemented / replicated!