kundajelab / bpnet

Toolkit to train base-resolution deep neural networks on functional genomics data and to interpret them
http://bit.ly/bpnet-colab
MIT License

Normalizing by SeqLen in Profile Loss #25

Closed mhorlacher closed 3 years ago

mhorlacher commented 3 years ago

Dear authors,

First off, amazing work - I really enjoyed your paper!

I have a question regarding the normalization of the profile loss by sequence length. In the implementation of multinomial_nll in losses.py, the sum-reduced profile loss is normalized by seqlen; however, seqlen is defined as seqlen = tf.to_float(tf.shape(true_counts)[0]). Wouldn't this normalize the loss by the batch size, since the shape of true_counts is (batch, seqlen, channels)?
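For anyone following along, here is a minimal, self-contained sketch of the indexing behaviour in question. It is not the losses.py implementation itself; the tensor shapes are illustrative, and tf.cast is used in place of the deprecated tf.to_float:

```python
import tensorflow as tf

# Illustrative tensor with the (batch, seqlen, channels) layout from the question.
true_counts = tf.zeros((16, 1000, 2))

# The expression in question indexes the first entry of the shape vector,
# which is the batch dimension for a (batch, seqlen, channels) tensor:
seqlen = tf.cast(tf.shape(true_counts)[0], tf.float32)  # -> 16.0 (batch size)

# Normalizing by the actual sequence length would instead use index 1:
actual_seqlen = tf.cast(tf.shape(true_counts)[1], tf.float32)  # -> 1000.0
```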

Thanks in advance for clarifying!

Avsecz commented 3 years ago

Hi Marc,

I think you are right. I've updated the code to say it normalizes by 'batch size'.

Best,
Ziga

mhorlacher commented 3 years ago

Thanks!