swechhasingh / Handwriting-synthesis

Implementation of "Generating Sequences With Recurrent Neural Networks" https://arxiv.org/abs/1308.0850
MIT License

Loss function #8

Open aleksandersawicki opened 2 years ago

aleksandersawicki commented 2 years ago

First, I would like to say that the project is very impressive, and I am grateful that you made it available online. However, I have some doubts about the correctness of the loss function.

In lines https://github.com/swechhachoudhary/Handwriting-synthesis/blob/f37fa68524f55257b7bef7df3246906625b01a0e/utils/model_utils.py#L31-L32

we find the term 0.5 * torch.log(epsilon + 1 - rho.pow(2))

If we refer to equation 24 of https://arxiv.org/pdf/1308.0850.pdf, it seems to me that the coefficient should instead be 2 * torch.log(epsilon + 1 - rho.pow(2)).
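For reference, here is a minimal pure-Python sketch of the log of equation 24 (the bivariate Gaussian density from Graves, 2013). The function name and epsilon handling are my own, not taken from the repository; the point is only to show where the coefficient on log(1 - rho^2) comes from: the square root in the normaliser 2*pi*s1*s2*sqrt(1 - rho^2) contributes 0.5 * log(1 - rho^2) to the log-density.

```python
import math

def bivariate_gaussian_log_density(x1, x2, mu1, mu2, s1, s2, rho, eps=1e-8):
    # Log of eq. 24 from Graves (2013), "Generating Sequences With RNNs".
    # Z from eq. 25: the Mahalanobis-like quadratic form.
    z = ((x1 - mu1) ** 2 / s1 ** 2
         + (x2 - mu2) ** 2 / s2 ** 2
         - 2 * rho * (x1 - mu1) * (x2 - mu2) / (s1 * s2))
    # log(2*pi*s1*s2*sqrt(1 - rho^2)) expands to the sum below;
    # the sqrt is why the last term carries a 0.5 coefficient.
    log_norm = (math.log(2 * math.pi) + math.log(s1) + math.log(s2)
                + 0.5 * math.log(eps + 1 - rho ** 2))
    # log N = -Z / (2*(1 - rho^2)) - log(normaliser)
    return -z / (2 * (1 - rho ** 2) + eps) - log_norm
```

Exponentiating this value reproduces the density of equation 24 directly, which is an easy way to sanity-check whichever coefficient is used in the loss.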

Please let me know if I am wrong. If you based this on another repository, please tell me which one.

Thanks once again.

bryandam commented 12 months ago

@aleksandersawicki: I know I'm necroposting, but did that change improve the output at all for you?

I tried with and without the change, and neither version produces reasonable output. I couldn't even get it to match the output of the demo video.