LiuLei95 / PyTorch-Learned-Image-Compression-with-GMM-and-Attention

This repo is an implementation of Learned Image Compression with Discretized Gaussian Mixture Likelihoods and Attention Modules in PyTorch.

gaussian1 = torch.distributions.laplace.Laplace(mean1, sigma1) ? #1

Open JiangWeibeta opened 3 years ago

JiangWeibeta commented 3 years ago

I think this should be torch.distributions.normal.Normal. Why do you use Laplace here? Could you tell me the reason? Thank you.

MillionLee commented 3 years ago

I also want to know.

LiuLei95 commented 3 years ago

I was also interested in this question when I wrote the code. The base code is copied from here, and I tested both torch.distributions.normal.Normal and Laplace. The results show that the two methods perform similarly. If you look at the density curves of the Gaussian and Laplace distributions, you can see that their shapes are similar.
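For context, here is a minimal sketch of the discretized-likelihood computation being discussed. The names `mean1` and `sigma1` follow the snippet in the issue title, but the shapes and the half-bin offset of 0.5 are illustrative assumptions, not code from this repo. Both distribution classes expose the same `.cdf()` method, which is why swapping one for the other is a one-line change:

```python
import torch

# Illustrative shapes only: a quantized latent and its predicted parameters
y = torch.randn(1, 192, 16, 16)
mean1 = torch.zeros_like(y)      # predicted mean
sigma1 = torch.ones_like(y)      # predicted scale, must be positive

gaussian1 = torch.distributions.normal.Normal(mean1, sigma1)
# gaussian1 = torch.distributions.laplace.Laplace(mean1, sigma1)  # drop-in alternative

# Probability mass of each integer-quantized symbol: F(y + 0.5) - F(y - 0.5)
likelihood = gaussian1.cdf(y + 0.5) - gaussian1.cdf(y - 0.5)

# Rate term: expected bits per latent element
bits = -torch.log2(likelihood.clamp_min(1e-9)).mean()
```

Because only the CDF is used, any location-scale family with a closed-form CDF slots in the same way, which is consistent with the two distributions giving similar results.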

MillionLee commented 3 years ago

Thanks for your reply! I also tested Gaussian and Laplace at the same time, and they have similar performance, as you said. I think this is because a mixture of three Gaussian distributions or three Laplace distributions can both model the latent representation well.
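A hedged sketch of the three-component mixture mentioned above (all tensor names and shapes here are hypothetical, not taken from the repo): the per-element likelihood becomes a softmax-weighted sum of three discretized components, and the component family is again interchangeable.

```python
import torch
import torch.nn.functional as F

K = 3                                          # number of mixture components
y = torch.randn(1, 192, 16, 16)                # quantized latent (illustrative)
means = torch.randn(K, 1, 192, 16, 16)         # one mean per component
scales = torch.rand(K, 1, 192, 16, 16) + 0.1   # positive scales
logits = torch.randn(K, 1, 192, 16, 16)
weights = F.softmax(logits, dim=0)             # mixture weights sum to 1 over K

dist = torch.distributions.normal.Normal(means, scales)
# dist = torch.distributions.laplace.Laplace(means, scales)  # performs similarly

# Discretized mass of each component, then the weighted mixture
component_mass = dist.cdf(y + 0.5) - dist.cdf(y - 0.5)   # y broadcasts over K
likelihood = (weights * component_mass).sum(dim=0)
bits = -torch.log2(likelihood.clamp_min(1e-9)).mean()
```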