rosinality / glow-pytorch

PyTorch implementation of Glow

`n_bins` value #17

Open talesa opened 4 years ago

talesa commented 4 years ago

I think there might be a tiny mistake in the dequantization process at the moment.

I think that https://github.com/rosinality/glow-pytorch/blob/master/train.py#L99 should be `n_bins = 2. ** args.n_bits - 1.` rather than `n_bins = 2. ** args.n_bits`. As far as I understand, the minimum difference between the input levels/bin values, `(a[1:] - a[:-1]).min()`, should equal `1 / n_bins`. The snippet below was run after `image, _ = next(dataset)` on line 109 of train.py (https://github.com/rosinality/glow-pytorch/blob/master/train.py#L109):

```
In[1]: a = torch.unique(image.reshape(-1))

In[2]: (a[1:]-a[:-1]).min().item()
Out[2]: 0.003921568393707275

In[3]: 1/255.
Out[3]: 0.00392156862745098

In[4]: 1/256.
Out[4]: 0.00390625
```
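
For reference, the same spacing can be reproduced without loading the dataset. This is a minimal sketch assuming the loader uses torchvision's `ToTensor`, which maps 8-bit pixel values k to k / 255:

```python
import torch

# 8-bit pixel levels as produced by torchvision's ToTensor: k / 255 for k = 0..255
levels = torch.arange(256, dtype=torch.float32) / 255.0

# Smallest gap between distinct normalized levels
spacing = (levels[1:] - levels[:-1]).min().item()

print(spacing)               # 0.0039215686... == 1 / 255
print(1.0 / (2.0 ** 8 - 1))  # 1 / 255, i.e. n_bins = 2. ** n_bits - 1.
print(1.0 / 2.0 ** 8)        # 1 / 256, the value train.py currently computes
```

So with the data kept at 8 bits, dequantization noise of width `1 / 256` is slightly narrower than the gap between levels, while `1 / 255` tiles it exactly.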

Also, it's a bit confusing that `n_bits` defaults to 5, whereas CelebA is an 8-bit dataset; I'd change the default to 8.

rosinality commented 4 years ago

Yes, it seems that the treatment of `n_bits` is problematic compared to the official implementation. (I don't know why I missed it.)

I used `n_bits = 5` because the official implementation uses it for CelebA-HQ.

talesa commented 4 years ago

> I used `n_bits = 5` because the official implementation uses it for CelebA-HQ.

It seems to me that this implementation only uses the 8-bit version of the dataset (the default, if I'm not mistaken), since it doesn't decrease the bit depth of the input data the way https://github.com/openai/glow/blob/654ddd0ddd976526824455074aa1eaaa92d095d8/model.py#L153-L158 does. Correct me if I'm wrong somewhere; I don't know the openai/glow repo well.
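
For context, a rough PyTorch translation of the openai/glow preprocessing linked above might look like the sketch below; it assumes the image comes out of `ToTensor` in [0, 1], and the helper name is illustrative rather than taken from either repo:

```python
import torch

def quantize_to_n_bits(image: torch.Tensor, n_bits: int = 5) -> torch.Tensor:
    """Reduce an image in [0, 1] to n_bits levels, mirroring openai/glow's preprocessing."""
    n_bins = 2.0 ** n_bits
    image = torch.round(image * 255)  # back to integer levels; round guards against float error
    if n_bits < 8:
        # Drop the low-order bits, as in openai/glow model.py L153-158
        image = torch.floor(image / 2 ** (8 - n_bits))
    # Rescale to roughly [-0.5, 0.5]; uniform dequantization noise of width 1/n_bins is added later
    return image / n_bins - 0.5
```

With that reduction in place, `n_bins = 2. ** n_bits` matches the level spacing again, since the quantized levels are multiples of `1 / n_bins`.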

rosinality commented 4 years ago

I forgot to add it. Anyway, 97081ff will resolve the issue.

talesa commented 4 years ago

Thanks a lot! I'm sorry I didn't create a pull request straight away myself; I just wanted to check it with you first!