1Konny / FactorVAE

PyTorch implementation of FactorVAE proposed in Disentangling by Factorising (http://arxiv.org/abs/1802.05983)
MIT License

Disentanglement metric #9

Closed: abdulfatir closed this issue 6 years ago

abdulfatir commented 6 years ago

Did you by any chance also implement their metric?

Great repository! 👍

1Konny commented 6 years ago

Hey @abdulfatir, thank you for your attention.

Yes, I have implemented the metric, but I haven't uploaded the code publicly yet.

ggand0 commented 5 years ago

Is it possible for you to release your implementation of the disentanglement metric? I have implemented Higgins' metric, but I am having trouble reproducing the results reported in the paper.

abdulfatir commented 5 years ago

@pentiumx Are you talking about Higgins' metric or Kim's?

ggand0 commented 5 years ago

@abdulfatir I'm talking about Higgins' metric, but I'm also planning to implement Kim's metric.

So far I have attempted to reproduce Figure 4 of the paper with my implementations of Beta-VAE and FactorVAE, but I was not able to reproduce the results. I used the same hyperparameters described in Appendix B to train the linear classifier, but I'm not getting similar accuracies. For example, even a Beta-VAE (beta=1) trained for one epoch gets a very high accuracy (>0.95). Latent traversals seem to work fine for beta=4 models, though.
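
For concreteness, here is a minimal sketch of how Higgins' metric can be computed. The `sample_factors` / `sample_observations_from_factors` helpers, the `num_factors` attribute, the `encoder` returning latent means, and the logistic-regression stand-in for the paper's linear classifier are assumptions made for illustration, not code from this repository or the paper:

```python
# Minimal sketch of Higgins' (beta-VAE) disentanglement metric.
# Assumes a dSprites-like `dataset` with ground-truth factor access and an
# `encoder` mapping a batch of observations to latent means (hypothetical API).
import numpy as np
import torch
from sklearn.linear_model import LogisticRegression

@torch.no_grad()
def higgins_metric(encoder, dataset, num_train=500, num_eval=100, L=64, device='cpu'):
    def make_points(n):
        X, y = [], []
        for _ in range(n):
            k = np.random.randint(dataset.num_factors)       # factor to keep fixed
            f1 = dataset.sample_factors(L)                    # (L, num_factors)
            f2 = dataset.sample_factors(L)
            f2[:, k] = f1[:, k]                               # fix factor k across each pair
            x1 = dataset.sample_observations_from_factors(f1)
            x2 = dataset.sample_observations_from_factors(f2)
            z1 = encoder(torch.as_tensor(x1, dtype=torch.float32, device=device))
            z2 = encoder(torch.as_tensor(x2, dtype=torch.float32, device=device))
            z_diff = (z1 - z2).abs().mean(dim=0)              # average |z1 - z2| over L pairs
            X.append(z_diff.cpu().numpy())
            y.append(k)
        return np.stack(X), np.array(y)

    X_train, y_train = make_points(num_train)
    X_eval, y_eval = make_points(num_eval)
    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)  # low-capacity linear classifier
    return clf.score(X_eval, y_eval)                                # metric = classification accuracy
```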

I could not find any existing implementations of these metrics on the web, so I am hoping to refer to @1Konny's code if that's possible.
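
Similarly, here is a rough sketch of Kim's metric (the majority-vote metric proposed in the FactorVAE paper), under the same hypothetical dataset/encoder interface as the sketch above. The pruning threshold and sample counts are illustrative defaults, not values taken from this repository:

```python
# Rough sketch of Kim's (FactorVAE) disentanglement metric:
# fix one factor, find the latent dimension with the least variance, and use a
# majority-vote classifier from latent dimensions to factors.
import numpy as np
import torch

@torch.no_grad()
def kim_metric(encoder, dataset, num_train=800, num_eval=800, L=100, device='cpu'):
    # Empirical std of each latent dimension over random samples; prune collapsed dims.
    z_all = encoder(torch.as_tensor(
        dataset.sample_observations_from_factors(dataset.sample_factors(10000)),
        dtype=torch.float32, device=device))
    global_std = z_all.std(dim=0).cpu().numpy()
    active = global_std > 0.05                               # threshold is an assumption

    def collect_votes(n):
        votes = np.zeros((int(active.sum()), dataset.num_factors), dtype=np.int64)
        for _ in range(n):
            k = np.random.randint(dataset.num_factors)       # factor to keep fixed
            f = dataset.sample_factors(L)
            f[:, k] = f[0, k]                                # fix factor k, others vary
            z = encoder(torch.as_tensor(
                dataset.sample_observations_from_factors(f),
                dtype=torch.float32, device=device)).cpu().numpy()
            z_norm = z[:, active] / global_std[active]       # normalize by empirical std
            d = np.argmin(z_norm.var(axis=0))                # least-varying latent dimension
            votes[d, k] += 1
        return votes

    train_votes = collect_votes(num_train)
    classifier = train_votes.argmax(axis=1)                  # majority vote: dim -> factor

    eval_votes = collect_votes(num_eval)
    correct = eval_votes[np.arange(len(classifier)), classifier].sum()
    return correct / eval_votes.sum()                        # metric = vote accuracy
```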