rymc / n2d

A deep clustering algorithm. Code to reproduce results for our paper N2D: (Not Too) Deep Clustering via Clustering the Local Manifold of an Autoencoded Embedding.
GNU General Public License v3.0

Unable to replicate the results #15

Closed Mayurji closed 3 years ago

Mayurji commented 3 years ago

I am not sure whether this question belongs under issues, but I'm still going to ask.

Hi @rymc, I am trying to replicate the results on the MNIST dataset alone. I am a little confused about the way the model is trained.

How was the encoder model trained, since we are using it to predict on images? Thanks in advance.

rymc commented 3 years ago

Hi,

Thank you for your interest.

The separate model created on line 357 holds a reference to the encoder portion of the autoencoder. When the autoencoder is later trained, the encoder model therefore uses the trained representation of the autoencoder's encoder.

The prediction on line 385 is the process of encoding the input image using the autoencoder we trained. Perhaps the function name 'predict' is a little misleading: it is not predicting the class of the data; rather, it is encoding the data.
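The weight-sharing idea can be sketched as follows. This is a minimal NumPy toy, not the actual n2d/Keras code; the layer sizes and the `Encoder`/`Autoencoder` class names are hypothetical. The point is that the encoder holds a *reference* to the same weights the autoencoder trains, so "predicting" with it afterwards just produces the learned embedding.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy weights for a one-hidden-layer autoencoder (hypothetical sizes).
W_enc = rng.standard_normal((784, 64)) * 0.01
W_dec = rng.standard_normal((64, 784)) * 0.01

class Encoder:
    """Holds a reference to the encoder weights, not a copy."""
    def __init__(self, W):
        self.W = W

    def predict(self, x):
        # 'predict' here just encodes the data into the embedding space.
        return np.maximum(x @ self.W, 0.0)  # ReLU embedding

class Autoencoder:
    def __init__(self, W_enc, W_dec):
        self.W_enc, self.W_dec = W_enc, W_dec

    def train_step(self, x, lr=1e-3):
        # One gradient step of MSE reconstruction (ReLU encoder, linear decoder).
        h = np.maximum(x @ self.W_enc, 0.0)
        x_hat = h @ self.W_dec
        err = x_hat - x
        grad_dec = h.T @ err / len(x)
        grad_h = (err @ self.W_dec.T) * (h > 0)
        grad_enc = x.T @ grad_h / len(x)
        self.W_dec -= lr * grad_dec  # in-place updates, so any Encoder
        self.W_enc -= lr * grad_enc  # referencing W_enc sees them too

ae = Autoencoder(W_enc, W_dec)
enc = Encoder(W_enc)  # same underlying array as ae.W_enc

x = rng.standard_normal((32, 784))
before = enc.predict(x).copy()
ae.train_step(x)
after = enc.predict(x)
# The embeddings change after training the autoencoder,
# even though `enc` itself was never trained separately.
```

In Keras the same effect comes from building a second `Model` whose output is an intermediate (encoder) layer of the autoencoder: both models share the underlying layer weights.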

As this is an unsupervised method, there isn't really any prediction step (in the classification sense) at all. Instead, we are clustering the images. We use the labels of the images only to measure the quality of the resulting clusters.
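The evaluation step can be sketched as follows. This is a toy example, not the repository's exact metric code; `cluster_acc` is a hypothetical helper name. It computes the standard unsupervised clustering accuracy: find the best one-to-one mapping between cluster indices and true labels (Hungarian algorithm), then report the fraction of points matched.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def cluster_acc(y_true, y_pred):
    """Unsupervised clustering accuracy via optimal one-to-one matching."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    n = max(y_pred.max(), y_true.max()) + 1
    # cost[p, t] counts points assigned to cluster p with true label t.
    cost = np.zeros((n, n), dtype=int)
    for t, p in zip(y_true, y_pred):
        cost[p, t] += 1
    # Negate to turn the max-matching problem into a min-cost assignment.
    row, col = linear_sum_assignment(-cost)
    return cost[row, col].sum() / len(y_true)

y_true = [0, 0, 1, 1, 2, 2]
y_pred = [1, 1, 0, 0, 2, 2]  # clusters are right, labels permuted
print(cluster_acc(y_true, y_pred))  # → 1.0
```

Because the cluster indices are arbitrary, a permutation-invariant match like this is the usual way labels enter the picture: only at evaluation time, never during training.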

If you've any questions about the ideas behind this, feel free to send me an email.

If you've any issues reproducing the results, feel free to open a new issue.

Mayurji commented 3 years ago

Thanks @rymc, I have managed to replicate the results using autoencoder implementation in Pytorch.

abduallahmohamed commented 3 years ago

@Mayurji are you able to share your Pytorch implementation?

Mayurji commented 3 years ago

@abduallahmohamed, yes, I was able to replicate the results on MNIST; for other datasets, we can create a PyTorch dataloader. Check the GitHub repo for the model implementation.