-
When training the system, the code should use the MSE or MAE loss, and in the autoencoder the L1 loss is used according to the paper. However, in the PyTorch implementation, I found that you have used…
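For reference, the criteria mentioned above are exposed directly in PyTorch as `nn.MSELoss` and `nn.L1Loss` (the L1 loss being the mean absolute error); a minimal sketch with arbitrary tensor values:

```python
import torch
import torch.nn as nn

# Arbitrary prediction/target pair, just to compare the two criteria.
pred = torch.tensor([0.0, 1.0, 2.0])
target = torch.tensor([0.5, 1.0, 1.0])

mse = nn.MSELoss()(pred, target)  # mean of squared differences
mae = nn.L1Loss()(pred, target)   # mean of absolute differences (L1)

print(mse.item(), mae.item())
```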
-
Hello, I'm training some larger models, and they were getting stuck in the warmup phase - that's when I realized, looking at the documentation string copied below, that I shouldn't be relying on the d…
-
Hello,
First of all I would like to thank you for releasing your implementation of your paper.
I have one question about the hidden size of the main auto-encoder
It seems to me that the UCLA…
-
Hello,
I read the paper a couple of times, and everything was clear to me (for now) except for two points.
First, why are you setting the number of dimensions to be the number of clusters? (when you ar…
-
Hi,
Thanks for making this code available.
I'm having an issue with my training. I have my reference dataset, and I have split it like this:
```
lymphoid_train = lymphoid[msk]
lymphoid_trai…
```
-
I am testing this out for a music-similarity dataset, which does not have a defined number of clusters. Would your DTC library work the same for use with Agglomerative Clustering where `{n_clusters=No…
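I can't speak for the DTC library itself, but for context this is how scikit-learn's `AgglomerativeClustering` behaves with `n_clusters=None`: a `distance_threshold` must be supplied, and the number of clusters is then read off the fitted model. The data below is a made-up stand-in for music-similarity features:

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering

# Toy 2-D features standing in for music-similarity embeddings.
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])

# n_clusters=None requires distance_threshold; clusters are then
# inferred by cutting the dendrogram at that threshold.
agg = AgglomerativeClustering(n_clusters=None, distance_threshold=1.0)
labels = agg.fit_predict(X)
print(agg.n_clusters_, labels)
```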
-
Hi all,
Thanks for sharing a great model for deep clustering. I aim to leverage your excellent work to cluster images for my project.
Based on your paper, the network structure is a basis of Sta…
-
Thanks for the useful library!
When trying to use the saving and loading function
`hcluster = n2d.load_n2d('models/har.h5', 'models/hargmm.sav')`
I got the error
`RuntimeError: incompatible bytecode vers…
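For what it's worth, errors like this often appear when a pickled `.sav` file is loaded under a different Python or library version than the one that wrote it. A minimal round-trip check, assuming (as the `.sav` extension suggests) that the GMM half of the model was serialized with joblib; `hargmm.sav` here is just an illustrative filename:

```python
import numpy as np
import joblib
from sklearn.mixture import GaussianMixture

# Fit a small GMM and round-trip it through joblib in the *same*
# environment; if this succeeds but loading the original file fails,
# the saved file likely came from a mismatched Python/library version.
rng = np.random.default_rng(0)
gmm = GaussianMixture(n_components=2, random_state=0).fit(rng.random((50, 3)))
joblib.dump(gmm, "hargmm.sav")
restored = joblib.load("hargmm.sav")
print(restored.n_components)
```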
-
Hello Geron! Thank you for the great book; it is no understatement to say that it has helped me advance my career! As a data science research intern at a Medical School, I have a question:
What is…
-
Hi psanch21,
Thanks for sharing the code and your thesis.
As we know, the paper “Deep Unsupervised Clustering with Gaussian Mixture Variational Autoencoders” reports the clustering ACC on MNI…