-
There is a Google paper, "Challenging Common Assumptions in the Unsupervised Learning of Disentangled Representations":
https://arxiv.org/abs/1811.12359
They have some nice evaluation metrics for u…
-
Abstract: Disentangled representations, where the higher-level data-generative factors are reflected in disjoint latent dimensions, offer several benefits such as ease of deriving invariant representa…
-
## Summary
This paper studies the geometry of state distributions learned with mutual information skill learning for the purpose of theoretical task adaptation analysis. The authors propose Least S…
-
In the official TensorFlow blog posts, it was stated that `tf.layers` would be deprecated and that `tf.keras.layers` would be the preferred replacement. However, in _architectures.py_, `tf.layers` has been used e…
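For reference, the migration is usually mechanical: each `tf.layers` function maps to a `tf.keras.layers` class. A minimal sketch of the replacement pattern (the layer sizes and activation here are illustrative, not taken from _architectures.py_):

```python
import numpy as np
import tensorflow as tf

# Dummy batch of 4 inputs with 8 features (illustrative shapes).
x = tf.constant(np.random.randn(4, 8).astype("float32"))

# Old style (deprecated in TF 1.x, removed in TF 2.x):
#   h = tf.layers.dense(x, units=16, activation=tf.nn.relu)

# Preferred style: instantiate a Keras layer object, then call it.
dense = tf.keras.layers.Dense(units=16, activation="relu")
h = dense(x)

print(h.shape)  # (4, 16)
```

One practical difference: the Keras layer object owns its variables, so reusing the same `dense` instance shares weights, whereas `tf.layers.dense` relied on variable scopes for reuse.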
-
Hi,
Could you please add our new work to this list? The paper is about learning disentangled representations for domain adaptation.
You could find the paper here: https://arxiv.org/abs/…
-
>Parameterizing the approximate posterior of a generative model with neural networks has become a common theme in recent machine learning research. While providing appealing flexibility, this approach…
-
* Paper title: SimTS: Rethinking Contrastive Representation Learning for Time Series Forecasting
* Field: time series
* Paper link: https://arxiv.org/pdf/2303.18205.pdf
* Presentation slides: https://cottony-wedelia-967.no…
-
Expression Transfer:
"GANimation: Anatomically-aware Facial Animation from a Single Image" (Pumarola et al., 2018)
"MeshTalk: 3D Face Animation from Speech using Cross-Modal Disentanglement" (Rich…
-
In your paper you use the decomposition method, but I can't find the corresponding module in the code. Could you point me to where it is?
-
Hi Mingjie, very cool project and very nice work!! This is exactly what we need -- more comparisons and analyses. Just wondering if you have any insights you can share? Although I bet you would rather…