-
## In a nutshell
A study on a method for generating point clouds. It performs a two-stage estimation (first estimating a shape, then estimating the point cloud from that shape), with the whole pipeline modeled as a Continuous Normalizing Flow (CNF). To increase the expressiveness of the shape distribution, which serves as the prior for point-cloud estimation, an encoder (Q) is used to learn a CNF over the shape distribution.
![image](https://user-image…
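The two-stage CNF idea rests on the instantaneous change-of-variables formula, d log p(z(t))/dt = -tr(∂f/∂z). A minimal sketch with toy linear dynamics, not the paper's model; `f`, `cnf_forward`, and `W` are illustrative names:

```python
import numpy as np

def f(z, t, W):
    """Toy linear dynamics dz/dt = W @ z (stand-in for a learned net)."""
    return W @ z

def cnf_forward(z0, W, n_steps=100, t1=1.0):
    """Euler-integrate the state and the log-density change together."""
    dt = t1 / n_steps
    z = z0.copy()
    delta_logp = 0.0
    trace = np.trace(W)  # tr(df/dz) is constant for linear dynamics
    for k in range(n_steps):
        z = z + dt * f(z, k * dt, W)
        delta_logp -= dt * trace  # instantaneous change of variables
    return z, delta_logp

W = np.array([[0.0, -1.0], [1.0, 0.0]])  # a rotation: volume-preserving
z1, dlogp = cnf_forward(np.array([1.0, 0.0]), W)
# tr(W) = 0, so the accumulated density change is exactly zero here.
```

In the paper's setup the same mechanism runs twice: once for the shape prior learned through the encoder, and once for the points conditioned on the shape.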
-
We could then use this class to document methods like ``tfep.app.TFEPMapBase.configure_flow()``.
-
I followed this repo, the paper, and the Glow paper. My loss looks like this:

```python
logp = GaussianDiag.logp(None, None, z)
obj = logdets + logp
loss = -obj / (tf.math.log(2.) * tf.ca…
```
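The truncated denominator looks like the standard bits-per-dimension conversion used in Glow-style flows: negate the total log-likelihood in nats and divide by log(2) times the number of dimensions. Assuming that, a self-contained sketch (`bits_per_dim` is an illustrative helper, not from the repo):

```python
import numpy as np

def bits_per_dim(logp, logdet, num_dims):
    """Convert a flow's log-likelihood (in nats) to bits per dimension.
    `logp` is the base-density log-prob of z, `logdet` the
    log|det Jacobian| of the flow; both per-sample, in nats."""
    obj = logp + logdet                    # total log-likelihood, nats
    return -obj / (np.log(2.0) * num_dims)

# e.g. a 32x32x3 image whose total log-likelihood is -8000 nats:
nll_bits = bits_per_dim(logp=-7000.0, logdet=-1000.0, num_dims=32 * 32 * 3)
```

Dividing by log(2) changes the base from nats to bits; dividing by the dimension count makes losses comparable across image sizes.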
-
Congrats and excellent work!
Maybe a dumb question, but what's the difference between this and a VAE, and how are they related?
-
Now that we have a working prototype of a C^k coupling layer (from #10), we can try to build a full flow that can be trained with the score.
This issue is to document our tests and resu…
-
Hello, this is great work.
I have a question about the experiment with Conditioner=DAG and Normalizer=Monotonic:
how do you compute the inverse transformation of the Graphical Normalizing Flow?
That is, how to get…
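One common answer, though not necessarily the authors' implementation: a monotonic normalizer has no closed-form inverse, so each scalar output is inverted numerically (e.g. by bisection), processing the variables in the DAG's topological order so that every variable's conditioning inputs are already recovered. A hypothetical sketch for the scalar step (`g` and `invert_monotonic` are illustrative names):

```python
import numpy as np

def invert_monotonic(g, y, lo=-100.0, hi=100.0, tol=1e-10, max_iter=200):
    """Invert a strictly increasing scalar map g by bisection.
    Assumes the root lies in [lo, hi]."""
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        if g(mid) < y:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Toy monotonic transform standing in for the learned normalizer.
g = lambda x: x + np.tanh(x)
x = invert_monotonic(g, g(0.7))   # recovers ~0.7
```

In a graphical flow, this scalar inversion would be repeated per variable, with the conditioner's output for that variable recomputed from the already-inverted parents.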
-
# Title
NODEA - Neural Ordinary Differential Equations Anonymous
# Description
This session is intended as an introduction to neural ordinary differential equations (Neural ODEs). As on…
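The core idea a session like this would cover: a Neural ODE replaces a stack of residual layers with a continuous-time dynamics function integrated by an ODE solver. A minimal fixed-step sketch; the dynamics `f` and helper names are illustrative, not from the session materials:

```python
import numpy as np

def f(z, t, params):
    """Hypothetical one-layer dynamics: dz/dt = tanh(W z + b)."""
    W, b = params
    return np.tanh(W @ z + b)

def odeint_euler(f, z0, t0, t1, params, n_steps=100):
    """Fixed-step Euler solver. Real Neural ODE code would use an
    adaptive solver (e.g. torchdiffeq or diffrax) and the adjoint
    method for memory-efficient gradients."""
    dt = (t1 - t0) / n_steps
    z = z0.copy()
    for k in range(n_steps):
        z = z + dt * f(z, t0 + k * dt, params)
    return z

params = (np.eye(2) * 0.1, np.zeros(2))
z1 = odeint_euler(f, np.array([1.0, -1.0]), 0.0, 1.0, params)
```

Each Euler step is exactly a residual update z ← z + dt·f(z), which is the usual way the ResNet/ODE connection is introduced.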
-
A user requested that we look into accepting NetFlow v9 as a flow data source. I believe there are already NetFlow inputs for Logstash and Filebeat, so the plumbing is there. The majority of the work would…
-
I implemented the [CNF example](https://docs.kidger.site/diffrax/examples/continuous_normalising_flow/) and added it as a head to a transformer to do inference over some continuous variables in a prob…
-
The training and sampling strategy is similar to the SIGGRAPH 2023 paper: **Iterative 𝛼-(de)Blending: a Minimalist Deterministic Diffusion Model**.
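For context, the IADB recipe trains a network D(x_α, α) on blends x_α = (1−α)·x0 + α·x1 to predict the direction x1 − x0, and samples by small deterministic Euler steps along that prediction. A toy sketch with an oracle standing in for the trained network; all names are illustrative and the paper should be consulted for exact details:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample(model, x0, n_steps=128):
    """Deterministic sampling: Euler steps along the predicted
    blending direction, alpha going from 0 to 1."""
    x = x0.copy()
    for k in range(n_steps):
        a = k / n_steps
        x = x + (1.0 / n_steps) * model(x, a)
    return x

x0 = rng.standard_normal(3)           # "noise" endpoint
x1 = np.array([1.0, 2.0, 3.0])        # "data" endpoint
oracle = lambda x, a: x1 - x0         # stand-in for the trained net
x = sample(oracle, x0)
# With the exact oracle, the Euler steps sum to x1 - x0, landing on x1.
```

Training would regress D(x_α, α) toward x1 − x0 over many sampled pairs and α values; the oracle above just makes the update rule visible.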