annahdo / counterfactuals

Apache License 2.0

Another Toy Example #2

Closed p16i closed 12 months ago

p16i commented 1 year ago

Hi Ann-Kathrin,

I have been reading your paper for some time. Progress is quite slow for me because I lack background in differential geometry.

The idea of the paper is already quite clear from the Helix example, but I wonder what the eigenvalues of the inverse-induced metric $(\partial_z g) (\partial_z g)^\top$ would look like when we have more noise in the data; in this example, points from the Helix curve are perturbed in directions perpendicular to the curve.

With some help from @JanEGerken, I was able to derive closed-form expressions for those eigenvalues and see how they behave as the noise level varies.
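For anyone who wants to check such eigenvalues numerically rather than in closed form, here is a minimal sketch. It assumes a simple helix parametrization $g(z) = (\cos z, \sin z, az)$, which is only my stand-in for the paper's example (the pitch `a` and the omission of noise are my simplifications):

```python
import numpy as np

# Hypothetical helix decoder g(z) = (cos z, sin z, a*z); a stand-in, not the paper's exact setup.
a = 0.5

def jacobian(z):
    # dg/dz as a 3x1 column vector
    return np.array([[-np.sin(z)], [np.cos(z)], [a]])

z = 1.3
J = jacobian(z)
M = J @ J.T  # (∂_z g)(∂_z g)^T, a rank-1 3x3 matrix
eigvals = np.sort(np.linalg.eigvalsh(M))

# With a 1-D latent, the single nonzero eigenvalue is |∂_z g|^2 = 1 + a^2,
# independent of z for this parametrization.
print(eigvals)
```

Because the latent is one-dimensional here, the outer product has rank one, so two eigenvalues vanish and the remaining one equals the squared speed of the curve.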

Nevertheless, I found the derivations quite cumbersome. So, I came up with another toy example where the generative model is assumed to be linear. Although the setup is quite trivial, it may serve as an alternative configuration for illustrating the idea of the paper without requiring much familiarity with differential geometry.
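To make the linear setup concrete, here is a small sketch under my own assumptions (the shapes of `A` and `b` and the 2-D latent are arbitrary choices, not taken from the notebook). For a linear decoder $g(z) = Az + b$, the Jacobian is $A$ everywhere, so $(\partial_z g)(\partial_z g)^\top = AA^\top$ is constant and its nonzero eigenvalues are the squared singular values of $A$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear decoder g(z) = A z + b mapping 2-D latents to 5-D data.
A = rng.standard_normal((5, 2))
b = rng.standard_normal(5)

def g(z):
    return A @ z + b

# The Jacobian of g is A at every z, so the metric term is constant:
M = A @ A.T
top_eigs = np.sort(np.linalg.eigvalsh(M))[::-1][:2]
sq_singvals = np.sort(np.linalg.svd(A, compute_uv=False) ** 2)[::-1]
print(np.allclose(top_eigs, sq_singvals))
```

This is what makes the example easy to analyze: nothing about the metric depends on where you are in latent space.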

You can find my demo at: https://colab.research.google.com/drive/1TsYLr1z5Hs43tvahamKndthcbZzThHPb?usp=sharing.

I would appreciate any comments you might have on this toy setup.

annahdo commented 12 months ago

Hi Pat, sorry for the late reply.

Yes, I think your setup is a nice example. And since the inverse-induced metric is constant, one can actually reach the counterfactual in a single step (assuming you know how large the step should be), which reminds me a bit of this approach, though they find the direction by minimizing a loss function.
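The one-step property can be sketched as follows, again under my own linear-decoder assumption (the pseudo-inverse step is my illustration of "one step", not code from either the paper or the notebook). Because the Jacobian `A` is constant, a single least-squares latent step lands exactly on the reachable counterfactual:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical linear decoder; one latent step reaches the counterfactual.
A = rng.standard_normal((5, 2))
g = lambda z: A @ z

z = rng.standard_normal(2)
x = g(z)
x_target = rng.standard_normal(5)  # desired data-space point

# With a constant Jacobian, dz = A^+ (x_target - x) is exact in one step.
dz = np.linalg.pinv(A) @ (x_target - x)
x_new = g(z + dz)

# x_new is the projection of x_target onto the range of the decoder,
# i.e. the closest point of the data manifold to the target.
print(np.allclose(A @ np.linalg.pinv(A) @ x_target, x_new))
```

For a nonlinear decoder the Jacobian changes along the path, which is exactly why the general case needs many small steps instead of one.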