danielward27 / flowjax

https://danielward27.github.io/flowjax/
MIT License

Implementing inverse in Planar flow #169

Closed: weiyaw closed this issue 1 month ago

weiyaw commented 1 month ago

Thanks for creating flowjax! This is a godsend for my research.

I'm wondering if there is a plan to implement the inverse in Planar in the near future? I understand that there's no closed-form inverse when using the tanh activation. Perhaps we could have a version with LeakyReLU, e.g. https://github.com/VincentStimper/normalizing-flows/blob/master/normflows/flows/planar.py?
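For context, the planar transform is y = x + u * h(w @ x + b); with a LeakyReLU activation h it stays piecewise linear along w, which is what opens up an analytic inverse. A minimal sketch of the forward map (illustrative only, not flowjax's code; names are made up):

```python
import jax.numpy as jnp

def planar_forward(x, w, u, bias, negative_slope=0.01):
    """Planar transform y = x + u * leaky_relu(w @ x + bias).

    Because LeakyReLU is piecewise linear, w @ y + bias is a piecewise
    linear (and, under the usual constraint on u, monotone) function of
    w @ x + bias, so the transform can be inverted in closed form.
    """
    act = w @ x + bias
    h = jnp.where(act >= 0, act, negative_slope * act)  # LeakyReLU
    return x + u * h
```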

danielward27 commented 1 month ago

Thanks! I've implemented this and made a pull request https://github.com/danielward27/flowjax/pull/170. It would be great if you could give it a quick look over if you have the time. I do wonder if a more general implementation is possible to support more (non-transcendental) activation functions, but I think what I have is probably OK unless you have any suggestions.
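For anyone following along, the inverse works piecewise because LeakyReLU is piecewise linear. A rough sketch of the idea (illustrative only, not the PR's actual code, and assuming the usual invertibility constraint that both scale factors below stay positive):

```python
import jax.numpy as jnp

def planar_inverse(y, w, u, bias, negative_slope=0.01):
    """Invert y = x + u * leaky_relu(w @ x + bias).

    Writing z = w @ x + bias, we have
        w @ y + bias = z * (1 + w @ u)                   if z >= 0
        w @ y + bias = z * (1 + negative_slope * w @ u)  if z <  0
    When both scale factors are positive, sign(w @ y + bias) = sign(z),
    so z (and hence x) can be recovered in closed form.
    """
    wu = w @ u
    a = w @ y + bias
    z = jnp.where(a >= 0, a / (1 + wu), a / (1 + negative_slope * wu))
    h = jnp.where(z >= 0, z, negative_slope * z)
    return y - u * h
```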

weiyaw commented 1 month ago

Thank you for the very quick response and implementation! I haven't had the chance to run it yet, but I have done the math, and most of it agrees with my calculations, except for the log-determinant. Could you please double-check?
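For reference, applying the matrix determinant lemma to the planar Jacobian gives:

```math
\frac{\partial y}{\partial x} = I + h'(w^\top x + b)\, u w^\top,
\qquad
\log\left|\det\frac{\partial y}{\partial x}\right| = \log\left|1 + h'(w^\top x + b)\, u^\top w\right|,
```

where for LeakyReLU, h'(z) = 1 for z ≥ 0 and h'(z) equals the negative slope otherwise, so the log-determinant differs between the two domains.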

danielward27 commented 1 month ago

Great, thanks for checking and spotting that! The tests only ended up covering the positive LeakyReLU domain, where the mistake didn't show up. I'll fix the mistake and add a test for the negative case.
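A hypothetical shape for such a test (function and parameter names are illustrative, not flowjax's API), comparing the closed-form log-determinant against autodiff with the activation pushed into the negative domain:

```python
import jax
import jax.numpy as jnp

def test_log_det_negative_leaky_relu_domain():
    negative_slope = 0.01
    w = jnp.array([0.5, -1.0])
    u = jnp.array([0.3, 0.2])
    bias = -5.0  # pushes w @ x + bias well into the negative domain
    x = jnp.array([0.1, 0.2])

    def forward(x):
        act = w @ x + bias
        h = jnp.where(act >= 0, act, negative_slope * act)
        return x + u * h

    # Log-determinant from autodiff of the forward map.
    auto_log_det = jnp.linalg.slogdet(jax.jacfwd(forward)(x))[1]

    # Closed-form log-determinant: log|1 + h'(w @ x + bias) * u @ w|.
    act = w @ x + bias
    h_prime = jnp.where(act >= 0, 1.0, negative_slope)
    manual_log_det = jnp.log(jnp.abs(1 + h_prime * (u @ w)))

    assert jnp.allclose(auto_log_det, manual_log_det)
```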

danielward27 commented 1 month ago

Implemented in https://github.com/danielward27/flowjax/pull/170. Thanks for your help!