Closed · weiyaw closed 1 month ago
Thanks! I've implemented this and made a pull request: https://github.com/danielward27/flowjax/pull/170. It would be great if you could give it a quick look over if you have the time. I do wonder whether a more general implementation is possible that supports other (non-transcendental) activation functions, but I think what I have is probably fine unless you have any suggestions.
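For anyone landing here later, a minimal self-contained sketch of the idea, assuming the standard planar map `y = x + u * h(w.x + b)` with a LeakyReLU `h` (function names and signatures here are illustrative, not the PR's actual API):

```python
import jax.numpy as jnp

def planar_leaky_relu_forward(x, u, w, b, alpha=0.01):
    # y = x + u * leaky_relu(w.x + b). Invertibility assumes
    # 1 + u.w > 0 and 1 + alpha * u.w > 0.
    a = jnp.dot(w, x) + b
    return x + u * jnp.where(a >= 0, a, alpha * a)

def planar_leaky_relu_inverse(y, u, w, b, alpha=0.01):
    # With a = w.x + b and c = w.y + b, the forward map gives
    # c = a * (1 + u.w) on a >= 0 and c = a * (1 + alpha * u.w)
    # on a < 0. Under the invertibility assumptions both factors
    # are positive, so sign(c) = sign(a) and we can pick the
    # branch from the sign of c alone.
    uw = jnp.dot(u, w)
    c = jnp.dot(w, y) + b
    a = jnp.where(c >= 0, c / (1 + uw), c / (1 + alpha * uw))
    return y - u * jnp.where(a >= 0, a, alpha * a)
```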
Thank you for the very quick response and implementation! I haven't had a chance to run it yet, but I have done the math, and everything agrees with my calculations except the log-determinant. Could you please double-check?
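For reference, the quantity in question, from the standard planar-flow algebra (via the matrix determinant lemma, not a quote from the PR):

```math
f(x) = x + u\,h(w^\top x + b), \qquad
\log\left|\det \frac{\partial f}{\partial x}\right|
  = \log\left|1 + h'(w^\top x + b)\,u^\top w\right|,
```

and for LeakyReLU, $h'(z) = 1$ for $z \ge 0$ but $h'(z) = \alpha$ for $z < 0$, so the log-determinant must switch between $\log\left|1 + u^\top w\right|$ and $\log\left|1 + \alpha\,u^\top w\right|$ depending on the sign of $w^\top x + b$.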
Great, thanks for checking and spotting that! The tests only exercised the positive LeakyReLU domain, where the mistake didn't show. I'll fix the mistake and add a test for the negative case.
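A sketch of such a test, written against the standalone `planar_leaky_relu_forward` sketch above rather than the PR's actual code (parameters are hypothetical, chosen so that $w^\top x + b < 0$):

```python
import jax
import jax.numpy as jnp

def test_negative_domain_logdet():
    # Hypothetical parameters chosen so w.x + b < 0, exercising the
    # alpha branch of the LeakyReLU.
    u = jnp.array([0.5, -0.2])
    w = jnp.array([0.3, 0.4])
    b, alpha = -2.0, 0.01
    x = jnp.array([0.1, 0.2])  # w.x + b = -1.89 < 0

    # Analytic log-det: log|1 + h'(w.x + b) * u.w|, with h' = alpha here.
    analytic = jnp.log(jnp.abs(1 + alpha * jnp.dot(u, w)))

    # Compare against autodiff through the forward map.
    jac = jax.jacobian(planar_leaky_relu_forward)(x, u, w, b, alpha)
    auto = jnp.log(jnp.abs(jnp.linalg.det(jac)))
    assert jnp.allclose(analytic, auto)
```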
Implemented in https://github.com/danielward27/flowjax/pull/170. Thanks for your help!
Thanks for creating flowjax! This is a godsend for my research.
I'm wondering if there is a plan to implement `inverse` in `Planar` in the near future? I understand that there's no closed-form equation when using the tanh activation. Perhaps we can have a version with LeakyReLU, e.g. https://github.com/VincentStimper/normalizing-flows/blob/master/normflows/flows/planar.py ?
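For context on why LeakyReLU helps where tanh doesn't, here is the standard planar-flow algebra (my own summary, not taken from the linked repo). Projecting the planar map onto $w$ gives

```math
y = x + u\,h(w^\top x + b)
\;\Rightarrow\;
w^\top y + b = a + (u^\top w)\,h(a), \qquad a := w^\top x + b.
```

For $h = \tanh$ this scalar equation is transcendental in $a$, so there is no closed-form inverse; for a piecewise-linear $h$ like LeakyReLU it is linear on each branch, so $a$ (and hence $x = y - u\,h(a)$) can be solved for exactly.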