chromatix-team / chromatix

Differentiable wave optics using JAX! Documentation can be found at https://chromatix.readthedocs.io

Update propagation with propagator caching #80

Closed diptodip closed 1 year ago

diptodip commented 1 year ago

We can now compute propagators for Fresnel transfer and exact transfer propagation. We've also introduced a new function kernel_propagate, which convolves a given propagation kernel with a Field and handles the padding and cropping appropriately. Internally, the transfer function propagation functions simply compute the appropriate propagation kernel (we call these kernels propagators) and then call kernel_propagate.
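As a rough illustration of this functional flow, here is a minimal sketch (only kernel_propagate and Field are named in this PR; plane_wave, compute_transfer_propagator, and all argument names and signatures below are assumptions and may differ from the actual API):

```python
import chromatix.functional as cf

# Construct an input field (hypothetical constructor and argument names).
field = cf.plane_wave(
    shape=(256, 256),      # spatial grid
    dx=0.3,                # pixel spacing in microns
    spectrum=0.532,        # wavelength in microns
    spectral_density=1.0,
)

# Compute a transfer function propagation kernel (a "propagator") for a given
# propagation distance z and refractive index n (hypothetical function name).
propagator = cf.compute_transfer_propagator(field, z=100.0, n=1.33)

# kernel_propagate convolves the propagator with the Field, handling padding
# and cropping internally, and returns the propagated Field.
propagated = cf.kernel_propagate(field, propagator)
```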

This also means that the Propagate element can now cache the propagation kernel for transfer function propagations, which is the new default behavior. We've also changed Propagate so that it only takes a Field as input rather than a z distance to propagate, which means it can be placed between other optical elements in an OpticalSystem, as intended. Since Propagate can optionally have z and n trainable, we check whether the user wants to train these variables while also caching the propagation kernel, and raise an error in that case: caching a kernel while training z or n doesn't help, because the kernel would have to be recomputed on every propagation anyway. If nothing is cached, Propagate simply recomputes everything each time it is called. A sketch of this intended use follows.
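The sketch below shows Propagate placed between other elements with a cached propagator (OpticalSystem, Propagate, and chromatix.utils.trainable come from the PR text; the other element names and keyword arguments such as cache_propagator are assumptions):

```python
from chromatix import OpticalSystem
from chromatix.elements import PlaneWave, Propagate  # PlaneWave is assumed here
from chromatix.utils import trainable

# Propagate sits between other elements and receives only a Field; z and n are
# fixed here, so the transfer function propagator can be cached (the default).
system = OpticalSystem([
    PlaneWave(shape=(256, 256), dx=0.3, spectrum=0.532, spectral_density=1.0),
    Propagate(z=100.0, n=1.33, cache_propagator=True),
])

# Caching together with a trainable z or n is rejected, since the cached
# kernel would have to be recomputed on every call anyway:
# Propagate(z=trainable(100.0), n=1.33, cache_propagator=True)  # raises an error
```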

Finally, since Propagate now has a lot going on (trainable propagation distance and refractive index, propagator caching), we've added KernelPropagate, which can take either a propagation kernel or a function that initializes one. If the kernel or initialization function is marked as trainable with chromatix.utils.trainable, the propagation kernel can be optimized pixel by pixel. This allows, for example, optimizing a kernel to better match non-ideal propagation in the real world based on camera measurements.
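A minimal sketch of both ways of constructing KernelPropagate (KernelPropagate and chromatix.utils.trainable come from the PR text; the propagator keyword, the kernel shape, and the initializer below are assumptions):

```python
import jax.numpy as jnp
from chromatix.elements import KernelPropagate
from chromatix.utils import trainable

# Option 1: a fixed, precomputed propagation kernel (placeholder values here).
precomputed_kernel = jnp.ones((1, 256, 256, 1), dtype=jnp.complex64)
fixed = KernelPropagate(propagator=precomputed_kernel)

# Option 2: an initialization function wrapped with trainable, so the kernel
# is registered as a parameter and can be optimized pixel by pixel, e.g. to
# better match non-ideal propagation measured on a camera.
def init_kernel(field):
    # Hypothetical initializer: start from an all-ones kernel in frequency
    # space so the initial element applies no phase change.
    return jnp.ones(field.u.shape, dtype=jnp.complex64)

learned = KernelPropagate(propagator=trainable(init_kernel))
```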

We've also updated the CGH demo notebook to show how propagation works. The DMD notebook is updated as well, but since nothing has changed in how the typical functional propagation calls work, nothing major changed there.