LouisDesdoigts / dLux

Differentiable optical models as parameterised neural networks in Jax using Zodiax
https://louisdesdoigts.github.io/dLux/
BSD 3-Clause "New" or "Revised" License

Jitter Classes (#249) #253

Open maxecharles opened 9 months ago

maxecharles commented 9 months ago

PR to track changes.

Adding the ApplyAsymmetricJitter class I wrote for TOLIMAN, parametrised by r, phi, and shear. I basically completely rewrote the generate_kernel method in a way that seems a bit more intuitive to me:

    # assumes: import jax.numpy as np
    #          from jax.scipy.stats import multivariate_normal
    def generate_kernel(self, pixel_scale: float) -> Array:

        ...

        # Build the coordinate grid for the kernel
        extent = pixel_scale * self.kernel_size  # kernel extent in arcseconds
        x = np.linspace(0, extent, self.kernel_size) - 0.5 * extent
        xs, ys = np.meshgrid(x, x)
        pos = np.dstack((xs, ys))

        # Evaluate a zero-mean 2D Gaussian with the class's covariance matrix
        kernel = multivariate_normal.pdf(
            pos, mean=np.array([0.0, 0.0]), cov=self.covariance_matrix
        )

        # Normalise so the kernel conserves flux
        return kernel / np.sum(kernel)

I also reworked the original ApplyJitter detector class to generate its kernel this same way.
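For context, here's a minimal standalone sketch of how an (r, phi, shear) parametrisation can map to a covariance matrix that then feeds the kernel generation above. The function and parameter names, and the exact covariance construction (stretch along the principal axes by the shear, then rotate by phi), are assumptions for illustration, not the PR's actual formula:

```python
import jax.numpy as np
from jax.scipy.stats import multivariate_normal


def covariance_from_r_phi_shear(r: float, phi: float, shear: float):
    """Hypothetical (r, phi, shear) -> 2x2 covariance matrix.

    r sets the overall scale, shear (|shear| < 1) sets the asymmetry
    between the principal axes, and phi rotates the distribution.
    """
    # Axis-aligned covariance, stretched/squeezed by the shear
    sigma = np.array([[r * (1.0 + shear), 0.0],
                      [0.0, r * (1.0 - shear)]])
    # Rotation by phi
    rot = np.array([[np.cos(phi), -np.sin(phi)],
                    [np.sin(phi), np.cos(phi)]])
    return rot @ sigma @ rot.T


def generate_kernel(covariance, kernel_size: int, pixel_scale: float):
    """Normalised Gaussian jitter kernel, mirroring the method above."""
    extent = pixel_scale * kernel_size
    x = np.linspace(0, extent, kernel_size) - 0.5 * extent
    xs, ys = np.meshgrid(x, x)
    pos = np.dstack((xs, ys))
    kernel = multivariate_normal.pdf(
        pos, mean=np.zeros(2), cov=covariance
    )
    return kernel / np.sum(kernel)
```

With a nonzero phi and shear, the kernel comes out as a tilted elliptical Gaussian, which is the asymmetric jitter behaviour described above.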

surely i become a real dLux contributor SURELY

maxecharles commented 9 months ago

also i have NOT tested the new ApplyJitter

maxecharles commented 9 months ago

Spoke with @LouisDesdoigts and we (he) decided that the new ApplyAsymmetricJitter class made the original ApplyJitter class entirely redundant, so ApplyAsymmetricJitter has now replaced ApplyJitter and taken its name. So there's no ApplyAsymmetricJitter anymore, but the new ApplyJitter is what ApplyAsymmetricJitter was, if you get what I'm saying.

benjaminpope commented 9 months ago

Could even just rename the asymmetric jitter class to the new ApplyJitter...

LouisDesdoigts commented 9 months ago

Yeah @maxecharles made a typo in his comment (GOT HIM), it's just going to be called ApplyJitter