From a debugging session, it appears that maximum_pixel_method_variance is called for every source that is measured using moments only, i.e. where a Gaussian fit is not possible (e.g. because too few pixels lie above the analysis threshold) or the fit fails.
This looks like a considerable waste of CPU time, since the function evaluates a 2D integral on every call.
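If the integral's inputs are constant per image (for instance, if they depend only on the restoring-beam shape), one straightforward mitigation would be to memoize the result so the 2D integral is evaluated at most once per distinct set of arguments. Below is a minimal sketch of that idea; the argument names and the integrand are placeholders, not the actual expression used by maximum_pixel_method_variance:

```python
import functools

import numpy as np
from scipy import integrate


@functools.lru_cache(maxsize=None)
def maximum_pixel_method_variance_cached(semimajor, semiminor, theta):
    """Cached stand-in for the 2D-integral variance computation.

    Assumption: the integral depends only on quantities that are
    constant per image (here, a hypothetical beam shape), so repeated
    calls for each moments-only source hit the cache instead of
    re-running scipy.integrate.dblquad.
    """
    # Placeholder integrand; the real expression would go here.
    def integrand(y, x):
        return np.exp(
            -np.log(2.0)
            * (((x * np.cos(theta) + y * np.sin(theta)) / semiminor) ** 2
               + ((x * np.sin(theta) - y * np.cos(theta)) / semimajor) ** 2)
        )

    value, _abserr = integrate.dblquad(integrand, -0.5, 0.5, -0.5, 0.5)
    return value
```

With this wrapper, only the first moments-only source in an image pays the cost of the quadrature; all subsequent calls with the same beam parameters are dictionary lookups. Note that lru_cache requires hashable (and exactly equal) arguments, so the beam parameters must not be recomputed with floating-point jitter between calls.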