takluyver opened this issue 4 years ago
That's a very good idea. A threshold depends on the typical intensity distribution for images. Do you know roughly what an intensity PDF would look like? The 90th percentile is always a good starting point. The question here is whether you want to calculate this per image or over a larger number, say a train.
From some quick experiments on my laptop, anything based on percentiles or medians will have to be on a single image (or a roughly similar amount of data), otherwise it's too slow.
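A minimal sketch of the kind of quick experiment described above, assuming one detector frame held as a 2D NumPy array (the frame shape and synthetic exponential intensities are stand-ins, not real detector data):

```python
import numpy as np

# Stand-in for a single detector frame: ~65k pixels of
# exponentially distributed intensities (purely illustrative).
rng = np.random.default_rng(0)
frame = rng.exponential(scale=150.0, size=(512, 128))

# A percentile over one frame like this is cheap; doing the same
# over many frames at once is where it becomes too slow to do
# interactively.
vmax = np.percentile(frame, 90)
print(vmax)
```

The point is only that `np.percentile` on a single frame's worth of pixels is fast enough for interactive use.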
The test data in this repository is an easy starting point - though there may well have been changes to the detector calibration since then. Here's the distribution for AGIPD (note log-scale for y):
Here's the 0-20k section of that:
AGIPD median: 149; 90th percentile: 483; 95th: 638; 99th: 1139
And here's LPD, which doesn't seem to have the same extreme outliers:
LPD median: 889; 90th percentile: 1580; 95th: 1677; 99th: 1819
The GUI was previously setting the colour scale for the image view based on the maximum pixel value - which is typically some massive value caused by a strange error, resulting in the real image being squashed into darkness. I hardcoded a value of 10k to improve on this, which worked for some data but is still far too high for other examples - and could well be too low for yet others.
Applying the mask from the calibrated data files might help with this. But it might still be necessary to do something like scaling to the 90th percentile, or 10x the median, in case the mask misses something.
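A hedged sketch of how the heuristics above could combine, assuming a boolean mask where `True` marks bad pixels (the function name, `cap` default, and mask convention are all hypothetical, not anything in the codebase):

```python
import numpy as np

def colour_vmax(frame, mask=None, cap=10_000):
    """Hypothetical sketch: pick a colour-scale maximum for one frame.

    Takes the larger of the 90th percentile and 10x the median (so
    the scale survives cases where the mask misses an outlier), after
    dropping masked pixels; `cap` keeps the old hard ceiling as a
    last-resort bound.
    """
    data = np.asarray(frame, dtype=float)
    if mask is not None:
        data = data[~np.asarray(mask, dtype=bool)]  # keep unmasked pixels only
    p90 = np.percentile(data, 90)
    vmax = max(p90, 10 * np.median(data))
    return min(vmax, cap)
```

For example, on a frame of values 0..99 the 90th percentile is 89.1 and 10x the median is 495, so the function would return 495.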