I think auto-reject is doing a good job, but for some images it's not clear to me (simply from visual inspection) why they are being rejected; for example, the Xs in this set.
Maybe the low SDs are driven by values outside the brain? Maybe we need to apply a grey-matter mask and compute the stats within it.
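A minimal sketch of that idea, assuming nilearn is available and substituting its MNI152 brain mask for a real grey-matter mask (the function name and details are illustrative, not what's in the PR):

```python
# Sketch: compute the summary stats only inside a brain mask, so that
# zero/background voxels outside the brain don't drag the std down.
# Uses nilearn's MNI152 brain mask as a stand-in for a grey-matter mask.
import numpy as np
from nilearn.datasets import load_mni152_brain_mask
from nilearn.image import resample_to_img
from nilearn.masking import apply_mask

def masked_stats(img_path):
    mask = resample_to_img(load_mni152_brain_mask(), img_path,
                           interpolation='nearest')
    vals = apply_mask(img_path, mask)      # 1D array of in-mask voxel values
    return vals.std(), np.unique(vals).size
```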
Actually, I think it may be a normalization issue. The images may have very different scales.
I found one unchecked parcellated image here (13914). We can try increasing the unique-value threshold.
@atsuch good point on the normalization. I do remove pixels with a value of zero, but I don't normalize. I'll think about that.
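For what it's worth, a rough sketch of what normalizing could look like, given that zero-valued voxels are already dropped (the function name and the scaling choice are mine, not the PR's):

```python
# Sketch: rescale the nonzero voxels before computing std, so that images
# stored on very different intensity scales become comparable.
import numpy as np
import nibabel as nib

def normalized_std(img_path):
    data = nib.load(img_path).get_fdata()
    vals = data[data != 0]          # zero-valued voxels are already excluded
    if vals.size == 0:
        return 0.0
    scale = np.abs(vals).max()      # one simple choice; z-scoring would also work
    return (vals / scale).std() if scale > 0 else 0.0
```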
Re: image 13914: it's not in the list of images checked by the algorithm on my machine, weird:
013901 ( KEEP ): 2.0789 std, 39158 unique values
013915 ( KEEP ): 0.8309 std, 163925 unique values
013920 (REJECT): 0.0000 std, 1 unique values
I will look into that too.
Ah, I had fixed the use of "bad_collects". If I remove that and try to reject based on the data itself, I get:
013914 ( KEEP ): 389.3619 std, 1330 unique values
This builds on #39 and tries to address #40. It uses basic summary stats (std, number of unique values) to try to detect undesirable images.
Currently the code doesn't actually reject any images; instead, it prints the stats and the would-be decision to the console, and marks flagged images with an "X" in the title of the QC output images.
We should review those images and adjust the parameters as desired; once we find good parameters, we can do the rejection in the code.
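To make that concrete, here is a rough sketch of the flag-but-don't-reject flow; the threshold values and names are placeholder assumptions, not the parameters used in this PR:

```python
# Sketch: compute the stats, print them with the would-be decision, and
# return a flag the plotting code can use to put an "X" in the QC title.
# Thresholds are placeholders, not the values used in the PR.
import numpy as np
import nibabel as nib

MIN_STD = 0.01           # near-constant images get rejected
MIN_UNIQUE_VALUES = 100  # parcellations / label maps have few unique values

def flag_image(image_id, img_path):
    data = nib.load(img_path).get_fdata()
    vals = data[data != 0]                   # zero voxels are already excluded
    std = vals.std() if vals.size else 0.0
    n_unique = np.unique(vals).size
    reject = std < MIN_STD or n_unique < MIN_UNIQUE_VALUES
    print('%06d (%s): %.4f std, %d unique values'
          % (image_id, 'REJECT' if reject else ' KEEP ', std, n_unique))
    return reject    # caller adds an "X" to the QC plot title if True
```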