
testing data #106

salma1601 opened this issue 6 years ago

salma1601 commented 6 years ago

For the moment, tests run on a smoothed anatomical image and a 4D functional image with a few time points, because we cannot put heavy data on GitHub. I propose to build a dataset of in-house real images of mouse, lemur and rat (we have only one rat), with anatomical and functional images before any processing, after coregistration, and after normalization to a template. This would be very useful for the Travis tests, because we could then check whether the results of an ongoing PR match the expected ones. I propose to share these testing images under a proper licence, so that they can be fetched during the test procedure, roughly as in the sketch below. @nadkarni-na What do you think? Do you have specific preferences for data that should be included?
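A minimal sketch of the fetching part, assuming the images end up hosted somewhere public (the host URL, cache location and checksum values below are placeholders, not an existing service):

```python
import hashlib
import os
import urllib.request

# Hypothetical host for the shared test images; nothing is online yet.
TEST_DATA_URL = "https://example.org/sammba-test-data"
CACHE_DIR = os.path.expanduser("~/.sammba/test_data")


def fetch_test_image(name, sha256):
    """Download one test image (e.g. 'mouse_anat.nii.gz') into a local
    cache and verify its checksum, so Travis always tests against the
    exact same files."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    path = os.path.join(CACHE_DIR, name)
    if not os.path.exists(path):
        urllib.request.urlretrieve("{}/{}".format(TEST_DATA_URL, name), path)
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    if digest != sha256:
        raise IOError("checksum mismatch for {}".format(name))
    return path
```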

nadkarni-na commented 6 years ago

This is how things were done in the bash/R prototype: regularly testing on several dozen real datasets to make sure changes worked. It stopped happening after the move to GitHub because of slow build and test times once Travis and Coveralls were integrated. I don't see what infrastructural changes have been made to solve this problem, but in principle I have always been all for having large test datasets, as it is the only way to test correctly.

As for specific data, there's a lot that needs to be included:

- for anats-to-common specifically: PR rats (will need permission as they are not ours), sawiak Mcb (30), all atlas Mcb (34)
- also BECIM mice (most are good and include anat, perf, rs and diffusion)
- all 11.7T Mcb data is supposed to go online at some point anyway; the young scans of October 2017 would be a good test set

salma1601 commented 6 years ago

I am trying to set up some automatic checks that can be done quickly. For instance, is there a way to output the cost that 3dAllineate tries to minimize?

salma1601 commented 6 years ago

I started looking into the BECIM data, and from the 3dinfo history of the normalized anatomical it looks like you used the brain for the 3dQwarp step. However, I had the following comment in my code:

> the actual T1anat to template registration using the brain extracted image could be done in one 3dQwarp step using allineate flags, but will separate, as 3dAllineate performs well on the brain image and 3dQwarp well on the whole head

So I actually always used the brain for 3dAllineate and the head for 3dQwarp. Looking into different projects on the dev branch, it seems you sometimes chose the head and sometimes the brain. So I am a little lost here ...

nadkarni-na commented 6 years ago

Hmmm, any scripts that use the brain for 3dQwarp are either old or wrong. At some point everything should have been standardised to using the brain with 3dAllineate and the head with 3dQwarp. If you find any files that were not produced this way, they are either old or a mistake.
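As an aside, auditing existing files for this could be automated by grepping the 3dQwarp line out of the AFNI history; a minimal sketch, where matching on the substring "brain" is only a guess at the local file-naming convention:

```python
import subprocess


def qwarp_used_brain(dataset):
    """Read the dataset history with `3dinfo -history` and report
    whether its 3dQwarp command line mentions a brain-extracted input.
    The 'brain' substring test assumes files are named that way."""
    history = subprocess.check_output(["3dinfo", "-history", dataset])
    qwarp_lines = [line for line in history.decode().splitlines()
                   if "3dQwarp" in line]
    return any("brain" in line for line in qwarp_lines)
```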

nadkarni-na commented 6 years ago

> I am trying to set up some automatic checks that can be done quickly. For instance, is there a way to output the cost that 3dAllineate tries to minimize?

`-savehist sss`, `-allcost`, `-allcostX` and `-allcostX1D p q` might be useful.
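For instance, a run with `-allcostX` only evaluates and prints every known cost functional for the given base/source pair and then quits, which could serve as a cheap regression check on an already coregistered output. A minimal sketch, assuming AFNI is on the PATH (the output parsing is approximate and may vary across AFNI versions):

```python
import subprocess


def alignment_costs(base, source):
    """Run 3dAllineate with -allcostX, which evaluates and prints all
    cost functionals for the given pair without registering, then parse
    the printout into a dict. Parsing is approximate: cost lines look
    roughly like '  ls = 0.123456', but the exact format may differ
    across AFNI versions."""
    out = subprocess.check_output(
        ["3dAllineate", "-base", base, "-source", source, "-allcostX"],
        stderr=subprocess.STDOUT).decode()
    costs = {}
    for line in out.splitlines():
        parts = line.split("=")
        if len(parts) == 2 and parts[0].strip().isalnum():
            try:
                costs[parts[0].strip()] = float(parts[1])
            except ValueError:
                pass
    return costs
```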

salma1601 commented 6 years ago

Ok, thanks!

salma1601 commented 6 years ago

Also, I found `3dLocalBistat -nbhd 'SPHERE(0.05)' -stat normuti dataset1 dataset2`, which gives the normalized mutual information between 2 datasets. However, I want to use the same neighbourhood as in the allineate/qwarp steps, and I don't know where to get this info. Any idea?
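As one way to use this in a quick check, a minimal sketch that reduces the local NMI map to a single number (assuming AFNI is on the PATH; the 0.05 sphere radius is just the value quoted above, not necessarily the neighbourhood the allineate/qwarp steps use):

```python
import subprocess


def mean_local_nmi(dataset1, dataset2, prefix="nmi_map.nii.gz"):
    """Compute the local normalized mutual information map with
    3dLocalBistat, then reduce it to one mean value with 3dBrickStat
    so it can be compared against a stored threshold in a test."""
    subprocess.check_call(
        ["3dLocalBistat", "-nbhd", "SPHERE(0.05)", "-stat", "normuti",
         "-prefix", prefix, dataset1, dataset2])
    mean = subprocess.check_output(["3dBrickStat", "-mean", prefix])
    return float(mean)
```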

salma1601 commented 6 years ago

I am testing on mouse data, and I noticed that for 3dQwarp the weight file is never used. Is this done on purpose and specific to mice?

nadkarni-na commented 6 years ago

Sorry for the delay, I had an internet outage yesterday.

If it's not used, it's probably a mistake in that script.
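For reference, the fix would be to pass the weight file through to 3dQwarp's `-weight` option (which replaces the weight 3dQwarp would otherwise compute from the base); a minimal sketch with hypothetical file names:

```python
import subprocess

# File names here are hypothetical stand-ins for the mouse pipeline.
subprocess.check_call(
    ["3dQwarp",
     "-base", "template_head.nii.gz",   # template head
     "-source", "mouse_head.nii.gz",    # subject head
     "-weight", "weight_mask.nii.gz",   # the weight file in question
     "-prefix", "mouse_head_warped"])
```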