WCHN / CTseg

Brain CT image segmentation, normalisation, skull-stripping and total brain/intracranial volume computation.
GNU General Public License v3.0

Error in running CT-SEG (Docker Image) #17

Open arturjpaulo opened 2 years ago

arturjpaulo commented 2 years ago

I have been facing a problem when running CTseg from the Docker image: the 'temp' files are created, but the process stops quickly and the segmentation files are never generated.

For example, when I enter:

docker run --rm -it -v dir_host:/data ubuntu:ctseg eval "spm_CTseg('/data/CT.nii','ct_result',true,true,true,true,1,2,0.0005)"

I realized that the process stops when it reaches 15.5 GB of memory. Do you know if there is a way to limit or parallelize this process within the Dockerfile, so that it does not stop when it exhausts the available RAM?
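(For context: Docker's documented --memory flag can hard-cap a container's RAM, e.g.:

docker run --rm -it --memory=12g -v dir_host:/data ubuntu:ctseg eval "spm_CTseg('/data/CT.nii','ct_result',true,true,true,true,1,2,0.0005)"

though a hard cap only makes the kernel kill the process when the limit is hit, rather than letting it finish. It does not reduce what the algorithm actually needs.)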


brudfors commented 2 years ago

Hello @arturjpaulo

I suspect you are giving CTseg a large image, which means that RAM usage will be high. Unfortunately, there is no way to decrease memory use at the level of calling the algorithm (i.e., your docker run command). If you cannot increase the RAM, you could have a look at the utility function:

https://github.com/WCHN/CTseg/blob/08161dc2d1dbfab8c4963808ca1626b4e6d77ccf/spm_CTseg_util.m#L14

It allows you to downsample an image without breaking the affine matrix in the NIfTI header. For example, you could try setting the voxel size to 1 mm isotropic.
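For reference, a minimal MATLAB sketch of how that could look, run before calling spm_CTseg. The subfunction name 'downsample' and its argument order are assumptions here, not the confirmed interface; check spm_CTseg_util.m in your checkout for the actual call:

% Hypothetical invocation -- verify the subfunction name and argument
% order in spm_CTseg_util.m before running.
pth = spm_CTseg_util('downsample', '/data/CT.nii', 1);  % resample to 1 mm isotropic voxels
spm_CTseg(pth, 'ct_result', true, true, true, true, 1, 2, 0.0005);

Running the segmentation on the downsampled copy should lower peak RAM roughly in proportion to the reduction in voxel count.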