Closed: xingyu-liu closed this issue 4 years ago.
Hey Xingyu,
so far I could not reproduce the error. The script runs fine on my machine. I assume it has something to do with the mris_make_surfaces call itself. Could you go into the recon_surf directory and rerun it directly,
export SUBJECTS_DIR=/nfs/s2/userhome/liuxingyu/workingdir/temp/HCP-D/anat
./mris_make_surfaces -aseg ../mri/aseg.presurf -white white.preaparc -noaparc -whiteonly -mgz -T1 brain.finalsurfs HCD0001305_V1_MR lh
to see if this problem is consistent?
Thanks, Leonie
Hi Leonie, thanks for the really quick reply! I followed your instructions and reran the last command in the recon_surf directory; however, the same error showed up.
writing white matter surface to /nfs/s2/userhome/liuxingyu/workingdir/temp/HCP-D/anat/HCD0001305_V1_MR/surf/lh.white.preaparc...
generating cortex label...
1 non-cortical segments detected
only using segment with 166858 vertices
LabelErode: NULL label
No such file or directory
LabelDilate: NULL label
No such file or directory
Segmentation fault (core dumped)
For the full log, please see the attached recon-surf-new.log.
Thanks!
It could be a few other things:
Is this happening for other subjects also or only for this one?
If this happens for other cases as well, you could also try building the Docker image and running that (see the sketch below).
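For reference, a rough sketch of building the image; the image tag, the Dockerfile location, and the use of nohup are illustrative assumptions only, so please check the Docker instructions shipped with the FastSurfer repository for the exact command:
# build from the FastSurfer checkout; the Dockerfile path below is a placeholder
cd /path/to/FastSurfer
nohup docker build -t fastsurfer:test -f ./Docker/Dockerfile . > docker_build.log 2>&1 &
# nohup keeps the (long) build running even if the SSH session drops
tail -f docker_build.log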
Hi,
I launched a test subject, and I have the same problems Xingyu is reporting. I also ran the mris_make_surfaces command and found the same errors.
I just checked, and my disk is not full. I'm not sure if there are earlier errors; the recon-surf and recon-all logs don't report any problems.
Thank you for your help.
On what OS did you run this? Maybe try Docker. We haven't seen this on any of our CentOS 7 servers.
Hi Martin,
I'm using Debian testing. Maybe that could be the problem.
I'm building the docker image now. I'll let you know if that works.
Thanks.
Could you check whether the surfaces (the white.preaparc and the orig) and the segmentation look OK? And if you happen to have FreeSurfer surfaces for the test subjects, could you replace the FastSurfer ?h.orig with the FreeSurfer ?h.orig and run the command again (just the mris_make_surfaces step, with all the other files from FastSurfer; see the sketch below)? That might help to find out whether this is a problem with the surface...
Thank you
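For illustration, a minimal sketch of that swap, reusing the paths from the mris_make_surfaces call earlier in this thread; the FreeSurfer output path, the backup name, and the FastSurfer checkout location are placeholders:
# back up the FastSurfer surface and drop in the FreeSurfer one
cd /nfs/s2/userhome/liuxingyu/workingdir/temp/HCP-D/anat/HCD0001305_V1_MR/surf
cp lh.orig lh.orig.fastsurfer.bak
cp /path/to/freesurfer/HCD0001305_V1_MR/surf/lh.orig lh.orig
# rerun only the failing step from the FastSurfer recon_surf directory, as before
cd /path/to/FastSurfer/recon_surf
export SUBJECTS_DIR=/nfs/s2/userhome/liuxingyu/workingdir/temp/HCP-D/anat
./mris_make_surfaces -aseg ../mri/aseg.presurf -white white.preaparc -noaparc -whiteonly -mgz -T1 brain.finalsurfs HCD0001305_V1_MR lh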
Hi Leonie,
The surface files look fine. I also tried replacing the FastSurfer ?h.orig with the ones I had from FreeSurfer, but I get the same error: it generates the ?h.white.preaparc but fails with the NULL in LabelErode, as in the log Xingyu shared. Maybe I'm missing some other files?
This is the list of files I have in the mri folder, not sure if it can be useful for you:
aparc+aseg.orig.mgz aseg.auto.mgz aseg.auto_noCCseg.mgz aseg.presurf.mgz brain.finalsurfs.mgz brain.mgz brainmask.mgz filled-pretess127.mgz filled-pretess255.mgz filled.mgz mask.mgz mri_nu_correct.mni.log mri_nu_correct.mni.log.bak norm.mgz nu.mgz orig/ orig.mgz orig_nu.mgz rawavg.mgz segment.dat tmp/ transforms/ wm.asegedit.mgz wm.mgz wm.seg.mgz
Hi Leonie and Martin, I'm using CentOS 7 as well and I have enough space on my disk. I also tried another subject, and it ended up with the same error. I checked the white.preaparc and the orig file; they look good, nothing unusual. I also replaced the lh.orig with the FreeSurfer output, but the error was still there. As for the mri files, I got exactly the same as Alberto, except there was no filled-pretess127.mgz.
Hey you two,
thanks a lot for your help. So it does not seem to be a problem with the surfaces then. That is really odd... I am a bit at a loss as to why this is happening, because it works on my Ubuntu as well as my CentOS machine without a problem. There are also no files missing in the mri folder (filled-pretess127.mgz is not present if you run in sequential mode, and the script already crashes for the first hemisphere). Alberto, do you also get the "1 non-cortical segments detected" message in the recon-surf.log, and did the Docker solution work for you?
I also wonder if it has to do with those locale warnings. I don't get those here (I also just ran a case and it worked). And for these "No such file or directory" messages, it would be nice if mris_make_surfaces told us which file it is missing. I will try a machine in the US with different locale settings next.
Is there a way you can share your data, for us to see if we can replicate it? I bet it will run here though.
No, I do not think this is it. On my Ubuntu machine the locale is also not set (I get the same warnings) and it still works.
Just in case, running "locale" returns this for me:
LANG=en_US.UTF-8
LANGUAGE=
LC_CTYPE="en_US.UTF-8"
LC_NUMERIC=de_DE.UTF-8
LC_TIME=de_DE.UTF-8
LC_COLLATE="en_US.UTF-8"
LC_MONETARY=de_DE.UTF-8
LC_MESSAGES="en_US.UTF-8"
LC_PAPER=de_DE.UTF-8
LC_NAME=de_DE.UTF-8
LC_ADDRESS=de_DE.UTF-8
LC_TELEPHONE=de_DE.UTF-8
LC_MEASUREMENT=de_DE.UTF-8
LC_IDENTIFICATION=de_DE.UTF-8
LC_ALL=
And just running ./mris_make_surfaces shows the locale errors (same in recon-surf.log if I run the entire subject):
FastSurfer/recon_surf$ ./mris_make_surfaces
Could not set locale
No such file or directory
Could not set locale
No such file or directory
Could not set locale
...
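If anyone wants to rule the locale out explicitly, one simple test is to force a UTF-8 locale for a single run and see whether the warnings and the crash change; the locale value is just an example (any UTF-8 locale installed on the machine should do), and the command reuses the example from earlier in the thread:
# force a consistent locale for this shell only
export LANG=en_US.UTF-8
export LC_ALL=en_US.UTF-8
./mris_make_surfaces -aseg ../mri/aseg.presurf -white white.preaparc -noaparc -whiteonly -mgz -T1 brain.finalsurfs HCD0001305_V1_MR lh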
xingyu-liu and albertofpena, can you post or send the exact command that you were using? Oh, and are you running the GPU or CPU version? You can tell by how long the first segmentation step takes: if it takes more than 1 minute (more like around 10 minutes), it is the CPU version.
And xingyu-liu, can you try passing this path /nfs/e2/workingshop/liuxingyu/temp/HCP-D/anat/ in your call everywhere the ...workingdir... path is used? For some reason the scripts switch to that one (maybe to resolve a symbolic link), and using two different paths that point to the same directory could mess things up somewhere.
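To find the fully resolved path (with symbolic links expanded) that can then be used consistently everywhere, something like this should work on Linux:
# print the physical path with all symlinks resolved
readlink -f /nfs/s2/userhome/liuxingyu/workingdir/temp/HCP-D/anat
# or, equivalently
cd /nfs/s2/userhome/liuxingyu/workingdir/temp/HCP-D/anat && pwd -P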
Hi, the exact command I'm running is:
export FREESURFER_HOME=/usr/local/neurosoft/freesurfer6.0.1
source $FREESURFER_HOME/SetUpFreeSurfer.sh
fs_license="$FREESURFER_HOME/license.txt"
data_dir=/nfs/e2/workingshop/liuxingyu/temp/HCP-D
fastsurfer_dir=/nfs/e2/workingshop/liuxingyu/temp/HCP-D/anat
subid="sub-02"
t1w_path="${data_dir}/${subid}/anat/${subid}_T1w.nii.gz"
./run_fastsurfer.sh \
--fs_license "$fs_license" \
--seg "${fastsurfer_dir}/${subid}/aparc.DKTatlas+aseg.deep.mgz" \
--t1 "$t1w_path" \
--sid "$subid" --sd "$fastsurfer_dir" \
--mc --qspec --nofsaparc --threads 4
And as you suggested, I passed the source path instead of the target path of the symbolic link, but the same error was still there.
I'm running the GPU version.
Here's the data I use; you can download it from the links below.
Raw T1w: https://datapub.fz-juelich.de/studyforrest/studyforrest/structural/sub-02/anat/sub-02_T1w.nii.gz
FreeSurfer: https://datapub.fz-juelich.de/studyforrest/studyforrest/freesurfer/sub-02/
Hi,
I checked the recon-surf.log and I also have the "1 non-cortical segments detected" message.
This was my command:
fastsurferdir=/media/DATOS/afernandez/Proyectos/test_fastsurfer/
nohup ./run_fastsurfer.sh --fs_license $FREESURFER_HOME/license.txt \
--t1 /media/DATOS/afernandez/Proyectos/UKB/1047092/T1_orig_defaced.nii.gz \
--seg $fastsurferdir/1047092/aparc.DKTatlas+aseg.deep.mgz \
--sid 1047092 --sd $fastsurferdir --batch 8 \
--mc --qspec --parallel --threads 4 > test_fastsurfer.log &
The execution of the command reports that it is using CUDA.
I haven't been able to run the Docker image yet: building it took a long time at the FreeSurfer extraction step and I lost the SSH connection to the machine. I'm running it again using nohup to prevent this.
Also, these are the versions of the libraries I'm using:
h5py==2.10.0 nibabel==3.1.0 numpy==1.18.5 scikit-image==0.16.2 scipy==1.5.0 torchvision==0.4.2 pytorch==1.3.1
I see in the requirements file that my versions are newer. If you think this could be the problem, I can try with the recommended ones (e.g. by reinstalling the pinned versions, as sketched below).
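For reference, a sketch of reverting to the pinned versions; this assumes the requirements file is the requirements.txt in the FastSurfer checkout and that the install goes into the same Python environment FastSurfer runs in:
# reinstall the pinned dependency versions from the FastSurfer checkout
pip install -r requirements.txt
# or downgrade only nibabel to the version listed there
pip install "nibabel==<version from requirements.txt>"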
Hey,
so the subject Xingyu provided runs through without an error on my machine. However, for me mris_make_surfaces reports "8 non-cortical segments detected", not 1. I think this is the problem. The question is why this is happening on your machines. Hopefully, the Docker image will work for you as well.
@xingyu-liu could you maybe share the FastSurfer output you got for the subject with me (as a zip-file if possible)? Then I can check if there are major differences in the produced files.
Thanks a lot.
Best, Leonie
Ok, good news! I was able to reproduce the error by creating a Docker image with the library versions @albertofpena posted above. My guess is that nibabel is the problem. I will run some more tests and let you know what comes up.
Hi Leonie,
I reverted nibabel to the recommended version, and the software is still running, but it has now generated the ?h.curv and area files (among others that were missing in the previous attempts). It seems that was the problem.
Sorry for the inconvenience, and thanks for your help!
Great! Happy to know that it is working for you now! :) And thank you for pointing us in the right direction. Your input was really helpful.
Hi! I replicated Alberto's steps and got the same results. Problem solved :) Thanks for all your attention and help!
Glad everything is working smoothly now :)! I also figured out the underlying problem, which was the way aseg.auto_noCCseg.mgz and mask.mgz were saved with nibabel. I changed that routine with the newest commit, so FastSurfer now also works with newer versions of nibabel (see the note below). I am therefore closing this issue now.
Thank you for your help and happy processing :+1:
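For anyone who lands here later, a hedged note: pulling the commit mentioned above should be enough, and mri_info (part of FreeSurfer) can serve as a quick sanity check that the affected volumes are written with the expected data type; the paths and the grep pattern below are placeholders:
# update to the commit containing the fixed nibabel save routine
cd /path/to/FastSurfer && git pull
# sanity check on an affected subject: inspect how the intermediate volumes were written
cd /path/to/subjects_dir/<subject>/mri
mri_info aseg.auto_noCCseg.mgz | grep -i type
mri_info mask.mgz | grep -i type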
Hi! Thank you so much for developing and sharing FastSurfer. We were so excited and tried it out right away. However, the program exited with an error while running the command 'mris_make_surfaces'. lh.white.preaparc was successfully produced, while lh.curv, lh.area, and lh.cortex.label failed.
My command:
And the error is as follows (for the full log, please see the attached recon-surf.log):
Could you help us fix it? Thanks! Best, Xingyu