Center-of-Imaging-Biomarker-Development / chp_seg

Segmentation tool for the delineation and volumetric quantification of the choroid plexus

serialize_process no such file or directory #2


christophergalli commented 4 months ago

Dear Kilian,

I was trying to use your image on our linux machines. Via Singularity

Once I downloaded kilianhett/chp_seg:1.0.0 and tried to run it, it raised a FATAL error:

command: singularity run chp_seg_1.0.0.sif
FATAL: stat /user/chris/docker_images/aschoplex/serialize_process: no such file or directory

I saw that the Docker container defines serialize_process as its entrypoint, yet the file does not seem to be available. Am I missing something?

Best wishes, Chris

hettk commented 4 months ago

Dear Chris,

Thank you for using our software.

serialize_process is the default entrypoint and you don't need to specify it in the docker command.

Below is a minimal example of the command you need (replace <input_folder> and <output_folder> with the absolute paths to the folders where the tool should read the image and write the results):

sudo docker run -v <input_folder>:/data/in -v <output_folder>:/data/out kilianhett/chp_seg:1.0.0 --sequence_type T1

Best, Kilian

christophergalli commented 4 months ago

Dear Kilian,

Thank you very much for your response. Is sudo required? Our system does not grant users sudo permission. I was hoping to build the Docker image in the Singularity environment, which gives me a .sif file that can then be used, since our system does not allow Docker either.

Might there be an issue there? If you are not aware of one but are sure it runs on Docker, I can ask our internal IT administrators to look into it. I do not want to bother you too much with this; I just hoped you might know a solution.

Best wishes, Chris

hettk commented 4 months ago

Dear Chris,

Unfortunately, Docker requires sudo permission to run on Linux systems.

However, you can ask your IT administrators to add you to the docker group (/etc/group). This would allow you to run Docker without sudo.
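For reference, a minimal sketch of how an administrator typically does this on a standard Linux setup (the username chris is a placeholder; both commands must be run as root):

```shell
# Create the docker group if it does not already exist
groupadd --force docker

# Add the user to the docker group; membership takes effect at next login
usermod -aG docker chris
```

After logging back in, the user can run docker commands directly without sudo.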

I hope this helps.

Kind regards, Kilian

dngreve commented 3 months ago

Hi, I'm trying to run this in singularity, but I'm getting the same error that Chris got above:

ls input/
fsm010.nii.gz

singularity run -e -B `pwd`/input:/data/in -B `pwd`/output:/data/out chp_seg_1.0.0.sif --sequence_type T1
FATAL: stat /homes/4/greve/serialize_process: no such file or directory

Can you tell me how to run this using Singularity? Thanks, Doug

pwighton commented 3 months ago

I got a little further using the following singularity command:

singularity exec \
  --cleanenv \
  --fakeroot \
  --env LD_LIBRARY_PATH=/opt/ANTs/lib: \
  --env PATH=/opt/ANTs/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin \
  -B /path/to/data/input:/data/in \
  -B /path/to/data/output:/data/out \
    chp_seg-1.0.0.sif \
      /app/serialize_process --sequence_type T1

But this generates the following error:

INFO:    Converting SIF file to temporary sandbox...
Number of data to process = 31
['fsm015.norev.mgz', 'fsm023.norev.mgz', 'fsm026.norev.mgz', 'fsm031.norev.mgz', 'fsm033.norev.mgz', 'fsm027.norev.mgz', 'fsm032.norev.mgz', 'fsm013.norev.mgz', 'fsm008.norev.mgz', 'fsm012.norev.mgz', 'fsm034.norev.mgz', 'fsm021.norev.mgz', 'fsm036.norev.mgz', 'fsm016.norev.mgz', 'fsm025.norev.mgz', 'fsm018.norev.mgz', 'fsm022.norev.mgz', 'fsm006.norev.mgz', 'fsm009.norev.mgz', 'fsm011.norev.mgz', 'fsm020.norev.mgz', 'fsm029.norev.mgz', 'fsm007.norev.mgz', 'fsm014.norev.mgz', 'fsm017.norev.mgz', 'fsm038.norev.mgz', 'fsm028.norev.mgz', 'fsm010.norev.mgz', 'fsm039.norev.mgz', 'fsm037.norev.mgz', 'fsm019.norev.mgz']
fsm015_norev
All_Command_lines_OK
Using double precision for computations.
  number of levels = 3
  fixed image: Data/mni_icbm152_t1_tal_nlin_sym_09a.nii
  moving image: /data/out/fsm015_norev/fsm015_norev.nii
 file Data/mni_icbm152_t1_tal_nlin_sym_09a.nii does not exist .
Segmentation fault (core dumped)
All_Command_lines_OK
Using double precision for computations.
Transform file does not exist: /data/out/fsm015_norev/ants0GenericAffine.mat
Can't read initial transform /data/out/fsm015_norev/ants0GenericAffine.mat
All_Command_lines_OK
Using double precision for computations.
Transform file does not exist: /data/out/fsm015_norev/ants0GenericAffine.mat
Can't read initial transform /data/out/fsm015_norev/ants0GenericAffine.mat
 file Data/mni_icbm152_t1_tal_nlin_sym_09a.nii does not exist .
Transform file does not exist: /data/out/fsm015_norev/ants1Warp.nii.gz
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/dist-packages/nibabel/loadsave.py", line 42, in load
    stat_result = os.stat(filename)
FileNotFoundError: [Errno 2] No such file or directory: '/data/out/fsm015_norev/fsm015_norev_mni152.nii.gz'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/serialize_process", line 40, in <module>
    run_segmentation.execute(path_in=path_i, path_out=path_out, path_mdl=os.path.join('module_ai/mdl/'), overwrite=overwrite, seq_type=seq_type)
  File "/app/run_segmentation.py", line 47, in execute
    run_inference.run(path_img=t2mni, anat_prior='Data/cp_skeleton.nii', path_mdl=mdl, path_out=folder)
  File "/app/module_ai/run_inference.py", line 22, in run
    nii = nib.load(path_img)
  File "/usr/local/lib/python3.6/dist-packages/nibabel/loadsave.py", line 44, in load
    raise FileNotFoundError(f"No such file or no access: '{filename}'")
FileNotFoundError: No such file or no access: '/data/out/fsm015_norev/fsm015_norev_mni152.nii.gz'
INFO:    Cleaning up image...
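The failing lookups above use relative paths (Data/mni_icbm152_t1_tal_nlin_sym_09a.nii, module_ai/mdl/), which suggests the pipeline expects to be launched from /app inside the container. A sketch of the same command with Singularity's --pwd flag setting the working directory to /app (assuming the bundled Data/ folder lives there, which is not confirmed in this thread):

```shell
# Run the entrypoint script with /app as the working directory so the
# pipeline's relative paths (Data/..., module_ai/mdl/) resolve correctly.
singularity exec \
  --cleanenv \
  --fakeroot \
  --pwd /app \
  --env LD_LIBRARY_PATH=/opt/ANTs/lib: \
  --env PATH=/opt/ANTs/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin \
  -B /path/to/data/input:/data/in \
  -B /path/to/data/output:/data/out \
    chp_seg-1.0.0.sif \
      /app/serialize_process --sequence_type T1
```

If /app is read-only inside the SIF, the registration template may still need to be located or bound in separately.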