Google is full of "workarounds" which typically consist of reinstalling numpy etc., so -- could you just try maybe using a newer image, e.g. docker://nipy/heudiconv:master, which is already 2 months old but maybe would have better luck?
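For reference, one way to get that image with Singularity (assuming the docker:// bootstrap and --fakeroot work on your system, as in the build command quoted at the end of this thread) would be something like:

singularity build --fakeroot heudiconv-master.sif docker://nipy/heudiconv:master
# quick smoke test before adding any bind mounts or other options
singularity run heudiconv-master.sif --help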
I tried building a new Singularity container from the nipy/heudiconv:1.0.0 base plus pip uninstall -y numpy and pip install numpy, which changed the numpy version from 1.26.0 to 1.26.2 but did not change the error (a sketch of such a build definition is shown after the traceback below). No change in behavior from building the latest or master images either:
Traceback (most recent call last):
  File "/opt/miniconda-py39_4.12.0/bin/heudiconv", line 5, in <module>
    from heudiconv.cli.run import main
  File "/src/heudiconv/heudiconv/cli/run.py", line 11, in <module>
    from ..main import workflow
  File "/src/heudiconv/heudiconv/main.py", line 11, in <module>
    from .bids import populate_bids_templates, populate_intended_for, tuneup_bids_json_files
  File "/src/heudiconv/heudiconv/bids.py", line 21, in <module>
    import pydicom as dcm
  File "/opt/miniconda-py39_4.12.0/lib/python3.9/site-packages/pydicom/__init__.py", line 32, in <module>
    from pydicom.dataelem import DataElement
  File "/opt/miniconda-py39_4.12.0/lib/python3.9/site-packages/pydicom/dataelem.py", line 18, in <module>
    from pydicom import config  # don't import datetime_conversion directly
  File "/opt/miniconda-py39_4.12.0/lib/python3.9/site-packages/pydicom/config.py", line 381, in <module>
    import pydicom.pixel_data_handlers.pylibjpeg_handler as pylibjpeg_handler  # noqa
  File "/opt/miniconda-py39_4.12.0/lib/python3.9/site-packages/pydicom/pixel_data_handlers/pylibjpeg_handler.py", line 82, in <module>
    import libjpeg
  File "/opt/miniconda-py39_4.12.0/lib/python3.9/site-packages/libjpeg/__init__.py", line 4, in <module>
    from .utils import decode, decode_pixel_data, get_parameters
  File "/opt/miniconda-py39_4.12.0/lib/python3.9/site-packages/libjpeg/utils.py", line 9, in <module>
    import _libjpeg
  File "libjpeg/_libjpeg.pyx", line 1, in init libjpeg._libjpeg
ValueError: numpy.ndarray size changed, may indicate binary incompatibility. Expected 96 from C header, got 80 from PyObject
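The exact build recipe is not shown in this thread, but a minimal Singularity definition file matching the description above (nipy/heudiconv:1.0.0 base plus a pip reinstall of numpy) might look roughly like this:

Bootstrap: docker
From: nipy/heudiconv:1.0.0

%post
    # reinstall numpy with the image's own pip (path taken from the traceback above);
    # as described above, this bumped numpy 1.26.0 -> 1.26.2 but did not fix the error
    /opt/miniconda-py39_4.12.0/bin/pip uninstall -y numpy
    /opt/miniconda-py39_4.12.0/bin/pip install numpy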
thanks for trying -- pip uninstall -y numpy and pip install numpy (which changed the numpy version from 1.26.0 to 1.26.2 but did not change the error) might not have the desired effect, since the original installation was via conda install.
before below: since I do not see you using --no-home -- try with that option -- maybe you have some local numpy installed in some ~/.local/lib/python* and that interferes...?
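One quick way to check for that kind of interference (a sketch only; the python path is taken from the traceback above) is to compare where numpy gets imported from with and without the home directory mounted:

# with the home directory mounted, a numpy under ~/.local/lib/python3.9 could shadow the container's
singularity exec nipy-heudiconv--1.0.0.sing /opt/miniconda-py39_4.12.0/bin/python -c 'import numpy; print(numpy.__version__, numpy.__file__)'
# with --no-home, only the numpy baked into the image should be importable
singularity exec --no-home nipy-heudiconv--1.0.0.sing /opt/miniconda-py39_4.12.0/bin/python -c 'import numpy; print(numpy.__version__, numpy.__file__)'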
BTW -- we have singularity images pre-built/reshared from http://datasets.datalad.org/?dir=/repronim/containers/images/nipy, so there is http://datasets.datalad.org/repronim/containers/images/nipy/nipy-heudiconv--1.0.0.sing.
In your case -- do you see the issue you are talking about if you just run smth like
singularity run ${IMAGEDIR}/heudiconv-v1.0.0.sif --help
or
$> singularity exec nipy-heudiconv--1.0.0.sing /opt/miniconda-py39_4.12.0/bin/python -c 'import pydicom;print("ok")'
ok
(just with your image)? if that works ok, try also while adding all the bind mounts etc. options -- maybe somehow one of them adds a side effect? try with the image I gave.
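To illustrate that last suggestion (an illustration only, reusing the image and paths already mentioned in this thread), the options could be added back one at a time to see which one introduces the side effect:

singularity run ${IMAGEDIR}/heudiconv-v1.0.0.sif --help                                          # bare run
singularity run --cleanenv ${IMAGEDIR}/heudiconv-v1.0.0.sif --help                               # + clean environment
singularity run --cleanenv --bind ${projDir}:/datain ${IMAGEDIR}/heudiconv-v1.0.0.sif --help     # + bind mount
singularity run --cleanenv --no-home --bind ${projDir}:/datain ${IMAGEDIR}/heudiconv-v1.0.0.sif --help   # + --no-home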
Thank you for your suggestions!
The error is indeed thrown with running:
singularity run --cleanenv --no-home ${IMAGEDIR}/heudiconv-11272023-master.sif --help
This happens for both the provided 1.0.0.sing image and those that I built as described earlier. However, adding --contain or --no-home to the Singularity run command resolves the error:
$ SINGULARITY_CACHEDIR=$CACHESING SINGULARITY_TMPDIR=$TMPSING \
singularity run --cleanenv --no-home --bind ${projDir}:/datain ${IMAGEDIR}/heudiconv-11272023-master.sif \
-d /datain/{subject}/{session}/*/SCANS/*/DICOM/*dcm -f /datain/${project}_heuristic.py \
-o /datain/bids/sourcedata -s ${sub} -ss ${ses} -c dcm2niix -b \
--minmeta --overwrite -g accession_number
INFO: Running heudiconv version 1.0.0.post6+g5374088 latest 1.0.0
INFO: Need to process 1 study sessions
INFO: PROCESSING STARTS: {'subject': 'SAY760', 'outdir': '/datain/bids/sourcedata/', 'session': 'B'}
INFO: Processing 11546 dicoms
INFO: Analyzing 11546 dicoms
Summary
I am trying out the new version 1.0.0 release using Singularity and some DICOMs that I have successfully converted using older releases of Heudiconv (v0.11.3, v0.9.0). Local Singularity cache ($CACHESING) and temp ($TMPSING) directories are used.
The project heuristic is:
The command used was:
This fails as follows:
Platform details:
Singularity container built with Singularity v3.8.7 using the command:
singularity build --fakeroot heudiconv-v1.0.0.sif docker://nipy/heudiconv:1.0.0