MIC-DKFZ / nnUNet


ValueError: numpy.dtype size changed, may indicate binary incompatibility. Expected 96 from C header, got 88 from PyObject #2308

Open x0204 opened 5 months ago

x0204 commented 5 months ago

Dear nnU-Net Developer,

Thank you for your outstanding work on nnU-Net.

I have recently encountered an error while running inference. Below are the command and the corresponding error message:

nnUNetv2_predict -d Dataset199_pre_with_T2 -i /home/chang/GBM_Data/GBM_original/20240618/04_nnunet -o /home/chang/nnUNet/nnUNetFrame/DATASET/nnUNet_predict_result/Dataset199_pre_with_T2 -f 0 1 2 3 4 -tr nnUNetTrainer -c 3d_fullres -p nnUNetPlans

Traceback (most recent call last):
  File "/home/chang/anaconda3/envs/GBM/bin/nnUNetv2_predict", line 5, in <module>
    from nnunetv2.inference.predict_from_raw_data import predict_entry_point
  File "/home/chang/anaconda3/envs/GBM/lib/python3.11/site-packages/nnunetv2/inference/predict_from_raw_data.py", line 22, in <module>
    from nnunetv2.inference.data_iterators import PreprocessAdapterFromNpy, preprocessing_iterator_fromfiles, \
  File "/home/chang/anaconda3/envs/GBM/lib/python3.11/site-packages/nnunetv2/inference/data_iterators.py", line 12, in <module>
    from nnunetv2.preprocessing.preprocessors.default_preprocessor import DefaultPreprocessor
  File "/home/chang/anaconda3/envs/GBM/lib/python3.11/site-packages/nnunetv2/preprocessing/preprocessors/default_preprocessor.py", line 26, in <module>
    from nnunetv2.preprocessing.resampling.default_resampling import compute_new_shape
  File "/home/chang/anaconda3/envs/GBM/lib/python3.11/site-packages/nnunetv2/preprocessing/resampling/default_resampling.py", line 5, in <module>
    import pandas as pd
  File "/home/chang/anaconda3/envs/GBM/lib/python3.11/site-packages/pandas/__init__.py", line 49, in <module>
    from pandas.core.api import (
  File "/home/chang/anaconda3/envs/GBM/lib/python3.11/site-packages/pandas/core/api.py", line 1, in <module>
    from pandas._libs import (
  File "/home/chang/anaconda3/envs/GBM/lib/python3.11/site-packages/pandas/_libs/__init__.py", line 18, in <module>
    from pandas._libs.interval import Interval
  File "interval.pyx", line 1, in init pandas._libs.interval
ValueError: numpy.dtype size changed, may indicate binary incompatibility. Expected 96 from C header, got 88 from PyObject

Could you please provide any insights or suggestions to resolve this issue? My numpy version is 1.26.4 and my pandas version is 2.2.1. Thank you in advance for your assistance.

denbonte commented 5 months ago

Hey @x0204,

Sorry to intervene - I found this issue by googling the same error, and hopefully this might help you or @FabianIsensee and the team figure the issue out (though it's clear it's "just" a dependency issue).

Traceback (most recent call last):
  File ".../bin/nnUNetv2_predict", line 5, in <module>
    from nnunetv2.inference.predict_from_raw_data import predict_entry_point
  File ".../lib/python3.9/site-packages/nnunetv2/inference/predict_from_raw_data.py", line 22, in <module>
    from nnunetv2.inference.data_iterators import PreprocessAdapterFromNpy, preprocessing_iterator_fromfiles, \
  File ".../lib/python3.9/site-packages/nnunetv2/inference/data_iterators.py", line 12, in <module>
    from nnunetv2.preprocessing.preprocessors.default_preprocessor import DefaultPreprocessor
  File ".../lib/python3.9/site-packages/nnunetv2/preprocessing/preprocessors/default_preprocessor.py", line 26, in <module>
    from nnunetv2.preprocessing.resampling.default_resampling import compute_new_shape
  File ".../lib/python3.9/site-packages/nnunetv2/preprocessing/resampling/default_resampling.py", line 7, in <module>
    from batchgenerators.augmentations.utils import resize_segmentation
  File ".../lib/python3.9/site-packages/batchgenerators/augmentations/utils.py", line 22, in <module>
    from skimage.transform import resize
  File ".../lib/python3.9/site-packages/skimage/__init__.py", line 122, in <module>
    from ._shared import geometry
  File "geometry.pyx", line 1, in init skimage._shared.geometry
ValueError: numpy.dtype size changed, may indicate binary incompatibility. Expected 96 from C header, got 88 from PyObject

(in the snippet above, I replaced the paths with ... because I am running this on a cluster and I am not sure I should be sharing path/name info - the rest is a copy-paste).

In my case, I am running nnunetv2==2.4.2 in multiple conda environments. A slightly older environment I set up a couple of weeks ago has numpy==1.26.4 installed, and everything works there. In a newer environment I set up just a few days ago, which has numpy==2.0.0 installed, I get the aforementioned error.

All I had to do was pip install numpy==1.26.4 in the newer conda environment, and nnunetv2 is back to working like a charm (I tested this on both an A100 and an A5000). For reference, both of my conda environments run Python 3.9.19.
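For anyone else landing here from a search, the fix is a one-liner (a minimal sketch assuming a pip-managed environment and that the numpy 2.0 ABI change is indeed the culprit):

# downgrade numpy in the affected environment
pip install numpy==1.26.4

# or, more generally, keep numpy below 2.0 until the compiled
# dependencies in the environment (pandas, scikit-image, ...) ship
# numpy-2-compatible wheels
pip install "numpy<2"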

Cheers, Dennis.

LalithShiyam commented 5 months ago

LOL - I rushed to report this as well, as our entire stack crashed 👍🏾 due to numpy 2.0!

x0204 commented 5 months ago

Thank you for your assistance.

After installing nnunetv2 in a new environment, inference started successfully. However, it then failed partway through with an error that prevented further progress.

The error details are as follows:

Traceback (most recent call last):
  File "/home/chang/anaconda3/envs/GBM/lib/python3.12/multiprocessing/resource_sharer.py", line 138, in _serve
    with self._listener.accept() as conn:
  File "/home/chang/anaconda3/envs/GBM/lib/python3.12/multiprocessing/connection.py", line 483, in accept
    answer_challenge(c, self._authkey)
  File "/home/chang/anaconda3/envs/GBM/lib/python3.12/multiprocessing/connection.py", line 953, in answer_challenge
    message = connection.recv_bytes(256)         # reject large message
  File "/home/chang/anaconda3/envs/GBM/lib/python3.12/multiprocessing/connection.py", line 216, in recv_bytes
    buf = self._recv_bytes(maxlength)
  File "/home/chang/anaconda3/envs/GBM/lib/python3.12/multiprocessing/connection.py", line 430, in _recv_bytes
    buf = self._recv(4)
  File "/home/chang/anaconda3/envs/GBM/lib/python3.12/multiprocessing/connection.py", line 395, in _recv
    chunk = read(handle, remaining)
ConnectionResetError: [Errno 104] Connection reset by peer

Traceback (most recent call last):
  File "/home/chang/anaconda3/envs/GBM/bin/nnUNetv2_predict", line 8, in <module>
    sys.exit(predict_entry_point())
  File "/home/chang/anaconda3/envs/GBM/lib/python3.12/site-packages/nnunetv2/inference/predict_from_raw_data.py", line 864, in predict_entry_point
    predictor.predict_from_files(args.i, args.o, save_probabilities=args.save_probabilities,
  File "/home/chang/anaconda3/envs/GBM/lib/python3.12/site-packages/nnunetv2/inference/predict_from_raw_data.py", line 256, in predict_from_files
    return self.predict_from_data_iterator(data_iterator, save_probabilities, num_processes_segmentation_export)
  File "/home/chang/anaconda3/envs/GBM/lib/python3.12/site-packages/nnunetv2/inference/predict_from_raw_data.py", line 349, in predict_from_data_iterator
    for preprocessed in data_iterator:
  File "/home/chang/anaconda3/envs/GBM/lib/python3.12/site-packages/nnunetv2/inference/data_iterators.py", line 111, in preprocessing_iterator_fromfiles
    raise RuntimeError('Background workers died. Look for the error message further up! If there is '
RuntimeError: Background workers died. Look for the error message further up! If there is none then your RAM was full and the worker was killed by the OS. Use fewer workers or get more RAM in that case!

Could you please provide any insights or suggestions regarding this issue? If there are any mistakes in my procedure, kindly let me know.

Thank you in advance for your assistance!

denbonte commented 5 months ago

Hey @x0204,

This looks like a totally different error, and it might be down to:

File "/home/chang/anaconda3/envs/GBM/lib/python3.12/multiprocessing/connection.py", line 395, in _recv
chunk = read(handle, remaining)
^^^^^^^^^^^^^^^^^^^^^^^
ConnectionResetError: [Errno 104] Connection reset by peer

Or:

RuntimeError: Background workers died. Look for the error message further up! If there is none then your RAM was full and the worker was killed by the OS. Use fewer workers or get more RAM in that case!

But I am not able to reproduce this (it works for me with numpy==1.26.4).
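If it is the latter, it might be worth retrying with fewer background workers. As far as I know, nnUNetv2_predict exposes -npp (preprocessing workers) and -nps (segmentation export workers) for exactly this - please double-check the flag names against nnUNetv2_predict -h:

# sketch: same command as before, but with fewer worker processes
# (-npp = preprocessing processes, -nps = segmentation export processes;
#  INPUT_FOLDER / OUTPUT_FOLDER are placeholders)
nnUNetv2_predict -d Dataset199_pre_with_T2 -i INPUT_FOLDER -o OUTPUT_FOLDER \
    -f 0 1 2 3 4 -tr nnUNetTrainer -c 3d_fullres -p nnUNetPlans \
    -npp 1 -nps 1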

tomaroberts commented 4 months ago

Hi all,

Can confirm I'm also getting a similar error. Just throwing in some more logs in case they're useful.

Python 3.9.7
numpy 1.26.4
macOS Sonoma 14.5

Below is the output from running nnUNetv2_plan_and_preprocess:

Traceback (most recent call last):
  File "/Users/tr17/.pyenv/versions/3.9.7/bin/nnUNetv2_plan_and_preprocess", line 5, in <module>
    from nnunetv2.experiment_planning.plan_and_preprocess_entrypoints import plan_and_preprocess_entry
  File "/Users/tr17/.pyenv/versions/3.9.7/lib/python3.9/site-packages/nnunetv2/experiment_planning/plan_and_preprocess_entrypoints.py", line 2, in <module>
    from nnunetv2.experiment_planning.plan_and_preprocess_api import extract_fingerprints, plan_experiments, preprocess
  File "/Users/tr17/.pyenv/versions/3.9.7/lib/python3.9/site-packages/nnunetv2/experiment_planning/plan_and_preprocess_api.py", line 8, in <module>
    from nnunetv2.experiment_planning.dataset_fingerprint.fingerprint_extractor import DatasetFingerprintExtractor
  File "/Users/tr17/.pyenv/versions/3.9.7/lib/python3.9/site-packages/nnunetv2/experiment_planning/dataset_fingerprint/fingerprint_extractor.py", line 11, in <module>
    from nnunetv2.imageio.reader_writer_registry import determine_reader_writer_from_dataset_json
  File "/Users/tr17/.pyenv/versions/3.9.7/lib/python3.9/site-packages/nnunetv2/imageio/reader_writer_registry.py", line 7, in <module>
    from nnunetv2.imageio.natural_image_reader_writer import NaturalImage2DIO
  File "/Users/tr17/.pyenv/versions/3.9.7/lib/python3.9/site-packages/nnunetv2/imageio/natural_image_reader_writer.py", line 19, in <module>
    from skimage import io
  File "/Users/tr17/.pyenv/versions/3.9.7/lib/python3.9/site-packages/skimage/__init__.py", line 122, in <module>
    from ._shared import geometry
  File "geometry.pyx", line 1, in init skimage._shared.geometry
ValueError: numpy.dtype size changed, may indicate binary incompatibility. Expected 96 from C header, got 88 from PyObject
CompletedProcess(args=['nnUNetv2_plan_and_preprocess', '-d', '001', '--verify_dataset_integrity'], returncode=1)

from importlib.metadata import version
print("nnunetv2 version:", version('nnunetv2'))
print("numpy version:", version('numpy'))
print("pandas version:", version('pandas'))
print("scikit-image version:", version('scikit-image'))

nnunetv2 version: 2.4.2
numpy version: 1.26.4
pandas version: 2.2.2
scikit-image version: 0.24.0

I've also tried force-reinstalling numpy 1.26.4 as well as downgrading to 1.26.3, but to no avail.
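In case it helps narrow this down, one quick way to see which compiled dependency is the one raising the ABI error is to import the usual suspects one by one (plain python -c probes, nothing nnU-Net-specific; the package list is just a guess based on the tracebacks above):

# each probe either prints a version or reproduces the dtype-size ValueError,
# pointing at the package whose wheel was built against a mismatched numpy
python -c "import numpy; print('numpy', numpy.__version__)"
python -c "import pandas; print('pandas', pandas.__version__)"
python -c "import skimage; print('scikit-image', skimage.__version__)"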