MIC-DKFZ / nnUNet


Fatal FIPS Failure #2430

Open · vmiller987 opened this issue 1 month ago

vmiller987 commented 1 month ago

My work is interested in trying nnUNet models and seeing how they perform for what we require. We are required to be FIPS compliant. I cannot get past the `plan_and_preprocess` command without receiving a `FATAL FIPS SELFTEST FAILURE`. I have attempted this with both v1 and v2 and run into the same issue. One theory is that the failure comes from the just-in-time compiler using an MD5 hash, which is not FIPS compliant.
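For illustration, here is a minimal sketch of what that theory would look like at the Python level (this assumes Python 3.9+ for the `usedforsecurity` flag; the fatal selftest failure I'm seeing is lower-level than this `ValueError`, but the disallowed digest would be the same):

```python
import hashlib

# Under a FIPS-enforcing OpenSSL, a plain MD5 request is rejected.
try:
    hashlib.md5(b"jit cache key")
except ValueError as err:
    print(f"MD5 blocked by FIPS policy: {err}")

# Declaring the hash as non-cryptographic is permitted (Python 3.9+).
digest = hashlib.md5(b"jit cache key", usedforsecurity=False).hexdigest()
print(digest)
```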

I have attempted to follow this v1 example: https://github.com/MIC-DKFZ/nnUNet/blob/nnunetv1/documentation/training_example_Hippocampus.md

I have been following the documentation for v2 using the same Hippocampus dataset.

Is there a way around this? Is there anything else you require from me?

Thanks for your assistance. It is greatly appreciated.

Edit: I've set myself up with a clean Python environment. Updated the screenshot, as the deprecation warnings are unrelated. [screenshot: Code_hsa0nzS10j]

vmiller987 commented 1 month ago

I had some time this morning to work through this issue a bit. I think the problem might be that nnUNet was compiled on a FIPS-disabled machine (PyInstaller can cause this). I'm required to use FIPS-enabled machines for work, so when I try to run something compiled on a FIPS-disabled machine, I think this error appears. I'm honestly not 100% sure; I could be wrong.
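For anyone trying to reproduce this, a quick way to confirm whether the kernel is actually enforcing FIPS mode (Linux-specific; this reads the standard `/proc` flag):

```python
from pathlib import Path

# "1" means the kernel enforces FIPS mode; "0" or a missing file means it does not.
fips_flag = Path("/proc/sys/crypto/fips_enabled")
if fips_flag.exists():
    enabled = fips_flag.read_text().strip() == "1"
    print("FIPS mode:", "enabled" if enabled else "disabled")
else:
    print("Kernel has no FIPS flag (not built with FIPS support)")
```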

However, I cloned nnUNetv2 and created a fresh Python environment. I can't use it as an installed module without getting the FIPS error, but I was able to get it working on the Hippocampus data by running the source directly rather than through the installed command-line entry points.

I created a Jupyter notebook and manually ran `extract_fingerprint_dataset`, `plan_experiment_dataset`, `preprocess_dataset`, and `run_training`.
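Roughly, the notebook boils down to the following (a minimal sketch: the import paths match the nnUNetv2 source as I see it, but the exact signatures may differ between versions, and the Hippocampus dataset ID of 4 follows the tutorial's Dataset004_Hippocampus):

```python
from nnunetv2.experiment_planning.plan_and_preprocess_api import (
    extract_fingerprint_dataset,
    plan_experiment_dataset,
    preprocess_dataset,
)
from nnunetv2.run.run_training import run_training

dataset_id = 4  # Dataset004_Hippocampus

# Fingerprint extraction, planning, and preprocessing -- the steps that the
# nnUNetv2_plan_and_preprocess entry point would normally drive.
extract_fingerprint_dataset(dataset_id)
plan_experiment_dataset(dataset_id)
preprocess_dataset(dataset_id, configurations=("3d_fullres",))

# Train fold 0 of the 3d_fullres configuration.
run_training(dataset_id, "3d_fullres", 0)
```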

I had to add `torch._dynamo.config.suppress_errors = True` to a few files and update the DDP wrapping to `self.network = DDP(self.network, device_ids=[self.local_rank], find_unused_parameters=True)`, i.e. to include `find_unused_parameters=True`.
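In isolation, the two changes look like this (a sketch only; in the real code the DDP wrapping sits inside the trainer class, so the helper below is just an illustration of the call):

```python
import torch
import torch._dynamo
from torch.nn.parallel import DistributedDataParallel as DDP

# If torch.compile fails (e.g. its cache hashing trips the FIPS policy),
# fall back to eager execution instead of raising.
torch._dynamo.config.suppress_errors = True

def wrap_network_for_ddp(network: torch.nn.Module, local_rank: int) -> DDP:
    # find_unused_parameters=True lets DDP tolerate parameters that receive
    # no gradient in a given forward pass, at some synchronization cost.
    return DDP(network, device_ids=[local_rank], find_unused_parameters=True)
```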

I'm unsure whether this was the ideal solution, and it could be a skill issue, as I am a novice. I'm waiting for the Hippocampus run to finish, and then I will try to adapt our own dataset. I'm excited to try nnUNet and learn more about it. Thanks for taking the time to read this issue, but you might be able to close it.