MIC-DKFZ / nnUNet

Apache License 2.0

How to Perform Inference on the Fly Instead of Using Files in nnUNet v2? #2359

Open

RewaaHummedi commented 1 month ago

Hello,

I am currently using nnUNet v2 for inference and would like to know whether there is a way to perform inference on the fly instead of relying on files. My goal is to feed data in directly and get predictions in real time, without the intermediate step of writing and reading files on disk.

Questions:

  1. Is there a built-in method in nnUNet v2 that allows for on-the-fly inference?
  2. If not, what would be the best approach to modify the current prediction pipeline to achieve this?
  3. Are there any examples or documentation available that could guide me through setting up real-time inference with nnUNet v2?

Additional Context:

I am working on a project that requires rapid inference times and minimal latency. Any guidance on how to implement this, including code snippets or pointers to relevant parts of the codebase, would be greatly appreciated. Thank you for your assistance!

rubencardenes commented 1 month ago

Hi, I was wondering about the same thing.
The authors provide some nice example code snippets in the README of the inference folder: https://github.com/MIC-DKFZ/nnUNet/blob/master/nnunetv2/inference/readme.md. It covers several options that are quite useful.
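Building on that readme, here is a minimal sketch of in-memory inference using `nnUNetPredictor` and its `predict_single_npy_array` method. The model folder path, fold selection, and spacing values are placeholders you would replace with your own; constructor arguments have changed between nnUNet v2 releases, so check the readme for the exact signature your version expects.

```python
import numpy as np


def predict_array(model_folder, image, spacing=(1.0, 1.0, 1.0), folds=(0,)):
    """Run nnU-Net v2 inference on an in-memory array (sketch).

    image: float32 array shaped (channels, x, y, z) -- nnU-Net expects the
    channel axis first, matching what its image readers return.
    model_folder: path to a trained model, e.g.
    '<nnUNet_results>/DatasetXXX_Name/nnUNetTrainer__nnUNetPlans__3d_fullres'
    (placeholder -- substitute your own trained model folder).
    """
    # Imported lazily so this module can be loaded without nnunetv2 installed.
    import torch
    from nnunetv2.inference.predict_from_raw_data import nnUNetPredictor

    predictor = nnUNetPredictor(
        tile_step_size=0.5,
        use_gaussian=True,
        use_mirroring=True,
        device=torch.device('cuda' if torch.cuda.is_available() else 'cpu'),
        verbose=False,
    )
    predictor.initialize_from_trained_model_folder(
        model_folder, use_folds=folds, checkpoint_name='checkpoint_final.pth'
    )
    # The properties dict must carry the voxel spacing so preprocessing can
    # resample correctly. Passing None for the output file keeps the result
    # in memory; the call returns the segmentation as a numpy array.
    props = {'spacing': list(spacing)}
    return predictor.predict_single_npy_array(image, props, None, None, False)
```

Since the predictor is initialized once and then reused per call, keeping it alive in a long-running service avoids reloading the weights for every request, which is the main latency win over the file-based CLI.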