RewaaHummedi opened this issue 1 month ago
Hello,
I am currently using nnUNet v2 for inference and would like to know whether there is a way to run inference on the fly instead of relying on files. My goal is to feed data in directly and get predictions in real time, without the intermediate step of writing and reading files.
Questions:
Additional Context:
I am working on a project that requires fast inference and minimal latency. Any guidance on implementing this, including code snippets or pointers to the relevant parts of the codebase, would be greatly appreciated. Thank you for your assistance!

Hi,
I was wondering the same thing. The authors provide some nice example code snippets in the readme of the inference folder:
https://github.com/MIC-DKFZ/nnUNet/blob/master/nnunetv2/inference/readme.md
It covers several options that are quite useful.
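For in-memory inference, the `nnUNetPredictor` class in `nnunetv2.inference.predict_from_raw_data` (the one the inference readme demonstrates) can accept a numpy array and return the segmentation directly. Below is a minimal, hedged sketch of that pattern; `model_folder`, `image`, and `spacing` are placeholders you would supply yourself, and the exact keyword arguments may differ slightly between nnUNet v2 releases, so check them against your installed version:

```python
# Sketch: file-free (in-memory) inference with nnUNet v2's nnUNetPredictor.
# Assumes a trained model folder exists; `model_folder`, `image`, `spacing`
# are placeholders, not values from this issue thread.

def predict_in_memory(model_folder, image, spacing, folds=(0,)):
    """Run nnUNet v2 inference on an in-memory numpy array.

    `image` is expected as a 4D float array (channels, x, y, z) and
    `spacing` as the voxel spacing of the original image. Returns the
    segmentation array directly, with no intermediate files.
    """
    # Imports are deferred so this sketch can be read (and its signature
    # checked) without nnUNet v2 / PyTorch installed.
    import torch
    from nnunetv2.inference.predict_from_raw_data import nnUNetPredictor

    predictor = nnUNetPredictor(
        tile_step_size=0.5,          # sliding-window overlap
        use_gaussian=True,           # Gaussian-weighted window blending
        use_mirroring=True,          # test-time mirroring augmentation
        device=torch.device("cuda" if torch.cuda.is_available() else "cpu"),
    )
    predictor.initialize_from_trained_model_folder(
        model_folder,
        use_folds=folds,
        checkpoint_name="checkpoint_final.pth",
    )
    # predict_single_npy_array takes the image plus a properties dict
    # and returns the segmentation without touching the filesystem.
    properties = {"spacing": spacing}
    return predictor.predict_single_npy_array(image, properties)
```

Because the predictor is initialized once and then reused per call, this avoids re-loading weights for every prediction, which is usually where most of the per-request latency goes.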