Dear Jun, the export2onnx script is not used for preparing the baseline algorithm container on Grand Challenge. For that, we copy the weights into the Docker container during building and then access them via the nnUNet framework at inference time. We are also looking into preparing an ONNX version of the model, but only as an additional way to share it. It does not work yet; that is something we will get back to after the challenge has concluded.
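For reference, a minimal sketch of what that setup can look like on the algorithm side. The path, folder layout, and helper name below are hypothetical illustrations, not taken from the baseline repository:

```python
# Hypothetical sketch (not the actual baseline code): the trained weights are
# assumed to have been copied into the image at build time under a fixed path
# and are simply read from disk when the container runs inference.
from pathlib import Path

import torch

# Hypothetical location where the weights were copied during `docker build`.
WEIGHTS_DIR = Path("/opt/algorithm/weights")


def load_fold_checkpoint(fold: int = 0) -> dict:
    """Load one fold's checkpoint from the path baked into the container."""
    ckpt_path = WEIGHTS_DIR / f"fold_{fold}" / "checkpoint_final.pth"
    return torch.load(ckpt_path, map_location="cpu")


if __name__ == "__main__":
    state = load_fold_checkpoint(fold=0)
    print(f"Loaded checkpoint with {len(state)} top-level keys")
```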
Dear organizers,
Is the `export2onnx` script used in the baseline? https://github.com/MJJdG/ULS23/blob/main/baseline_model/export2onnx.py
I'm asking because our model contains some operators that are not supported by ONNX export, and I'm not sure whether this would be an issue for the Docker submission.
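As an aside, a quick way to check up front whether a model's operators survive ONNX export is a smoke test along the lines below. This is a hedged sketch using a toy stand-in network, not the ULS23 baseline model; the export raises an error when it hits an unsupported operator:

```python
# Quick ONNX-export smoke test (sketch): exporting fails with an error if the
# model uses operators that torch.onnx cannot convert at the chosen opset.
import torch
import torch.nn as nn


class ToyModel(nn.Module):
    """Stand-in model; replace with your own network to test its operators."""

    def __init__(self) -> None:
        super().__init__()
        self.conv = nn.Conv3d(1, 8, kernel_size=3, padding=1)
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(self.conv(x))


model = ToyModel().eval()
dummy_input = torch.randn(1, 1, 32, 64, 64)  # arbitrary shape for the test

try:
    torch.onnx.export(model, dummy_input, "toy_model.onnx", opset_version=17)
    print("Export succeeded; all operators are supported at this opset.")
except Exception as exc:  # e.g. an unsupported-operator error from torch.onnx
    print(f"Export failed: {exc}")
```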