-
I first downloaded the BTCV dataset, converted it to nnUNet format, and preprocessed it, but then I got this error during training. What should I do, and what is the problem?
/opt/conda/conda-bld/…
-
Great job! When will the dataset be released? :)
-
Hello, your work is so great!!
I've already got results on BraTS2020, and now I'm working on my own dataset, but it only has one modality (T1). I want to know where I can change the input channel f…
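For reference, a minimal sketch of what changing the input channel count might look like, assuming the model is built on MONAI's SwinUNETR (the repository's own wrapper may expose the same argument under a different entry point); the `out_channels` value is a placeholder:

```python
# Sketch only: assumes the MONAI SwinUNETR backbone is used directly.
from monai.networks.nets import SwinUNETR

model = SwinUNETR(
    img_size=(96, 96, 96),  # training patch size
    in_channels=1,          # single T1 modality (BraTS2020 used 4 channels)
    out_channels=3,         # hypothetical number of segmentation classes
    feature_size=48,
)
```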
-
Hello,
I trained the model on the BTCV dataset with
max_epoch=2000
store_num=50
warmup_epoch=100
but when I ran the test at the end, the results were very poor:
Spleen: dice 0.0069, recall 0.9436, precision 0.0035.
Right Kidney: dice 0.0000, recall 0.0000, pre…
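A Dice near zero with very high recall usually means almost every voxel is being predicted as that class, which points to a label-index or post-processing mismatch rather than a training failure. A minimal diagnostic sketch (file names are hypothetical) that compares the class indices present in one prediction and its ground truth and recomputes per-class Dice:

```python
import nibabel as nib
import numpy as np

# Hypothetical file names; substitute one of your own test cases.
pred = nib.load("pred_img0061.nii.gz").get_fdata()
gt = nib.load("label0061.nii.gz").get_fdata()

print("classes in prediction :", np.unique(pred))
print("classes in ground truth:", np.unique(gt))

# Recompute Dice for one organ (spleen is label 1 in the BTCV label map).
cls = 1
p, g = pred == cls, gt == cls
dice = 2.0 * np.logical_and(p, g).sum() / (p.sum() + g.sum() + 1e-8)
print(f"class {cls}: dice={dice:.4f}")
```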
-
Hi, super interesting and useful work, thank you!
I was surprised to see basically no difference between training from scratch and using the SSL weights of the original SwinUNETR implementation.
Did you ensure t…
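One thing worth checking in such a comparison is whether the self-supervised checkpoint was actually loaded: with `strict=False`, a silent key mismatch can leave the encoder at random initialization. A hedged sketch, assuming a standard PyTorch checkpoint; the checkpoint name, key layout, and model configuration are assumptions:

```python
import torch
from monai.networks.nets import SwinUNETR

# Hypothetical model configuration (e.g. single-channel CT, 14 BTCV classes).
model = SwinUNETR(img_size=(96, 96, 96), in_channels=1, out_channels=14, feature_size=48)

# Hypothetical checkpoint name; SSL weights are sometimes nested under a
# "state_dict" or "model" key depending on how they were saved.
ckpt = torch.load("pretrained_ssl_weights.pt", map_location="cpu")
state = ckpt.get("state_dict", ckpt) if isinstance(ckpt, dict) else ckpt

missing, unexpected = model.load_state_dict(state, strict=False)
print(f"missing keys: {len(missing)}, unexpected keys: {len(unexpected)}")
# If nearly all keys come back missing, the SSL weights never reached the
# model and the "pretrained" run was effectively training from scratch.
```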
-
Hi, thanks for sharing! I noticed that you provided two pre-trained models for download. Regarding nnUNet, will a pre-trained model be provided as well? I saw in your article that you used three models inc…
-
Hi, from the dataset list in `PAOT.txt` I noticed that for the BTCV dataset, you provided directories for the labels that correspond to the testing subjects (img0061 to img0080). Can I check if these lab…
-
Dear @seconds7,
Thank you for sharing your work.
I have a question about how to train the network with multiple datasets (that section hasn't been written yet).
Currently I have KiTS, LITS, and D…
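For the multi-dataset question above, a minimal sketch of one common way to pool several segmentation datasets for joint training is `torch.utils.data.ConcatDataset`; the transform chain and file paths below are placeholders, and the repository's own assembly code may differ:

```python
from torch.utils.data import ConcatDataset, DataLoader
from monai.data import Dataset
from monai.transforms import Compose, LoadImaged, EnsureChannelFirstd

# Placeholder transforms; each data entry is a dict of image/label paths.
transforms = Compose([
    LoadImaged(keys=["image", "label"]),
    EnsureChannelFirstd(keys=["image", "label"]),
])

# Hypothetical file lists for each dataset.
kits_files = [{"image": "KiTS/img_0001.nii.gz", "label": "KiTS/lbl_0001.nii.gz"}]
lits_files = [{"image": "LITS/img_0001.nii.gz", "label": "LITS/lbl_0001.nii.gz"}]

combined = ConcatDataset([
    Dataset(data=kits_files, transform=transforms),
    Dataset(data=lits_files, transform=transforms),
])
loader = DataLoader(combined, batch_size=2, shuffle=True, num_workers=4)
```

In practice the label values from the different datasets would also need to be remapped into one shared class-index space before joint training, since each source dataset annotates only its own organs.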
-
Hello, may I ask how many epochs you trained for and what the Dice loss value was at final convergence?
When I trained using three datasets (liver, kidney, and tumor), 04-LITS, 05KITS, and 10-3liver, I …
-
Hi! I found a problem when running the code: in Pretrain/utils/dataset_in_memory.py you didn't change BTCV to WORD. I hope you can correct that!