-
Dude, your hardware setup is seriously powerful. May I ask how long your training took?
-
### Describe the bug
When the image size is larger than the `patch_size`, GaNDLF tries to extract multiple patches and perform classification for each, after which it takes an average to generate the…
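For reference, here is a minimal sketch of that patch-wise averaging behavior (this is not GaNDLF's actual code; `TinyClassifier`, `classify_by_patches`, and the tensor shapes are illustrative assumptions): a classifier is run on every patch of an image larger than `patch_size`, and the per-patch logits are averaged into one image-level prediction.

```python
import torch
import torch.nn as nn


class TinyClassifier(nn.Module):
    """Hypothetical stand-in for a 3D classification network."""
    def __init__(self, in_channels: int = 1, num_classes: int = 2):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool3d(1)
        self.fc = nn.Linear(in_channels, num_classes)

    def forward(self, x):                         # x: (N, C, D, H, W)
        return self.fc(self.pool(x).flatten(1))   # -> (N, num_classes)


def classify_by_patches(model, image, patch_size, stride):
    """Classify every (pd, ph, pw) patch of `image` (C, D, H, W) and average the logits."""
    _, d, h, w = image.shape
    pd, ph, pw = patch_size
    sd, sh, sw = stride
    logits = []
    with torch.no_grad():
        for z in range(0, d - pd + 1, sd):
            for y in range(0, h - ph + 1, sh):
                for x in range(0, w - pw + 1, sw):
                    patch = image[:, z:z + pd, y:y + ph, x:x + pw].unsqueeze(0)
                    logits.append(model(patch))
    # Average the per-patch logits into a single image-level prediction.
    return torch.cat(logits, dim=0).mean(dim=0)


if __name__ == "__main__":
    model = TinyClassifier().eval()
    image = torch.randn(1, 64, 64, 64)            # image larger than the patch size
    averaged = classify_by_patches(model, image, (32, 32, 32), (32, 32, 32))
    print(averaged.softmax(dim=-1))               # averaged class probabilities
```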
-
Hello:
I use the 3DUnet model for the brain tumor segmentation task, and the training set is the MICCAI BraTS 2018 data. When I use 'python3 net_run.py inference -c ./config/my_unet_config.ini…
-
- [ ] Add images to the ones missing
- [x] Add new publications (CIKM, MICCAI, WACV)
- [x] Fix titles, e.g., "SAUCE" should be "SAUCE: Truncated Sparse Document Signature Bit-Vectors for Fast Web-Sc…
-
Hello,
I am trying to calculate scores for the various techniques and have written the code shown below (the example here is for openhands). I was wondering whether I am calculating the scores correctly, and…
-
Hi Sara,
I would like to know the directory structure for the BRATS dataset, i.e., how you actually run it. Please suggest some pointers on this.
-
Here are a few missing ones, as reported by Fatameh:
unknown journal: {'nlmid': '0400722', 'medlineAbbreviation': 'Med Arh', 'isoabbreviation': 'Med Arh', 'title': 'Medicinski arhiv'}
unknown journ…
-
Thank you for your great work!
When your paper received its early acceptance at MICCAI 2024, the comparison method "Hermes" had not yet been open-sourced.
So I wonder where the comparison results of "He…
-
https://www.zhihu.com/topic/19560026/top-answers
The Transformer was proposed by Google in its 2017 machine-translation paper "Attention is all you need" and caused quite a stir. Everyone working in NLP R&D should understand the Transformer thoroughly; its importance is beyond doubt, and especially after you finish reading this article, I believe your sense of urgency will be even stronger. I am just that…
-
Dear @nickk124
First of all, congratulations on your work being accepted by MICCAI 2024! This is wonderful work from your team.
I am trying to train my custom segmentation-guided model u…