-
* [x] Put an X between the brackets on this line if you have done all of the following:
* Checked the online documentation: https://mimic.mit.edu/
* Checked that your issue isn't already add…
-
I was trying to train the "efficientdet_d1_coco17_tpu-32" model on a custom dataset using "model_main_tf2.py". My code was working just fine until last month, but I'm encountering this warning now:
"
W…
-
Hello,
I was wondering how exactly you preprocessed the MIMIC-CXR reports.
As you stated in your paper, you partitioned the data by the official split of MIMIC-CXR, which resulted in 270790 samp…
-
Goal: reduce model size and runtime while preserving accuracy
TODO:
- export to ONNX: accuracy remains the same and runtime is slightly better for 2D but a bit worse for 3D images
- optimize ONNX m…
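To make the accuracy/runtime comparisons in the list above reproducible, a small harness like the following could help (a sketch; `baseline_fn` and `candidate_fn` are hypothetical stand-ins for the original model and the ONNX-exported version, called as plain functions):

```python
import time
import numpy as np

def compare_models(baseline_fn, candidate_fn, inputs, warmup=2, runs=10):
    """Compare two model callables on accuracy (max abs diff) and mean latency."""
    # Accuracy: maximum absolute output difference over all inputs
    max_diff = max(
        float(np.max(np.abs(baseline_fn(x) - candidate_fn(x)))) for x in inputs
    )

    def mean_latency(fn):
        for x in inputs[:warmup]:  # warm-up calls, not timed
            fn(x)
        start = time.perf_counter()
        for _ in range(runs):
            for x in inputs:
                fn(x)
        return (time.perf_counter() - start) / (runs * len(inputs))

    return max_diff, mean_latency(baseline_fn), mean_latency(candidate_fn)

# Usage with dummy "models" (identity in place of real inference calls):
xs = [np.random.rand(1, 3, 64, 64).astype(np.float32) for _ in range(4)]
diff, t_base, t_cand = compare_models(lambda x: x, lambda x: x + 0.0, xs)
```

The same harness works for 2D and 3D inputs, so the "slightly better for 2D but a bit worse for 3D" observation can be quantified per input shape.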
-
Hi, thank you for your research.
> Bone Suppression Dataset:
The researchers from the Budapest University of Technology and Economics used their in-house clavicle and rib-shadow removal algorithms…
-
Hi,
Thanks for organising this challenge :)
I am having an issue with the dataset when I run the following:
```
from datasets import load_dataset, Sequence, Image, DatasetDict, concatenate_d…
-
Hello, thank you for the wonderful project.
I have a few questions. You mentioned that the training was conducted in two stages, and I'm curious if there is a significant difference in performance c…
-
I encountered a problem while trying to reproduce this project. How should I obtain the file passed as `--history ../mimic_abn/temporal_ids.json`?
-
Thank you for the great work! However, I ran into some problems while reproducing the p2p code. Could you show me how to combine dpl+p2p to achieve image editing?
-
I'm working on running the models and I'm not sure about the image normalization. I have a notebook here that loads the MaskedAutoencoderCNN with the densenet121_CXR_0.3M_mae.pth weights and reconstru…
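For context, here is what I mean by the two normalization conventions I've been comparing before feeding images to the encoder (a sketch; the `mean`/`std` values are placeholders, not the statistics actually used for the 0.3M pretraining):

```python
import numpy as np

def to_unit_range(img):
    """Scale an 8-bit image to [0, 1]."""
    return img.astype(np.float32) / 255.0

def standardize(img, mean=0.5, std=0.25):
    """Scale to [0, 1], then subtract a dataset mean and divide by its
    std (placeholder values -- the real statistics are what I'm asking about)."""
    return (to_unit_range(img) - mean) / std

# Example on a dummy grayscale array at a typical CXR input size
img = np.full((224, 224), 128, dtype=np.uint8)
x01 = to_unit_range(img)   # plain [0, 1] scaling
xstd = standardize(img)    # mean/std standardization
```

Knowing which of these (and with which statistics) matches the released `densenet121_CXR_0.3M_mae.pth` weights would resolve the reconstruction discrepancy I'm seeing.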