Code for the AAAI 2024 paper "PromptMRG: Diagnosis-Driven Prompts for Medical Report Generation".
```
git clone https://github.com/jhb86253817/PromptMRG.git
conda create -n promptmrg python=3.10
conda activate promptmrg
pip install -r requirements.txt
```
You also need to download `clip_text_features.json` from here, the text features of the training database extracted by a MIMIC-pretrained CLIP. Put all of these files under the folder `data/mimic_cxr/`. Similarly, put the IU-Xray files under `data/iu_xray/`. Moreover, you need to download `chexbert.pth` from here for evaluating clinical efficacy and put it under `checkpoints/stanford/chexbert/`.
You will have the following folder structure:

```
PromptMRG
|--data
   |--mimic_cxr
      |--base_probs.json
      |--clip_text_features.json
      |--mimic_annotation_promptmrg.json
      |--images
         |--p10
         |--p11
         ...
   |--iu_xray
      |--iu_annotation_promptmrg.json
      |--images
         |--CXR1000_IM-0003
         |--CXR1001_IM-0004
         ...
|--checkpoints
   |--stanford
      |--chexbert
         |--chexbert.pth
...
```
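Before training, you can sanity-check that the downloaded files are in place. This is a minimal helper sketch, not part of the repository; the paths simply mirror the tree above.

```python
import os

# Expected layout, mirroring the README tree above (relative to the repo root).
EXPECTED = [
    "data/mimic_cxr/base_probs.json",
    "data/mimic_cxr/clip_text_features.json",
    "data/mimic_cxr/mimic_annotation_promptmrg.json",
    "data/mimic_cxr/images",
    "data/iu_xray/iu_annotation_promptmrg.json",
    "data/iu_xray/images",
    "checkpoints/stanford/chexbert/chexbert.pth",
]

def check_layout(root="."):
    """Return the list of expected paths that are missing under `root`."""
    return [p for p in EXPECTED if not os.path.exists(os.path.join(root, p))]

if __name__ == "__main__":
    missing = check_layout()
    if missing:
        print("Missing files/folders:")
        for p in missing:
            print("  " + p)
    else:
        print("Data layout looks complete.")
```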
Run

```
bash train_mimic_cxr.sh
```

to train a model on MIMIC-CXR. Run

```
bash test_mimic_cxr.sh
```

to test a trained model on MIMIC-CXR, and

```
bash test_iu_xray.sh
```

to test on IU-Xray.
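For context on the clinical efficacy (CE) evaluation mentioned above: CheXbert extracts disease labels from both the generated and the reference reports, and the CE score compares those labels. The sketch below is an illustration of standard micro-averaged precision/recall/F1 over binary label matrices, not code from this repository.

```python
def ce_metrics(pred, ref):
    """Micro-averaged precision/recall/F1 over binary label matrices.

    pred, ref: lists of equal-length 0/1 label rows, one row per report
    (e.g., one column per CheXbert disease label).
    """
    tp = fp = fn = 0
    for p_row, r_row in zip(pred, ref):
        for p, r in zip(p_row, r_row):
            tp += 1 if (p and r) else 0          # predicted and present
            fp += 1 if (p and not r) else 0      # predicted but absent
            fn += 1 if (not p and r) else 0      # missed finding
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return precision, recall, f1
```

With two reports and two labels each, `ce_metrics([[1, 0], [1, 1]], [[1, 1], [0, 1]])` yields precision, recall, and F1 of 2/3 each (2 true positives, 1 false positive, 1 false negative).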