Optimization-AI / LibAUC

LibAUC: A Deep Learning Library for X-Risk Optimization
https://libauc.org/
MIT License

Training hyperparameters for CheXpert dataset #20

Closed: hankyul2 closed this issue 1 year ago

hankyul2 commented 2 years ago

Hi, thank you for sharing your great work.

I am wondering if you could provide some of the missing hyperparameters for training on the CheXpert dataset. I have summarized the training hyperparameters described in Section 4.2.1 of the DAM paper in the tables below and marked the missing values with ?; a rough sketch of how I am currently using these values follows the tables. If you can provide these missing values, I would appreciate it very much.

CrossEntropy optimization

setting        value
-------------  ---------------------------------------------------
augmentation   RandomAffine(rotation=?, translation=?, scaling=?)
criterion      CE
optimizer      Adam or AdamW (?)
epochs         2
batch size     32
lr             1e-5
weight decay   1e-5

AUC Maximization

setting        value
-------------  ---------------------------------------------------
augmentation   RandomAffine(rotation=?, translation=?, scaling=?)
criterion      AUCM_MultiLabel
optimizer      PESG
epochs         2
batch size     32
lr             0.1
scheduler      decrease 3 times at 2000, 8000 iterations (?)
weight decay   0
gamma          choose between 300, 500, 800 (?)
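
For context, this is roughly how I am wiring the values above into a two-stage script. The DenseNet-121 backbone, the 5-class output, the use of sigmoid probabilities, and the exact AUCM_MultiLabel / PESG constructor arguments are my own guesses based on the public LibAUC examples, not values confirmed by the paper, so please correct anything that looks off:

import torch
import torch.nn as nn
from torchvision.models import densenet121

from libauc.losses import AUCM_MultiLabel   # loss named in the table above
from libauc.optimizers import PESG          # optimizer named in the table above

# DenseNet-121 with 5 outputs (the 5 CheXpert competition tasks) is my assumption.
model = densenet121(num_classes=5).cuda()

# Stage 1: cross-entropy pretraining (Adam or AdamW?, lr=1e-5, weight decay=1e-5).
# Per-label binary cross entropy on probabilities, so both stages can share one loop.
ce_loss = nn.BCELoss()
ce_optimizer = torch.optim.Adam(model.parameters(), lr=1e-5, weight_decay=1e-5)

# Stage 2: AUC maximization (PESG, lr=0.1, weight decay=0).
# The constructor arguments below are guessed from the public examples; exact names
# differ between LibAUC versions (some take model.parameters() instead of model,
# some also want imratio), and gamma=500 is just one of the candidate values.
auc_loss = AUCM_MultiLabel(num_classes=5)
pesg = PESG(model, loss_fn=auc_loss, lr=0.1, gamma=500, margin=1.0, weight_decay=0)

def run_stage(loader, loss_fn, optimizer, epochs=2):
    """One training stage; `loader` is a CheXpert DataLoader with batch size 32 (not shown)."""
    model.train()
    for _ in range(epochs):
        for images, labels in loader:                    # labels: multi-hot tensor of shape (B, 5)
            probs = torch.sigmoid(model(images.cuda()))  # both losses here take probabilities
            loss = loss_fn(probs, labels.cuda().float())
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()

# run_stage(train_loader, ce_loss, ce_optimizer, epochs=2)   # stage 1
# run_stage(train_loader, auc_loss, pesg, epochs=2)          # stage 2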

Thank you again!

hankyul2 commented 1 year ago

Hi! I am sorry for asking without reading your code thoroughly. The augmentation is already applied in the dataset class, as shown in the snippet below. (link)

import torchvision.transforms as tfs  # `tfs` in the snippet is torchvision.transforms

def image_augmentation(self, image):
    # Random affine: rotation in [-15, 15] degrees, up to 5% translation, 0.95-1.05 scaling
    img_aug = tfs.Compose([tfs.RandomAffine(degrees=(-15, 15), translate=(0.05, 0.05), scale=(0.95, 1.05), fill=128)])  # newer torchvision renamed fillcolor -> fill
    image = img_aug(image)
    return image
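
For anyone who wants to inspect this augmentation outside the dataset class, the same transform can be rebuilt standalone; the image path below is just a placeholder:

from PIL import Image
import torchvision.transforms as tfs

# Same affine augmentation as in the snippet above: +/-15 degree rotation,
# up to 5% translation, 0.95-1.05 scaling, constant gray fill for exposed borders.
aug = tfs.Compose([
    tfs.RandomAffine(degrees=(-15, 15), translate=(0.05, 0.05), scale=(0.95, 1.05), fill=128),
])

img = Image.open("path/to/a_chexpert_image.jpg").convert("L")  # placeholder path; CheXpert scans are grayscale
img_aug = aug(img)                                             # augmented PIL image, same size as the input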

Thank you