Closed: ShChen233 closed this issue 5 months ago
Sorry, so far we have only tested on the Synapse, PROMISE12, and LA datasets. Based on the performance on those three datasets, though, such a performance drop on AMOS is unexpected. We can discuss and check the training details.
As shown in the picture, I experimented on the AMOS dataset, but the test performance was poor. Have you run any experiments on AMOS? My training configuration is as follows:

```yaml
root_path: ../../MEDICAL_DATASET/amos22
output: ./output
dataset: amos
split: train
num_classes: 13
max_iterations: 30000
max_epochs: 300
stop_epoch: 300
batch_size: 12
n_gpu: 3
deterministic: 1
base_lr: 0.005
img_size: 512
seed: 2345
vit_name: vit_b
ckpt: ../SAM-Adapter-PyTorch-main/SAM-Adapter-PyTorch-main/pretrained/sam_vit_b_01ec64.pth
lora_ckpt: None
rank: 5
warmup: True
warmup_period: 250
AdamW: True
module: sam_lora_image_encoder
dice_param: 0.9
is_pretrain: False
exp: amos_512
```
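For reference, a flat `key: value` config like the one above can be loaded into a typed Python dict with a few lines of standard library code. This is only an illustrative sketch, assuming a flat one-key-per-line file; `parse_config` and its coercion rules are hypothetical helpers, not the project's actual config loader.

```python
def coerce(value: str):
    """Best-effort conversion of a YAML-like scalar string to a Python value."""
    if value in ("True", "true"):
        return True
    if value in ("False", "false"):
        return False
    if value in ("None", "null"):
        return None
    for cast in (int, float):
        try:
            return cast(value)
        except ValueError:
            pass
    return value  # fall back to the raw string (paths, names, etc.)

def parse_config(text: str) -> dict:
    """Parse flat `key: value` lines into a dict, skipping blanks and comments."""
    cfg = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition(":")
        cfg[key.strip()] = coerce(value.strip())
    return cfg

cfg = parse_config("""
num_classes: 13
base_lr: 0.005
warmup: True
lora_ckpt: None
vit_name: vit_b
""")
print(cfg["base_lr"])  # 0.005
```

Dumping the parsed dict on both sides (yours and the authors') is a quick way to confirm the run really used the values shown above, e.g. that `num_classes` and `img_size` match the AMOS preprocessing.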