The official code for EH-Former (submitted to Information Fusion).
Please download the datasets through the link.
The project should be organized as follows:
./EH-Former/
├── data/
│   ├── train_image/
│   ├── train_label/
│   ├── test_image/
│   └── test_label/
├── loss_metrics.py
├── model/
├── dataloader.py
├── main.py
└── ...
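The actual data pipeline is defined in dataloader.py; as a rough, hypothetical sketch of how images under train_image/ could be matched with their masks under train_label/ (the helper name and the same-filename convention are assumptions, not taken from the repository):

```python
from pathlib import Path

def pair_samples(image_dir, label_dir):
    """Pair each image with the label file that shares its filename.

    Hypothetical helper for illustration only; the repository's
    dataloader.py defines the real loading logic.
    """
    image_dir, label_dir = Path(image_dir), Path(label_dir)
    pairs = []
    for image_path in sorted(image_dir.iterdir()):
        label_path = label_dir / image_path.name
        # Keep only images that have a matching label file
        if label_path.exists():
            pairs.append((image_path, label_path))
    return pairs
```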
Please download the weights trained on UDIAT, BUSI, and SYSU through the link.
Please load the weights as follows:
# load the checkpoint onto the target device
checkpoint = torch.load(load_path + '/seg_weights.pth', map_location=device)
# restore the model weights
net.load_state_dict(checkpoint['model_state_dict'], strict=True)
# restore the alpha parameters stored in the checkpoint
set_params_recursive(net, checkpoint['alpha'])
Train
python main.py --train_mode=True --gpu 0 --Cnet_path='your stage1 network path'
Test
python main.py --train_mode=False --test_mode=True --gpu 0
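main.py defines its own command-line interface; a minimal sketch of how boolean string flags such as --train_mode=True are typically parsed (the flag names come from the commands above, but the defaults and the str2bool helper are assumptions):

```python
import argparse

def str2bool(value):
    # argparse does not convert the strings 'True'/'False' to booleans,
    # so flags passed as --train_mode=True need an explicit converter
    return str(value).lower() in ('true', '1', 'yes')

def build_parser():
    parser = argparse.ArgumentParser(description='EH-Former train/test')
    parser.add_argument('--train_mode', type=str2bool, default=False)
    parser.add_argument('--test_mode', type=str2bool, default=False)
    parser.add_argument('--gpu', type=str, default='0')           # CUDA device id
    parser.add_argument('--Cnet_path', type=str, default=None)    # stage-1 network path
    return parser
```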
If you use this code, please cite the following paper. Thanks!
@article{qu2024eh,
title={EH-Former: Regional Easy-Hard-Aware Transformer for Breast Lesion Segmentation in Ultrasound Images},
author={Qu, Xiaolei and Zhou, Jiale and Jiang, Jue and Wang, Wenhan and Wang, Haoran and Wang, Shuai and Tang, Wenzhong and Lin, Xun},
journal={Information Fusion},
year={2024},
}