We are delighted to announce that our paper has been accepted by IEEE Transactions on Geoscience and Remote Sensing (TGRS 2024). Paper Link
We are also delighted to announce that, building on our proposed UANet, we won the double-track championship of the 2024 IEEE GRSS Data Fusion Contest.
The results on the three building datasets can be downloaded via Baidu Disk: Link (extraction code: UANE)
We have released the codes of our UANet based on four backbones (VGG, ResNet50, Res2Net-50, and PVT-v2-b2).
The whole training and testing framework of the paper has been released!
Our framework has been deployed into an application:
We have provided the pretrained backbones (ResNet-50, Res2Net-50, and PVT-v2-b2).
You can download them via Baidu Disk: Link (extraction code: abmg)
To train the UANet model, run:
CUDA_VISIBLE_DEVICES=0 python Code/train.py -c config/whubuilding/UANet.py
To test the trained model, run:
CUDA_VISIBLE_DEVICES=0 python Code/test.py -c config/whubuilding/UANet.py -o test_results/whubuilding/UANet/ --rgb
To test with Test-Time Augmentation (TTA), run:
python Code/test.py -c config/whubuilding/UANet.py -o test_results/whubuilding/UANet/ -t lr --rgb
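The `-t lr` option applies left-right flip augmentation at test time. As a minimal sketch of how flip-based TTA works (the `predict` function and the 2-D array layout here are illustrative assumptions, not the repository's actual API):

```python
import numpy as np

def tta_lr(predict, image):
    """Average predictions over an image and its left-right flip.

    `predict` maps an (H, W) array to an (H, W) score map; both the
    name and signature are illustrative stand-ins for the model call.
    """
    scores = predict(image)
    # Flip the image, predict, then flip the scores back so pixels align.
    scores_flipped = predict(image[:, ::-1])[:, ::-1]
    return (scores + scores_flipped) / 2.0
```

Because the flipped prediction is flipped back before averaging, a flip-invariant model produces the same map with or without TTA; for a real model the averaging smooths orientation-dependent errors.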
To resume training from a checkpoint, or to run multiple training sessions, follow this:
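As a hedged sketch, resuming usually means pointing the training script at the saved checkpoint. The `--resume` flag and checkpoint path below are hypothetical placeholders; check the argument parser in `Code/train.py` for the option it actually exposes:

```shell
# Hypothetical invocation: the --resume flag name and the checkpoint
# path are placeholders, not the repository's confirmed interface.
CUDA_VISIBLE_DEVICES=0 python Code/train.py \
  -c config/whubuilding/UANet.py \
  --resume model_weights/whubuilding/UANet/last.ckpt
```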
Our data processing and overall framework are based on BuildFormer. We sincerely thank the authors of that work.
We appreciate your attention to our work!
@ARTICLE{10418227,
author={Li, Jiepan and He, Wei and Cao, Weinan and Zhang, Liangpei and Zhang, Hongyan},
journal={IEEE Transactions on Geoscience and Remote Sensing},
title={UANet: An Uncertainty-Aware Network for Building Extraction From Remote Sensing Images},
year={2024},
volume={62},
number={},
pages={1-13},
keywords={Feature extraction;Uncertainty;Buildings;Data mining;Decoding;Remote sensing;Deep learning;Building extraction;remote sensing (RS);uncertainty-aware},
doi={10.1109/TGRS.2024.3361211}}