
Iris-LAHNet: A Lightweight Attention-guided High-resolution Network for Iris Segmentation and Localization

Introduction

This is the code for Iris-LAHNet. Iris-LAHNet is composed of a stem, a basic backbone, Pyramid Dilated Convolution (PDC) blocks, a Cascaded Attention-guided Feature Fusion Module (C-AGFM), and auxiliary heads. The basic backbone is a tiny high-resolution network. The PDC blocks and C-AGFM help extract multi-scale features from multi-resolution images and reduce noise. In addition, we introduce three auxiliary heads with edge heatmaps, whose auxiliary losses aid model training and strengthen attention on individual edge pixels, compensating for the tendency of multi-task training to neglect the localization task. Experiments on four datasets show that our model is the most lightweight while maintaining strong segmentation and localization results.

(Figure: overall architecture of Iris-LAHNet)
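
As a rough illustration of the PDC idea, here is a minimal PyTorch sketch of a pyramid dilated convolution block: parallel 3×3 convolutions with increasing dilation rates, concatenated and fused by a 1×1 convolution. The dilation rates, channel widths, and fusion scheme are illustrative assumptions, not the exact Iris-LAHNet implementation.

```python
# Sketch of a pyramid dilated convolution (PDC) block. Dilation rates and the
# concat + 1x1 fusion are assumptions for illustration, not the paper's exact design.
import torch
import torch.nn as nn

class PDCBlock(nn.Module):
    def __init__(self, channels, dilations=(1, 2, 4)):
        super().__init__()
        # One 3x3 branch per dilation rate; padding=d keeps the spatial size fixed.
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(channels, channels, 3, padding=d, dilation=d, bias=False),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
            )
            for d in dilations
        ])
        # Fuse the concatenated multi-scale features back to the input width.
        self.fuse = nn.Conv2d(channels * len(dilations), channels, 1)

    def forward(self, x):
        return self.fuse(torch.cat([branch(x) for branch in self.branches], dim=1))

x = torch.randn(1, 32, 64, 64)
print(PDCBlock(32)(x).shape)  # torch.Size([1, 32, 64, 64])
```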

Datasets and Main Results

We compare our model with other state-of-the-art methods on four public datasets; refer to IrisParseNet for the experimental settings.

Results on CASIA-Iris-Distance

Segmentation metrics: E1, mIoU, F1. Localization metrics: Inner, Outer, mHdis. A dash (—) means the value was not reported.

| Method | E1(%) | mIoU(%) | F1(%) | Inner(%) | Outer(%) | mHdis(%) | Params(M) | FLOPs(G) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| RTV-L1 | 0.68 | 78.25 | 87.55 | 0.7046 | 1.2457 | 0.9751 | — | — |
| MFCNs | 0.59 | — | 93.09 | — | — | — | 21.68 | 156.35 |
| U-Net | 0.56 | — | — | 0.6129 | 1.1478 | 0.8804 | 31.06 | 225.94 |
| CNNHT | 0.56 | 86.58 | 92.27 | 1.1973 | 2.0251 | 1.6112 | 61.87 | 144.79 |
| IrisParseNet | 0.41 | 89.53 | 94.25 | 0.6874 | 0.8662 | 0.7768 | 31.68 | 263.56 |
| HTU-Net | 0.43 | — | — | 0.5381 | 0.9702 | 0.7541 | 22.27 | 239.74 |
| Iris-LAHNet | 0.36 | 90.78 | 95.15 | 0.4915 | 0.8990 | 0.6953 | 0.27 | 5.57 |

Results on MICHE-I

(Figure: results on MICHE-I)

Results on UBIRIS.v2

(Figure: results on UBIRIS.v2)

Results on CASIA-Iris-Mobile-V1.0

(Figure: results on CASIA-Iris-Mobile-V1.0)

Requirements

Note: version numbers are for reference only; choose versions that suit your system. A rough sketch of a typical environment is given below.
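
This is a rough, unverified sketch only; the authors do not pin any versions here. PyTorch is implied by the .pth checkpoints and the CUDA_VISIBLE_DEVICES usage in the commands below; verify the rest against the code.

```
python>=3.7        # illustrative lower bound, not pinned by the authors
torch              # implied by the .pth checkpoints and CUDA_VISIBLE_DEVICES usage
torchvision
numpy
opencv-python      # commonly used for visualization/post-processing; confirm in postprocess.py
```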

Quick Start

Prepare the data

Thanks to the generous contributions of prior work, we adopt the datasets and related settings provided by IrisParseNet.

Prepare the models

Iris-LAHNet is trained separately on each dataset. All model weights are saved as logging.pth files under the ./LAHNet/experiments folder.
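
As a minimal sketch, you can inspect one of these checkpoints as follows; the MICHE path below is illustrative, and whether the file stores a bare state_dict or a wrapper dict is an assumption.

```python
# Peek inside a saved checkpoint. The path is illustrative; adjust to your layout.
import torch

state = torch.load('./LAHNet/experiments/MICHE/checkpoints/logging.pth', map_location='cpu')
print(type(state))
if isinstance(state, dict):
    print(list(state.keys())[:10])  # first few keys (parameter names or wrapper entries)
```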

Let's take training or testing on MICHE as an example.

Training

You can run training as follows:

```
CUDA_VISIBLE_DEVICES=gpuid python train.py
```

Note: you need to change the associated paths in train.py to your own paths. The trained model is saved in the corresponding ./~/checkpoints/ folder under ./LAHNet/experiments/. You can also change the save location and file name.
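
For example, to train on the first GPU (gpuid is the CUDA device index):

```
CUDA_VISIBLE_DEVICES=0 python train.py
```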

Test

You can run testing as follows:

```
CUDA_VISIBLE_DEVICES=gpuid python test.py
```

Note: you need to change the associated paths in test.py to your own paths. The results will be saved in the corresponding ./~/checkpoints/ folder under ./LAHNet/experiments/. You can also change the save location and file name.

Post-processing

Finally, the output can be visualized:

```
CUDA_VISIBLE_DEVICES=gpuid python postprocess.py
```

This step is optional, because the important results have already been saved in the checkpoints folder after running test.py.

If you want to experiment with other datasets, just change the dataset name in the corresponding .py files, as illustrated below.
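
As a purely hypothetical illustration (the actual variable name in the scripts may differ), the switch might look like:

```python
# Hypothetical dataset selector inside train.py / test.py (actual name may differ)
DATASET = 'MICHE'  # e.g. change to 'UBIRIS.v2', 'CASIA-Iris-Distance', or 'CASIA-Iris-Mobile-V1.0'
```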

Acknowledgement

This research is supported by the Fundamental Research Funds for the Central Universities under Grant No. N2105009 and by the National Natural Science Foundation of China under Grant No. 61703088.

Citation

If you use our code or models in your research, please cite:

Yan, Y., Wang, Q., Zhu, H. et al. Iris-LAHNet: a lightweight attention-guided high-resolution network for iris segmentation and localization. Multimedia Systems 30, 85 (2024). https://doi.org/10.1007/s00530-024-01280-5

If you have any questions, please contact us by email.