mkang315 / BGF-YOLO

[MICCAI'24] Official implementation of "BGF-YOLO: Enhanced YOLOv8 with Multiscale Attentional Feature Fusion for Brain Tumor Detection".

Official BGF-YOLO

This is the source code for the paper "BGF-YOLO: Enhanced YOLOv8 with Multiscale Attentional Feature Fusion for Brain Tumor Detection", early accepted by the 27th International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI 2024), of which I am the first author. The paper is available from Springer or arXiv.

Model

The BGF-YOLO (Bi-level routing attention, Generalized feature pyramid network, and Fourth detecting head You Only Look Once) model configuration (i.e., network construction) file is bgf-yolo.yaml in the directory ./models/bgf/.

The hyperparameter setting file is default.yaml in the directory ./yolo/cfg/.
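Since BGF-YOLO builds on YOLOv8, this file presumably follows the upstream Ultralytics configuration schema. For orientation, a few representative settings are sketched below as a Python dict; the key names come from the upstream YOLOv8 configuration and the values are illustrative upstream defaults, not verified against this repository's default.yaml:

# Illustrative hyperparameter overrides; keys follow the upstream
# Ultralytics YOLOv8 config schema, values are examples only.
overrides = {
    'epochs': 100,           # number of training epochs
    'batch': 16,             # images per batch
    'imgsz': 640,            # training image size
    'lr0': 0.01,             # initial learning rate
    'lrf': 0.01,             # final learning rate fraction
    'momentum': 0.937,       # SGD momentum
    'weight_decay': 0.0005,  # optimizer weight decay
}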

Installation

Install the dependencies listed in requirements.txt in a Python >= 3.8 environment, including PyTorch >= 1.8.

pip install -r requirements.txt
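To confirm the environment meets these requirements, a quick check using standard PyTorch calls:

python -c "import torch; print(torch.__version__, torch.cuda.is_available())"

This prints the installed PyTorch version and whether a CUDA device is visible.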

Training CLI

python yolo/bgf/detect/train.py
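For readers who prefer a programmatic entry point, here is a minimal sketch in the Ultralytics-style API this repository builds on. The import path, the dataset YAML path, and the hyperparameter values below are assumptions for illustration, not verified arguments of this fork:

# Hedged sketch: programmatic training with an Ultralytics-style API.
# The import path and the 'data' YAML path are assumptions, not repo-verified.
from ultralytics import YOLO

model = YOLO('models/bgf/bgf-yolo.yaml')  # build BGF-YOLO from its config file
model.train(data='datasets/br35h.yaml',   # hypothetical dataset YAML
            epochs=100, imgsz=640)        # example hyperparameters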

Testing CLI

python yolo/bgf/detect/predict.py
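Inference can be sketched the same way, continuing from the hedged training example above (the source path is a placeholder):

# Hedged sketch: run detection on a folder of images and save annotated outputs.
results = model.predict(source='datasets/br35h/images/test',  # placeholder path
                        conf=0.25,                            # confidence threshold
                        save=True)                            # write annotated images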

Evaluation

We trained and evaluated BGF-YOLO on the Br35H :: Brain Tumor Detection 2020 dataset. The .txt-format annotations in the file dataset-Br35H.zip were converted from the original JSON format.
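For readers reproducing the conversion, here is a generic sketch that turns COCO-style JSON boxes into YOLO .txt labels (one file per image). The actual Br35H JSON schema may differ, so the field names below are assumptions:

# Hedged sketch: convert COCO-style JSON annotations to YOLO .txt labels.
# Field names assume a COCO-like schema; Br35H's actual JSON may differ.
import json
from pathlib import Path

def coco_to_yolo(ann_file, out_dir):
    """Write one YOLO label file per image from a COCO-style annotation file."""
    data = json.loads(Path(ann_file).read_text())
    images = {im['id']: im for im in data['images']}
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for ann in data['annotations']:
        im = images[ann['image_id']]
        x, y, w, h = ann['bbox']  # COCO: top-left corner plus width/height, in pixels
        cx = (x + w / 2) / im['width']   # normalized box centre x
        cy = (y + h / 2) / im['height']  # normalized box centre y
        line = (f"{ann['category_id']} {cx:.6f} {cy:.6f} "
                f"{w / im['width']:.6f} {h / im['height']:.6f}\n")
        with (out / (Path(im['file_name']).stem + '.txt')).open('a') as f:
            f.write(line)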

Referencing Guide

Please cite our paper if you use code from this repository. Reference formats in several common styles are given below:

Plain Text

Kang, M., Ting, C.-M., Ting, F.F., Phan, R.C.-W.: BGF-YOLO: enhanced YOLOv8 with multiscale attentional feature fusion for brain tumor detection. In: Linguraru, M.G., et al. (eds.) MICCAI 2024. LNCS, vol. 15008, pp. 35–45. Springer, Cham (2024). https://doi.org/10.1007/978-3-031-72111-3_4

BibTeX Format

\begin{thebibliography}{1}
\bibitem{Kang24Bgfyolo} Kang, M., Ting, C.-M., Ting, F.F., Phan, R.C.-W.: BGF-YOLO: enhanced YOLOv8 with multiscale attentional feature fusion for brain tumor detection. In: Linguraru, M.G., et al. (eds.) MICCAI 2024. LNCS, vol. 15008, pp. 35--45. Springer, Cham (2024). {\UrlFont https://doi.org/10.1007/978-3-031-72111-3\_4}
\end{thebibliography}
@inproceedings{Kang24Bgfyolo,
author = "Kang, Ming and Ting, Chee-Ming and Ting, Fung Fung and Phan, Rapha{\"e}l C.-W.",
title = "BGF-YOLO: enhanced YOLOv8 with multiscale attentional feature fusion for brain tumor detection",
editor = "Linguraru, Marius George and et al.",
booktitle = "Medical Image Computing and Computer-Assisted Intervention – MICCAI 2024: 27th International Conference, Marrakesh, Morocco, October 6--10, 2024, Proceedings, Part VIII",
series = "Lecture Notes in Computer Science (LNCS)",
volume = "15008",
pages = "35--45",
publisher = "Springer",
address = "Cham",
year = "2024",
doi= "10.1007/978-3-031-72111-3_4",
url = "https://doi.org/10.1007/978-3-031-72111-3_4"
}
@inproceedings{Kang24Bgfyolo,
author = "Ming Kang and Chee-Ming Ting and Fung Fung Ting and Rapha{\"e}l C.-W. Phan",
title = "Bgf-yolo: Enhanced yolov8 with multiscale attentional feature fusion for brain tumor detection",
booktitle = "Proc. Int. Conf. Med. Image Comput. Comput. Assist. Interv. (MICCAI)",
address = "Marrakesh, Morocco, Oct. 6--10",
pages = "35--45",
year = "2024",
}

NOTE: If the LaTeX compiler produces an error, remove optional BibTeX fields such as series, volume, address, and url. Author names may need to be abbreviated manually if the .bst file (which defines the bibliography/reference style) does not abbreviate them automatically. The citation key Kang24Bgfyolo may become b1, bib1, or ref1 when references appear in numbered style in the order they are cited. The quotation mark pair "" around field values may be replaced with braces {}.

License

BGF-YOLO is released under the GNU Affero General Public License v3.0 (AGPL-3.0). Please see the LICENSE file for more information.

Copyright Notice

Much of the utility code in this project is based on the Ultralytics YOLOv8, GiraffeDet, DAMO-YOLO, and BiFormer repositories.