ultralytics / yolov5

YOLOv5 πŸš€ in PyTorch > ONNX > CoreML > TFLite
https://docs.ultralytics.com
GNU Affero General Public License v3.0

Evolved anchors return the same default anchors size #3051

Closed · kimsung-git closed this issue 3 years ago

kimsung-git commented 3 years ago

❔Question

As far as I understand, autoanchor evolves and finds the most optimal anchor sizes for a customized dataset, so we don't have to modify the anchor values by hand. However, after training from scratch on my customized dataset, the anchor sizes appear to be the same as those defined in yolov5l.yaml.

This is the command I used for training

nohup python -m torch.distributed.launch --nproc_per_node 4 \
    train.py \
    --batch-size 64 \
    --img 640 \
    --epochs 500 \
    --data airport.yaml \
    --project airport \
    --weights '' \
    --cfg models/yolov5l.yaml  & 

Also, this is the command I used to check the anchor sizes:

import torch

model = torch.load(opt.weight, map_location=None)  # load checkpoint dict (opt.weight = path to the trained .pt file)
m = model['model'].model[-1]  # Detect() layer
print(m.anchor_grid.squeeze())  # print anchors

And these are the anchor sizes:

tensor([[[ 10.,  13.],
         [ 16.,  30.],
         [ 33.,  23.]],

        [[ 30.,  61.],
         [ 62.,  45.],
         [ 59., 119.]],

        [[116.,  90.],
         [156., 198.],
         [373., 326.]]], device='cuda:0', dtype=torch.float16)

The anchor sizes didn't change; they are the same as the default values, which come from the COCO dataset. Does this mean autoanchor didn't work properly?

Additional context

Thank you for your work on this project.

github-actions[bot] commented 3 years ago

πŸ‘‹ Hello @kimsung-git, thank you for your interest in πŸš€ YOLOv5! Please visit our ⭐️ Tutorials to get started, where you can find quickstart guides for simple tasks like Custom Data Training all the way to advanced concepts like Hyperparameter Evolution.

If this is a πŸ› Bug Report, please provide screenshots and minimum viable code to reproduce your issue, otherwise we can not help you.

If this is a custom training ❓ Question, please provide as much information as possible, including dataset images, training logs, screenshots, and a public link to online W&B logging if available.

For business inquiries or professional support requests please visit https://www.ultralytics.com or email Glenn Jocher at glenn.jocher@ultralytics.com.

Requirements

Python 3.8 or later with all requirements.txt dependencies installed, including torch>=1.7. To install run:

$ pip install -r requirements.txt

Environments

YOLOv5 may be run in any of the following up-to-date verified environments (with all dependencies including CUDA/CUDNN, Python and PyTorch preinstalled):

Status

CI CPU testing

If this badge is green, all YOLOv5 GitHub Actions Continuous Integration (CI) tests are currently passing. CI tests verify correct operation of YOLOv5 training (train.py), testing (test.py), inference (detect.py) and export (export.py) on macOS, Windows, and Ubuntu every 24 hours and on every commit.

kimsung-git commented 3 years ago

After reading over the hyperparameter tuning tutorial, it seems that I need to uncomment the anchors parameter in hyp.scratch.yaml so that autoanchor overwrites the anchor information in yolov5l.yaml (i.e., so that autoanchor actually runs). So the reason I got the same anchor sizes is that the program just read the anchor values from yolov5l.yaml and never actually ran autoanchor.
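For anyone hitting the same thing, a quick way to confirm whether the anchors hyperparameter is actually set is to load the hyp file and print the key. This is just an illustrative sketch; the data/hyp.scratch.yaml path is an assumption, so point it at your own hyp file:

import yaml  # PyYAML

with open('data/hyp.scratch.yaml') as f:  # assumed path; use your own hyp file
    hyp = yaml.safe_load(f)

# None (or a missing key) means the anchors line is still commented out, so the
# anchors defined in the model yaml are kept (autoanchor only replaces them if
# their fit is poor); an integer forces autoanchor to derive that many new
# anchors per output layer.
print('anchors:', hyp.get('anchors'))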

I get a different autoanchor log with and without the anchors parameter in hyp.scratch.yaml.

With the anchors parameter:

autoanchor: Analyzing anchors... anchors/target = 0.32, Best Possible Recall (BPR) = 0.0839. Attempting to improve anchors, please wait...
autoanchor: WARNING: Extremely small objects found. 1302 of 490147 labels are < 3 pixels in size.
autoanchor: Running kmeans for 9 anchors on 490138 points...
autoanchor: thr=0.25: 0.9866 best possible recall, 4.08 anchors past thr
autoanchor: n=9, img_size=640, metric_all=0.279/0.665-mean/best, past_thr=0.473-mean: 17,22,  44,54,  55,120,  139,93,  102,213,  159,345,  290,208,  296,491,  527,350
autoanchor: Evolving anchors with Genetic Algorithm: fitness = 0.7008: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 1000/1000 [00:32<00:00, 30.69it/s]
autoanchor: thr=0.25: 0.9976 best possible recall, 4.34 anchors past thr
autoanchor: n=9, img_size=640, metric_all=0.292/0.701-mean/best, past_thr=0.473-mean: 10,15,  25,24,  22,55,  60,46,  50,118,  123,111,  92,228,  184,314,  422,370
autoanchor: New anchors saved to model. Update model *.yaml to use these anchors in the future.

Without the anchors parameter:

autoanchor: Analyzing anchors... anchors/target = 4.44, Best Possible Recall (BPR) = 0.9980

So the solution to my question is to make sure autoanchor runs by uncommenting the anchors parameter in hyp.scratch.yaml?

glenn-jocher commented 3 years ago

@kimsung-git autoanchor analyzes your anchors in the context of your dataset and training settings, and will evolve new anchors for you only if your current anchors are missing or are determined to be a poor fit based on the BPR metric.
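To illustrate the decision described above, here is a simplified, hypothetical sketch of the BPR-gated check (not the actual utils/autoanchor.py implementation): new anchors are only evolved when the best possible recall of the current anchors falls below a threshold.

import numpy as np

def best_possible_recall(anchors, wh, thr=4.0):
    # anchors: (A, 2) anchor widths/heights; wh: (N, 2) label widths/heights at the training image size
    r = wh[:, None] / anchors[None]     # (N, A, 2) size ratios
    x = np.minimum(r, 1 / r).min(2)     # worst-axis ratio for each label/anchor pair
    return (x.max(1) > 1 / thr).mean()  # fraction of labels matched by at least one anchor

def check_anchors_sketch(anchors, wh, bpr_thr=0.98):
    bpr = best_possible_recall(anchors, wh)
    if bpr > bpr_thr:
        return anchors  # current anchors fit the dataset well enough: keep them unchanged
    # otherwise a k-means + genetic-evolution step would propose new anchors (omitted here)
    raise NotImplementedError('run a kmean_anchors()-style evolution step')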

glenn-jocher commented 3 years ago

@kimsung-git also, if autoanchor completes with no error, then it is working correctly.

kimsung-git commented 3 years ago

@glenn-jocher what do you mean by "if your current anchors are missing"? Do you mean when the anchors parameter is not set in hyp.scratch.yaml? Thanks for the help!

glenn-jocher commented 3 years ago

@kimsung-git anchor definition is optional. Model yaml files support either explicit anchor definition or specifying an anchor count per layer. If your hyp file defines an anchor count, it will take precedence over your model yaml anchors, e.g.:

anchors: 3
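
For reference, the two forms mentioned above look like this once the model yaml is parsed into Python (the explicit values shown are the COCO defaults from yolov5l.yaml; this is only an illustration):

# Explicit anchor definition, one row of (w, h) pairs per output layer:
anchors_explicit = [
    [10, 13, 16, 30, 33, 23],        # P3/8
    [30, 61, 62, 45, 59, 119],       # P4/16
    [116, 90, 156, 198, 373, 326],   # P5/32
]

# Anchor count only: 3 anchors per output layer, sizes left for autoanchor to derive.
anchors_count = 3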
github-actions[bot] commented 3 years ago

πŸ‘‹ Hello @glenn-jocher, this issue has been automatically marked as stale because it has not had recent activity. Please note it will be closed if no further activity occurs.


Feel free to inform us of any other issues you discover or feature requests that come to mind in the future. Pull Requests (PRs) are also always welcomed!

Thank you for your contributions to YOLOv5 πŸš€ and Vision AI ⭐!