Open marvision-ai opened 4 years ago
I have a similar issue, actually. I'm using YOLOv4-tiny and all my calculated anchors are between 200 and 350. My objects will always be the same size and I'm not applying any rescaling augmentation. What would be the best practice for the mask indices? In the case of YOLOv4-tiny, I wouldn't have any anchors for my last [yolo] layer (none of mine are below 60). @AlexeyAB do you have any advice on this? Thanks in advance!
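One pragmatic option, which is my assumption rather than confirmed guidance: since every anchor is large, ignore the absolute 30x30/60x60 thresholds and split the six recalculated anchors evenly between YOLOv4-tiny's two [yolo] layers, with the three largest on the coarser-grid layer. A sketch of the two [yolo] sections, using hypothetical anchor values in place of the real recalculated ones:

```
# Both [yolo] layers list all six anchors, sorted ascending by area;
# `mask` selects which three each layer predicts with.
# The anchor values below are hypothetical placeholders.

[yolo]           # layer on the coarser grid: the three largest anchors
mask = 3,4,5
anchors = 200,210, 220,230, 250,260, 280,270, 310,300, 350,340
num = 6

[yolo]           # layer on the finer grid: the three smallest anchors
mask = 0,1,2
anchors = 200,210, 220,230, 250,260, 280,270, 310,300, 350,340
num = 6
```

Remember to keep `filters=(classes + 5)*3` in the [convolutional] layer immediately before each [yolo] layer, since each layer still uses three anchors.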
I also have a similar issue, but for full YOLOv4 with all three [yolo] layers.

I calculated anchors for my custom dataset and got the following nine (w,h) pairs:
`109,9, 66,17, 168,7, 179,11, 129,25, 290,21, 219,75, 539,49, 546,214`
Alexey suggests the following for anchors:

> Only if you are an expert in neural detection networks: recalculate anchors for your dataset for the width and height from your cfg-file, `darknet.exe detector calc_anchors data/obj.data -num_of_clusters 9 -width 416 -height 416`, then set the same 9 anchors in each of the 3 [yolo]-layers in your cfg-file. But you should change the indexes of the anchor masks (`mask=`) for each [yolo]-layer, so that for YOLOv4 the 1st [yolo]-layer has the anchors smaller than 30x30, the 2nd those smaller than 60x60, and the 3rd the remaining ones, and vice versa for YOLOv3. Also, you should change the `filters=(classes + 5)*<number of mask>` in the [convolutional] layers before each [yolo]-layer.
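The size rule in that quote can be sketched in Python (a sketch only, assuming "anchors smaller than 30x30" means anchor area below 30*30; the quote does not say whether it is area or per-dimension):

```python
# Sketch of the quoted mask-assignment rule for YOLOv4.
# Assumption: "smaller than 30x30" means anchor area < 30*30.
def group_anchors(anchors):
    """anchors: list of (w, h) pairs in the order calc_anchors prints
    them (ascending by area). Returns the three mask index groups:
    (1st [yolo]-layer, 2nd, 3rd)."""
    small, medium, large = [], [], []
    for i, (w, h) in enumerate(anchors):
        if w * h < 30 * 30:
            small.append(i)
        elif w * h < 60 * 60:
            medium.append(i)
        else:
            large.append(i)
    return small, medium, large

# Default YOLOv4 anchors as an illustration
default = [(12, 16), (19, 36), (40, 28), (36, 75), (76, 55),
           (72, 146), (142, 110), (192, 243), (459, 401)]
small, medium, large = group_anchors(default)
print(small, medium, large)
```

Note that even on the stock YOLOv4 anchors this strict area rule yields groups of 2/2/5 rather than the stock 3/3/3 masks, so the thresholds read more like a guideline than an exact recipe.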
I do not get the point "so for YOLOv4 the 1st-[yolo]-layer has anchors smaller than 30x30, 2nd smaller than 60x60, 3rd remaining, and vice versa for YOLOv3."
How should I set the `anchors` and `mask` variables in all three [yolo] layers of the cfg according to the anchor boxes I have generated?
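Since these nine anchors (already sorted ascending by area, the order `calc_anchors` prints) don't fit the 30x30/60x60 bands well, one option, which is my assumption rather than confirmed guidance, is an even 3/3/3 split by area. A sketch of the three [yolo] sections for yolov4.cfg, where the 1st [yolo] layer sits on the finest grid and takes the smallest anchors:

```
# Sketch only: every [yolo] layer carries the full `anchors` line;
# `mask` picks the three indices that layer predicts with.

[yolo]           # 1st [yolo] layer (finest grid): smallest anchors
mask = 0,1,2
anchors = 109,9, 66,17, 168,7, 179,11, 129,25, 290,21, 219,75, 539,49, 546,214
num = 9

[yolo]           # 2nd [yolo] layer
mask = 3,4,5
anchors = 109,9, 66,17, 168,7, 179,11, 129,25, 290,21, 219,75, 539,49, 546,214
num = 9

[yolo]           # 3rd [yolo] layer (coarsest grid): largest anchors
mask = 6,7,8
anchors = 109,9, 66,17, 168,7, 179,11, 129,25, 290,21, 219,75, 539,49, 546,214
num = 9
```

And keep `filters=(classes + 5)*3` in the [convolutional] immediately before each [yolo] layer, since each layer still uses three anchors.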
How does one input custom anchors that do not follow the same size bands as the original anchors (x<30, 30<x<60, and x>60)?
For reference, img size = 1280x736.

Original anchors:

This leaves you with:

New anchors:

```
./darknet detector calc_anchors object.data -num_of_clusters 9 -width 1280 -height 736 -show
```

What should the new masks be in the .cfg file?
I am not sure how to do this, because all of my new anchors are above 60x60, while the three [yolo] layers of the 3l model require x<30, 30<x<60, and x>60.
Thank you in advance!
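One workaround when every anchor lands above 60x60, as in the cases above, is to ignore the absolute thresholds entirely and split the anchors evenly by area across the [yolo] layers. This is my assumption of common practice, not something confirmed in this thread; a sketch:

```python
# Sketch: split anchors evenly by area into one mask group per
# [yolo] layer when the 30x30/60x60 bands are empty.
def even_masks(anchors, n_layers=3):
    """anchors: list of (w, h) pairs. Returns n_layers index groups,
    smallest-area group first (mask = 0,1,2 style)."""
    order = sorted(range(len(anchors)),
                   key=lambda i: anchors[i][0] * anchors[i][1])
    per = len(anchors) // n_layers
    return [order[k * per:(k + 1) * per] for k in range(n_layers)]

# Hypothetical anchors, all larger than 60x60
big = [(70, 70), (90, 80), (100, 110), (140, 120), (180, 150),
       (220, 200), (300, 260), (380, 330), (460, 420)]
print(even_masks(big))           # three groups of three for full YOLOv4
print(even_masks(big[:6], 2))    # two groups of three for YOLOv4-tiny
```

The returned index groups become the `mask=` lines, smallest group on the finest-grid [yolo] layer.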