ultralytics / yolov5

YOLOv5 πŸš€ in PyTorch > ONNX > CoreML > TFLite
https://docs.ultralytics.com
GNU Affero General Public License v3.0

About val the data #5463

Closed 523997931 closed 2 years ago

523997931 commented 2 years ago

Search before asking

Question

Hi, I want to know if I can run the val script for only certain classes. For example, I want to validate the pretrained model, but I only care about the mAP of person or car.

Additional

No response

github-actions[bot] commented 2 years ago

πŸ‘‹ Hello @523997931, thank you for your interest in YOLOv5 πŸš€! Please visit our ⭐️ Tutorials to get started, where you can find quickstart guides for simple tasks like Custom Data Training all the way to advanced concepts like Hyperparameter Evolution.

If this is a πŸ› Bug Report, please provide screenshots and minimum viable code to reproduce your issue, otherwise we cannot help you.

If this is a custom training ❓ Question, please provide as much information as possible, including dataset images, training logs, screenshots, and a public link to online W&B logging if available.

For business inquiries or professional support requests please visit https://ultralytics.com or email Glenn Jocher at glenn.jocher@ultralytics.com.

Requirements

Python>=3.6.0 with all requirements.txt dependencies installed, including PyTorch>=1.7. To get started:

$ git clone https://github.com/ultralytics/yolov5
$ cd yolov5
$ pip install -r requirements.txt
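As a quick sanity check before installing, a minimal sketch that compares your interpreter and PyTorch versions against the minimums above (the helper name is mine, not part of the repo):

```python
import sys

def meets_requirements(py_version, torch_version):
    # Compare (major, minor) tuples against Python>=3.6 and PyTorch>=1.7
    return tuple(py_version) >= (3, 6) and tuple(torch_version) >= (1, 7)

# e.g. check the running interpreter against an installed torch 1.11
print(meets_requirements(sys.version_info[:2], (1, 11)))
```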

Environments

YOLOv5 may be run in any of the following up-to-date verified environments (with all dependencies including CUDA/CUDNN, Python and PyTorch preinstalled):

- Google Colab and Kaggle notebooks with free GPU
- Google Cloud Deep Learning VM (see the GCP Quickstart Guide)
- Amazon Deep Learning AMI (see the AWS Quickstart Guide)
- Docker image (see the Docker Quickstart Guide)

Status

CI CPU testing

If this badge is green, all YOLOv5 GitHub Actions Continuous Integration (CI) tests are currently passing. CI tests verify correct operation of YOLOv5 training (train.py), validation (val.py), inference (detect.py) and export (export.py) on macOS, Windows, and Ubuntu every 24 hours and on every commit.

glenn-jocher commented 2 years ago

@523997931 python val.py --verbose will display per-class metrics.
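As a sketch of how one might then keep only the classes of interest, the snippet below filters a per-class table like the one `val.py --verbose` prints. The table text and numbers are fabricated for illustration; the real output's columns are Class, Images, Labels, P, R, mAP@.5, mAP@.5:.95.

```python
def filter_classes(verbose_table, wanted=("person", "car")):
    # Keep only the rows whose first column (the class name) is in `wanted`
    rows = []
    for line in verbose_table.strip().splitlines():
        parts = line.split()
        if parts and parts[0] in wanted:
            rows.append(parts)
    return rows

# Fabricated example table in the verbose-output layout
table = """
all     128  929  0.67  0.66  0.71  0.47
person  128  254  0.81  0.74  0.80  0.52
car     128   46  0.73  0.57  0.66  0.29
bicycle 128    6  0.53  0.33  0.47  0.27
"""
for row in filter_classes(table):
    print(row[0], "mAP@.5 =", row[5])
```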

523997931 commented 2 years ago

> @523997931 python val.py --verbose will display per-class metrics.

Thank you for your reply!

ZhouBay-TF commented 2 years ago

@glenn-jocher, hello, I want to get the values of P, R, mAP@.5, and mAP@.5:.95. When I use `python val.py --verbose`, I get 0 for all of them (see attached screenshot).

glenn-jocher commented 2 years ago

@xinkangzhou sure, then your mAP is zero.

github-actions[bot] commented 2 years ago

πŸ‘‹ Hello, this issue has been automatically marked as stale because it has not had recent activity. Please note it will be closed if no further activity occurs.


Feel free to inform us of any other issues you discover or feature requests that come to mind in the future. Pull Requests (PRs) are also always welcomed!

Thank you for your contributions to YOLOv5 πŸš€ and Vision AI ⭐!

Symbadian commented 2 years ago

val.py --verbose

Hi @glenn-jocher,

Forgive my ignorance, but I keep getting the error below when I try to visualize the new classes that were added during training, running the validation script as you directed with python val.py --verbose.

None of the new classes appear in the verbose validation output; only the 80 classes from the original dataset are shown. I followed the provided examples to the best of my knowledge, but I still cannot get this to work.

Can you help me analyze this, please? I have modified all the configurations as best I can, but this error keeps appearing, and I cannot work out what it means no matter how much I search!

(base) Symbadian-MacBook-Pro:yolov5 symbadian$ python3 val.py --verbose
val: data=data/coco128.yaml, weights=yolov5s.pt, batch_size=32, imgsz=640, conf_thres=0.001, iou_thres=0.6, task=val, device=, workers=8, single_cls=False, augment=False, verbose=True, save_txt=False, save_hybrid=False, save_conf=False, save_json=False, project=runs/val, name=exp, exist_ok=False, half=False, dnn=False
YOLOv5 πŸš€ 2022-5-7 torch 1.11.0 CPU

Fusing layers...
YOLOv5s summary: 213 layers, 7225885 parameters, 0 gradients
This is data in the val.py file {'path': '../datasets/coco128', 'train': ['/Users/symbadian/Desktop/LIRIS_new_dataset/datasets/coco128/images/train', '/Users/symbadian/Desktop/LIRIS_new_dataset/datasets/coco128/images/train2017'], 'val': ['/Users/symbadian/Desktop/LIRIS_new_dataset/datasets/coco128/images/val', '/Users/symbadian/Desktop/LIRIS_new_dataset/datasets/coco128/images/train2017'], 'test': None, 'nc': 106, 'names': ['person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'traffic light', 'fire hydrant', 'stop sign', 'parking meter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sports ball', 'kite', 'baseball bat', 'baseball glove', 'skateboard', 'surfboard', 'tennis racket', 'bottle', 'wine glass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hot dog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'potted plant', 'bed', 'dining table', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cell phone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddy bear', 'hair drier', 'toothbrush', 'Beating', 'CuttingInKitchen', 'Discus_Amgst_ppl', 'Discuss_Wr_OnBrd', 'DiscusW_Give_Item', 'Enter_Room', 'Fencing', 'Fighting', 'Gun_wp_Deploy', 'Hand_Shake', 'Knife_Deploy', 'Leaves_Room', 'Make_Answer_Tel', 'Nunchucks', 'Pick_Up_Object', 'Put_Down_Object', 'Put_Item_In_Draw', 'Reading', 'Shooting', 'Stabbing', 'SumoWrestling', 'Typing_On_KeyBrd', 'Unlock_Dr_Leaves', 'Unlock_Rm_Enters', 'Unlock_Rm_Unsucc', 'Walking']}
This is nc from val.py file 106
This is ncm from val.py file 80
Traceback (most recent call last):
  File "/Users/symbadian/Desktop/LIRIS_new_dataset/yolov5/val.py", line 399, in <module>
    main(opt)
  File "/Users/symbadian/Desktop/LIRIS_new_dataset/yolov5/val.py", line 372, in main
    run(**vars(opt))
  File "/Users/symbadian/miniforge3/lib/python3.9/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "/Users/symbadian/Desktop/LIRIS_new_dataset/yolov5/val.py", line 170, in run
    assert ncm == nc, f'{weights[0]} ({ncm} classes) trained on different --data than what you passed ({nc} ' \
TypeError: 'PosixPath' object is not subscriptable
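The final TypeError in that traceback comes from indexing a single pathlib.Path as if it were a list. A minimal sketch reproducing it, with an illustrative normalization (the variable name mirrors the traceback, but the fix shown is only a sketch, not the repo's code):

```python
from pathlib import Path

weights = Path("yolov5s.pt")  # a bare Path, as in the failing run

try:
    weights[0]  # raises the same TypeError as the traceback above
except TypeError as e:
    print(e)

# Illustrative fix: wrap a single path in a list before indexing
if isinstance(weights, (str, Path)):
    weights = [weights]
print(weights[0])
```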

glenn-jocher commented 2 years ago

@Symbadian πŸ‘‹ hi, thanks for letting us know about this possible problem with YOLOv5 πŸš€. We've created a few short guidelines below to help users provide what we need in order to start investigating a possible problem.

How to create a Minimal, Reproducible Example

When asking a question, people will be better able to provide help if you provide code that they can easily understand and use to reproduce the problem. This is referred to by community members as creating a minimum reproducible example. Your code that reproduces the problem should be:

- βœ… Minimal – Use as little code as possible that still produces the same problem
- βœ… Complete – Provide all parts someone else needs to reproduce your problem
- βœ… Reproducible – Test the code you're about to provide to make sure it reproduces the problem

For Ultralytics to provide assistance your code should also be:

- βœ… Current – Verify that your code is up-to-date with the current GitHub master, and if necessary git pull or git clone a new copy to ensure your problem has not already been resolved
- βœ… Unmodified – Your problem must be reproducible using official YOLOv5 code without changes

If you believe your problem meets all the above criteria, please close this issue and raise a new one using the πŸ› Bug Report template with a minimum reproducible example to help us better understand and diagnose your problem.

Thank you! πŸ˜ƒ