meituan / YOLOv6

YOLOv6: a single-stage object detection framework dedicated to industrial applications.
GNU General Public License v3.0

EOFError: Ran out of input #209

Closed. Wisdom2wisdom closed this issue 2 years ago.

Wisdom2wisdom commented 2 years ago

```
C:\D_installation_packet\Anaconda\installion_package\envs\yolov6\python.exe "E:/executable_code/YOLO/yoloV62022/YOLOv6-main _7_1/YOLOv6-main/tools/train.py"
training args are: Namespace(batch_size=4, check_images=False, check_labels=False, conf_file='../configs/yolov6s_finetune.py', data_path='../data/my_data.yaml', device='0', dist_url='env://', epochs=50, eval_final_only=False, eval_interval=20, gpu_count=0, heavy_eval_range=50, img_size=640, local_rank=-1, name='exp', output_dir='./runs/train', rank=-1, resume=None, workers=2, world_size=1)
```

```
Using 1 GPU for training...
Train: Final numbers of valid images: 52/ labels: 52. 0.0s for dataset initialization.
Convert to COCO format 100%|██████████| 36/36 [00:00<00:00, 36002.61it/s]
Convert to COCO format finished. Resutls saved in E:\executable_code\YOLO\yoloV62022\YOLOv6-main _7_1\YOLOv6-main\data\mydata\yolo\annotations\instances_val.json
Val: Final numbers of valid images: 36/ labels: 36. 0.1s for dataset initialization.
Loading state_dict from E:\executable_code\YOLO\yoloV62022\YOLOv6-main _7_1\YOLOv6-main\weights/yolov6s.pt for fine-tuning...
Traceback (most recent call last):
  File "E:/executable_code/YOLO/yoloV62022/YOLOv6-main _7_1/YOLOv6-main/tools/train.py", line 94, in <module>
    main(args)
  File "E:/executable_code/YOLO/yoloV62022/YOLOv6-main _7_1/YOLOv6-main/tools/train.py", line 83, in main
    trainer = Trainer(args, cfg, device)
  File "E:\executable_code\YOLO\yoloV62022\YOLOv6-main _7_1\YOLOv6-main\yolov6\core\engine.py", line 42, in __init__
    model = self.get_model(args, cfg, self.num_classes, device)
  File "E:\executable_code\YOLO\yoloV62022\YOLOv6-main _7_1\YOLOv6-main\yolov6\core\engine.py", line 247, in get_model
    model = load_state_dict(weights, model, map_location=device)
  File "E:\executable_code\YOLO\yoloV62022\YOLOv6-main _7_1\YOLOv6-main\yolov6\utils\checkpoint.py", line 13, in load_state_dict
    ckpt = torch.load(weights, map_location=map_location)
  File "C:\D_installation_packet\Anaconda\installion_package\envs\yolov6\lib\site-packages\torch\serialization.py", line 593, in load
    return _legacy_load(opened_file, map_location, pickle_module, **pickle_load_args)
  File "C:\D_installation_packet\Anaconda\installion_package\envs\yolov6\lib\site-packages\torch\serialization.py", line 762, in _legacy_load
    magic_number = pickle_module.load(f, **pickle_load_args)
EOFError: Ran out of input
```
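For context: the failure happens inside `torch.load` while unpickling the checkpoint, and `EOFError: Ran out of input` at that point almost always means the `.pt` file on disk is empty or truncated (for example, from an interrupted download), rather than a bug in the training code. A minimal pre-load sanity check, using the path from the log above (a sketch, not part of YOLOv6):

```python
import os
import torch

# Path copied from the log above; adjust to your setup.
weights = r"E:\executable_code\YOLO\yoloV62022\YOLOv6-main _7_1\YOLOv6-main\weights\yolov6s.pt"

# An empty or truncated file is what makes pickle raise "EOFError: Ran out of input".
size = os.path.getsize(weights)  # raises FileNotFoundError if the path is wrong
print(f"checkpoint size: {size} bytes")
assert size > 0, "checkpoint is empty; re-download yolov6s.pt"

ckpt = torch.load(weights, map_location="cpu")  # should now succeed
print(type(ckpt))
```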

Wisdom2wisdom commented 2 years ago

Can you help me?

mtjhl commented 2 years ago

Hi, in your weights path E:\executable_code\YOLO\yoloV62022\YOLOv6-main _7_1\YOLOv6-main\weights/yolov6s.pt, maybe you can change the last /yolov6s.pt to \yolov6s.pt and try running the code again.
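A way to sidestep mixed separators entirely is to build the path with pathlib, which inserts the right separator for the current OS; a small sketch, with the directory taken from the log above:

```python
from pathlib import Path

weights_dir = Path(r"E:\executable_code\YOLO\yoloV62022\YOLOv6-main _7_1\YOLOv6-main\weights")
weights = weights_dir / "yolov6s.pt"  # pathlib joins with the OS-appropriate separator

print(weights)           # backslash-separated path on Windows
print(weights.exists())  # sanity-check the file before passing it to torch.load
```

Note that Python on Windows generally accepts forward slashes in file paths, so if normalizing the separator does not help, a missing or zero-byte weights file is the more likely cause of the EOFError.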

Wisdom2wisdom commented 2 years ago

> Hi, in your weights path E:\executable_code\YOLO\yoloV62022\YOLOv6-main _7_1\YOLOv6-main\weights/yolov6s.pt, maybe you can change the last /yolov6s.pt to \yolov6s.pt and try running the code again.

I have changed it to \yolov6s.pt, but I still get the same error. Is there any other way?

mtjhl commented 2 years ago

How about changing the path to E:\\executable_code\\YOLO\\yoloV62022\\YOLOv6-main _7_1\\YOLOv6-main\\weights\\yolov6s.pt on Windows? And if you set the pretrained path to None, can you train normally?
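If training starts normally with the pretrained path set to None, the weights file itself is the problem, and re-downloading it is the usual fix. A sketch of that check; the release asset URL below is an assumption, so verify it on https://github.com/meituan/YOLOv6/releases first:

```python
import urllib.request
import torch

# Assumed release asset URL; confirm the exact tag and file name on the releases page.
url = "https://github.com/meituan/YOLOv6/releases/download/0.1.0/yolov6s.pt"
dst = r"E:\executable_code\YOLO\yoloV62022\YOLOv6-main _7_1\YOLOv6-main\weights\yolov6s.pt"

urllib.request.urlretrieve(url, dst)  # overwrite the possibly truncated checkpoint

# If the original EOFError came from an empty or partial download, this load now works.
ckpt = torch.load(dst, map_location="cpu")
print(ckpt.keys() if isinstance(ckpt, dict) else type(ckpt))
```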