Tramac / awesome-semantic-segmentation-pytorch

Semantic Segmentation on PyTorch (including FCN, PSPNet, Deeplabv3, Deeplabv3+, DANet, DenseASPP, BiSeNet, EncNet, DUNet, ICNet, ENet, OCNet, CCNet, PSANet, CGNet, ESPNet, LEDNet, DFANet)

Has anyone else run into this when training on their own dataset: the loss is extremely low, and the pixel accuracy and mIoU are exactly the same for every test image? See below. #105

Open anqin5211314 opened 4 years ago

anqin5211314 commented 4 years ago

2019-11-20 11:16:10,422 semantic_segmentation INFO: Iters: 880/139700 || Lr: 0.000099 || Loss: 0.0033 || Cost Time: 0:06:46 || Estimated Time: 17:46:33
2019-11-20 11:16:15,154 semantic_segmentation INFO: Iters: 890/139700 || Lr: 0.000099 || Loss: 0.0026 || Cost Time: 0:06:50 || Estimated Time: 17:46:48
2019-11-20 11:16:19,786 semantic_segmentation INFO: Iters: 900/139700 || Lr: 0.000099 || Loss: 0.0047 || Cost Time: 0:06:55 || Estimated Time: 17:46:44
2019-11-20 11:16:24,414 semantic_segmentation INFO: Iters: 910/139700 || Lr: 0.000099 || Loss: 0.0116 || Cost Time: 0:07:00 || Estimated Time: 17:46:44
2019-11-20 11:16:49,455 semantic_segmentation INFO: Start validation, Total sample: 114
2019-11-20 11:16:58,204 semantic_segmentation INFO: Sample: 1, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:04,222 semantic_segmentation INFO: Sample: 2, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:04,642 semantic_segmentation INFO: Sample: 3, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:05,219 semantic_segmentation INFO: Sample: 4, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:05,779 semantic_segmentation INFO: Sample: 5, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:06,346 semantic_segmentation INFO: Sample: 6, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:06,908 semantic_segmentation INFO: Sample: 7, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:07,470 semantic_segmentation INFO: Sample: 8, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:08,031 semantic_segmentation INFO: Sample: 9, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:08,584 semantic_segmentation INFO: Sample: 10, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:09,134 semantic_segmentation INFO: Sample: 11, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:09,682 semantic_segmentation INFO: Sample: 12, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:10,076 semantic_segmentation INFO: Sample: 13, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:10,621 semantic_segmentation INFO: Sample: 14, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:11,180 semantic_segmentation INFO: Sample: 15, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:11,576 semantic_segmentation INFO: Sample: 16, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:12,127 semantic_segmentation INFO: Sample: 17, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:12,690 semantic_segmentation INFO: Sample: 18, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:13,259 semantic_segmentation INFO: Sample: 19, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:13,659 semantic_segmentation INFO: Sample: 20, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:14,219 semantic_segmentation INFO: Sample: 21, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:14,797 semantic_segmentation INFO: Sample: 22, validation pixAcc: 100.000, mIoU: 50.000

anqin5211314 commented 4 years ago

@Tramac Awesome author, please point me in the right direction, thanks.

Tramac commented 4 years ago

Maybe there is something wrong with the label.
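For anyone seeing the same symptom (near-zero loss with pixAcc 100.000 and mIoU 50.000 on every validation sample), a common cause is that the loaded masks effectively contain only one class, e.g. every pixel is background. Below is a minimal, hypothetical sanity check on the label files themselves; the glob path is a placeholder for your own label directory:

```python
import glob

import numpy as np
from PIL import Image

# Hypothetical path to your own training labels; adjust to your layout.
label_paths = sorted(glob.glob('datasets/mydata/train/label/*.png'))

for path in label_paths[:5]:
    mask = np.array(Image.open(path))
    print(path, 'unique label values:', np.unique(mask))

# If every mask reports a single value (e.g. only 0, or only 255), the network
# effectively sees one class: the loss collapses toward zero and pixAcc/mIoU
# come out identical for every validation image, as in the log above.
```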

swjtulinxi commented 4 years ago

Hi, if I want to use my own dataset, which dataloader.py should I use? Please let me know when you see this question.

Tramac commented 4 years ago

Hi, if I want to use my own dataset, which dataloader.py should I use? Please let me know when you see this question.

It depends on your dataset directory structure.

swjtulinxi commented 4 years ago

Your code is designed for datasets like Cityscapes, VOC 2012, and so on, but I want to use my own dataset. Its layout is train/val/test, with each split containing img and label folders.

Tramac commented 4 years ago

You can refer to the script cityscapes.py; it is similar to your file structure.
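For a layout with train/val/test splits and separate image and label folders, a custom loader can follow the same general pattern as cityscapes.py: pair each image with its mask by filename and return one (image, mask) pair per index. A minimal sketch under that assumption; the class name, folder names, and number of classes are placeholders:

```python
import os

import numpy as np
import torch
from PIL import Image
from torch.utils.data import Dataset


class MySegmentation(Dataset):
    """Sketch of a custom dataset: images in <root>/<split>/img,
    masks with matching filenames in <root>/<split>/label."""
    NUM_CLASS = 2  # adjust to your own number of classes

    def __init__(self, root='datasets/mydata', split='train', transform=None):
        self.transform = transform
        img_dir = os.path.join(root, split, 'img')
        mask_dir = os.path.join(root, split, 'label')
        self.images = sorted(os.path.join(img_dir, f) for f in os.listdir(img_dir))
        self.masks = sorted(os.path.join(mask_dir, f) for f in os.listdir(mask_dir))
        assert len(self.images) == len(self.masks), 'image/mask count mismatch'

    def __getitem__(self, index):
        img = Image.open(self.images[index]).convert('RGB')
        mask = Image.open(self.masks[index])
        if self.transform is not None:
            img = self.transform(img)
        # Masks must contain class indices (0 .. NUM_CLASS-1), not RGB colors.
        mask = torch.from_numpy(np.array(mask, dtype=np.int64))
        return img, mask

    def __len__(self):
        return len(self.images)
```

The repo's own loaders additionally apply synchronized crops and flips to image and mask during training; if you need augmentation, add it the same way cityscapes.py does.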

swjtulinxi commented 4 years ago

from core.nn import _C

ImportError: cannot import name '_C'. There is no _C in the files.

swjtulinxi commented 4 years ago

Hi, according to your results, why are they lower than the results in the paper? Do you have any explanation?

YangYangGirl commented 4 years ago

I ran into the same problem. Could you tell me how you solved it? Thanks.

Kittywyk commented 4 years ago


Hi, I ran into trouble when trying to train on my own dataset. I've already written a dataloader file and renamed it to mydata.py, for example, and I also added the name "mydata" as an argument in the code. But when I ran the command "python train.py --model fcn32s --backbone vgg16 --dataset mydata --lr 0.01 --epochs 50", this error occurred: train.py: error: argument --dataset: invalid choice: 'mydata' (choose from 'pascal_voc', 'pascal_aug', 'ade20k', 'citys', 'sbu'). Do you know what I should modify? Thanks a million for your help!
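That particular error comes from argparse itself: if --dataset is declared with a fixed choices list (which the message implies), the new name has to be added to that list in train.py's parse_args(), in addition to registering the loader. A small, self-contained sketch of how such a definition behaves; 'mydata' is the hypothetical custom name:

```python
import argparse

# Sketch of a --dataset argument with a fixed choices list, as the error above implies.
parser = argparse.ArgumentParser(description='sketch of train.py parse_args()')
parser.add_argument('--dataset', type=str, default='citys',
                    choices=['pascal_voc', 'pascal_aug', 'ade20k', 'citys', 'sbu', 'mydata'],
                    help="dataset name; 'mydata' is the hypothetical new entry")
args = parser.parse_args(['--dataset', 'mydata'])
print(args.dataset)  # accepted only once the name appears in the choices list
```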

Tramac commented 4 years ago

Please check your dataloader.

Kittywyk commented 4 years ago

Sorry to bother you again, but I feel there's something wrong with how I added the choice, because the error says there's no "mydata" dataset choice. Apart from adding the argument in the parse_args() function of train.py, where else do I need to make changes? Sincerely looking forward to your reply, thanks!

Kittywyk commented 4 years ago

Hi, I've solved the problem. It was indeed an issue with my dataloader, thanks!

yao123yuhui commented 2 years ago

Hi, I've solved the problem. It was indeed an issue with my dataloader, thanks!

Please, how did you solve the problem?

Kittywyk commented 2 years ago

Please, how did you solve the problem?

Sorry, I've forgotten the details a bit because I haven't used this project for a while. But if I recall correctly, I added my customized class to the dataloader/__init__.py file and it worked. Best.
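For completeness, a sketch of what that registration might look like, assuming the repository's pattern of a datasets dict in core/data/dataloader/__init__.py that get_segmentation_dataset uses to look names up; the .mydata module, the MySegmentation class, and the 'mydata' key are hypothetical:

```python
# Sketch of core/data/dataloader/__init__.py after registering a custom dataset,
# assuming the repo's existing dict-based registry; only the 'mydata' lines are new.
from .mydata import MySegmentation  # hypothetical module containing the custom class

datasets = {
    # ... existing entries ('pascal_voc', 'pascal_aug', 'ade20k', 'citys', 'sbu') stay as-is ...
    'mydata': MySegmentation,  # new entry so --dataset mydata can be resolved
}


def get_segmentation_dataset(name, **kwargs):
    """Instantiate a dataset class by its registered name."""
    return datasets[name.lower()](**kwargs)
```

Together with the choices list in train.py mentioned earlier in the thread, --dataset mydata should then be accepted end to end.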