zxytim / fast-gmm

a Gaussian Mixture Model (GMM) implementation.
MIT License

Sorry, can I ask some questions about EAST training here? #2

Closed. blankWorld closed this issue 6 years ago.

blankWorld commented 7 years ago

Hi, have you tried PVANet as the base network? I tried PVANet with Caffe but ran into an overfitting problem. My training set is 950 images from the ICDAR 2015 training set (the other 50 images are used as a validation set) plus 229 images from ICDAR 2013. The model is trained with online data augmentation, which includes scaling and rotations between ±30°. The IoU loss overfits badly: while the training IoU loss drops to 0.25, the validation IoU loss stays as high as 0.7. I have double-checked everything I can think of and still cannot solve this problem. Please help me, Mr. Tim! I have already spent two months on this problem. All my thanks to you!
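(For context, a minimal sketch of the kind of online scale/rotation augmentation described above, assuming OpenCV and NumPy; the function name, scale choices, and the omission of box-coordinate handling are illustrative assumptions, not the asker's actual pipeline.)

```python
import cv2
import numpy as np

def random_scale_rotate(image, max_angle=30, scales=(0.5, 1.0, 2.0)):
    """Randomly rescale and rotate an image for online augmentation."""
    scale = float(np.random.choice(scales))
    angle = np.random.uniform(-max_angle, max_angle)
    image = cv2.resize(image, None, fx=scale, fy=scale)
    h, w = image.shape[:2]
    # Rotate around the image center; ground-truth text box coordinates
    # would need the same transform applied (omitted here).
    rot = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), angle, 1.0)
    return cv2.warpAffine(image, rot, (w, h))
```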

zxytim commented 7 years ago

Hi World, you'd better submit this issue at https://github.com/argman/EAST rather than in this "fast-gmm" repo. Nonetheless, you can try playing around with weight decay, e.g., add a weight decay of 1e-6 while training. If that does not help, use a larger weight decay, such as 1e-5.
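(In Caffe this is just the `weight_decay` field in `solver.prototxt`. Below is a rough TensorFlow 1.x-style sketch of the same idea, an explicit L2 penalty added to the task loss; the helper name and the filter on `weights`/`kernel` variable names are assumptions for illustration, not the EAST code's actual implementation.)

```python
import tensorflow as tf  # TF 1.x style

def add_weight_decay(model_loss, decay=1e-6):
    """Add an L2 penalty on trainable weights to the task loss.

    Start with decay=1e-6 and bump it to 1e-5 if overfitting persists.
    """
    # Biases are typically excluded; filter weight tensors by name.
    weights = [v for v in tf.trainable_variables()
               if 'weights' in v.name or 'kernel' in v.name]
    l2_penalty = decay * tf.add_n([tf.nn.l2_loss(v) for v in weights])
    return model_loss + l2_penalty
```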

blankWorld commented 7 years ago

OK, 3q~~ I will try more weight decay in my training. One last question: do you use the "not care" texts in your training?

zxytim commented 7 years ago

No. The loss for "do not care" areas is muted.
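(A rough sketch of what muting the loss in "do not care" regions could look like, assuming a per-pixel loss map and a binary training mask; the names are illustrative assumptions, not necessarily the exact variables used in the EAST code.)

```python
import tensorflow as tf

def masked_mean_loss(per_pixel_loss, training_mask):
    """Average the loss only over pixels outside "do not care" regions.

    training_mask is 1.0 for pixels that should contribute to the loss
    and 0.0 inside don't-care text regions, so those terms vanish.
    """
    masked = per_pixel_loss * training_mask
    return tf.reduce_sum(masked) / (tf.reduce_sum(training_mask) + 1e-6)
```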

BTW, only Chinese can understand "3q"

blankWorld commented 7 years ago

Haha, yes. Reading your paper gave me a lot to think about: such a simple pipeline can still achieve results like this. I plan to dig into it a bit more. By the way, I wish you all the best in your career.