MichalBusta / E2E-MLT

E2E-MLT - an Unconstrained End-to-End Method for Multi-Language Scene Text
MIT License

loss always -2.00 #18

Closed. WasFly closed this issue 5 years ago.

WasFly commented 5 years ago

When I train on my own dataset, the values of `loss` and `seg_loss` are always equal to -2.00, and the other values are 0. Is there a problem? I have not changed anything except the dataset and the batch size (32), and I train from scratch.

MichalBusta commented 5 years ago

One of the most important parameters is `input_size` (the size of the crops fed to the network): bigger is better.

 - you can use the debug option of the script to inspect the data you are feeding. The situation (probably) is that you are feeding "empty" data (crops that contain no text instances).
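
For reference, a minimal sketch of such a sanity check. It assumes a training list of image paths with, next to each image, a `.txt` file containing one ground-truth box per line; adjust the parsing to whatever annotation format your loader actually expects.

```python
# Hypothetical check for "empty" training samples (assumed file layout:
# one image path per line in the train list, annotations in <image>.txt
# with one ground-truth box per non-empty line).
import os
import sys

def count_gt_boxes(img_path):
    gt_path = os.path.splitext(img_path)[0] + '.txt'
    if not os.path.exists(gt_path):
        return 0
    with open(gt_path, encoding='utf-8') as f:
        return sum(1 for line in f if line.strip())

def check_train_list(list_path):
    with open(list_path, encoding='utf-8') as f:
        images = [line.strip() for line in f if line.strip()]
    empty = 0
    for img in images:
        if count_gt_boxes(img) == 0:
            empty += 1
            print('no ground truth for:', img)
    print('%d / %d samples have no annotations' % (empty, len(images)))

if __name__ == '__main__':
    check_train_list(sys.argv[1])  # e.g. python check_gt.py train_list.txt
```

If a large fraction of samples report no annotations, the network sees mostly empty crops and the losses stay flat, which matches the -2.00 you are seeing.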


WasFly commented 5 years ago

Yes, my problem, sorry.