liuwei16 / CSP

High-level Semantic Feature Detection: A New Perspective for Pedestrian Detection, CVPR, 2019

The loss values from train_Caltech are higher than the authors' #102

Closed T109318049 closed 3 years ago

T109318049 commented 3 years ago

When I train the Caltech model with the code from https://github.com/dominikandreas/CSP, the recorded loss values are very high. I suspect the abnormal values come from cls_center, as shown below, but I have no idea how to solve this issue. The records from train_Caltech are:

```
total loss  cls       regr_h    offset
0.146088    0.115322  0.009933  0.020833
0.098287    0.087741  0.002239  0.008307
0.071353    0.062760  0.001668  0.006925
0.081938    0.073539  0.001938  0.006461
0.073801    0.065529  0.001339  0.006932
0.072089    0.064259  0.001209  0.006621
0.067687    0.059619  0.001296  0.006772
0.064946    0.057368  0.001091  0.006488
0.061354    0.054006  0.001062  0.006286
0.061342    0.053803  0.001177  0.006362
0.061981    0.054710  0.001086  0.006185
0.058011    0.050864  0.001055  0.006091
0.055380    0.048392  0.001013  0.005975
0.059276    0.052155  0.001003  0.006118
0.051897    0.045055  0.000858  0.005984
0.052011    0.045238  0.000839  0.005935
0.055235    0.048431  0.000995  0.005809
0.053121    0.046175  0.000810  0.006136
0.054537    0.047586  0.001055  0.005896
0.050895    0.044234  0.000816  0.005845
0.051259    0.044498  0.000852  0.005909
0.050222    0.043484  0.000812  0.005926
0.049202    0.042674  0.000820  0.005707
0.047568    0.041040  0.000794  0.005733
0.045857    0.039506  0.000708  0.005643
0.049599    0.043301  0.000772  0.005527
0.046297    0.040108  0.000747  0.005442
0.045776    0.039386  0.000682  0.005708
0.047359    0.041113  0.000751  0.005495
0.044618    0.038325  0.000724  0.005569
0.044197    0.037804  0.000705  0.005688
0.043063    0.036738  0.000680  0.005645
0.044264    0.038139  0.000707  0.005418
0.041878    0.035751  0.000642  0.005485
0.044452    0.038305  0.000699  0.005448
0.042483    0.036077  0.000710  0.005695
0.044659    0.038537  0.000726  0.005396
0.041108    0.035030  0.000636  0.005442
0.040449    0.034342  0.000665  0.005442
0.040523    0.034651  0.000586  0.005286
0.040542    0.034645  0.000655  0.005242
0.039925    0.033994  0.000614  0.005317
0.040263    0.034208  0.000649  0.005406
0.039177    0.033443  0.000596  0.005138
0.037185    0.031399  0.000609  0.005177
0.041520    0.035570  0.000676  0.005274
0.039260    0.033446  0.000643  0.005171
0.038458    0.032674  0.000604  0.005180
0.038988    0.033193  0.000586  0.005209
0.038271    0.032380  0.000574  0.005317
0.037493    0.031569  0.000649  0.005276
0.038233    0.032616  0.000627  0.004990
0.037209    0.031312  0.000611  0.005286
0.038365    0.032619  0.000623  0.005123
0.036683    0.030863  0.000534  0.005286
0.036462    0.030927  0.000547  0.004988
0.035886    0.030214  0.000565  0.005108
0.037315    0.031443  0.000540  0.005332
0.035665    0.029956  0.000580  0.005128
0.036883    0.031298  0.000569  0.005016
0.036175    0.030355  0.000578  0.005241
0.034284    0.028385  0.000520  0.005379
0.034460    0.028736  0.000503  0.005221
0.034561    0.028825  0.000544  0.005192
0.035108    0.029526  0.000535  0.005047
0.033300    0.027814  0.000516  0.004970
0.035685    0.030010  0.000626  0.005049
0.034709    0.029089  0.000552  0.005067
0.032829    0.027182  0.000552  0.005095
0.034950    0.029222  0.000549  0.005179
0.035355    0.029635  0.000521  0.005199
0.033097    0.027575  0.000506  0.005016
0.034409    0.028776  0.000539  0.005094
0.034374    0.028723  0.000560  0.005091
0.033462    0.028019  0.000502  0.004941
0.031856    0.026466  0.000501  0.004889
0.034407    0.028888  0.000542  0.004977
0.033637    0.028005  0.000525  0.005106
0.032011    0.026455  0.000506  0.005049
0.032594    0.027016  0.000507  0.005072
0.031713    0.026079  0.000506  0.005128
0.032270    0.026873  0.000550  0.004847
0.033212    0.027798  0.000508  0.004907
0.033275    0.027806  0.000561  0.004908
0.031593    0.026026  0.000524  0.005043
0.033019    0.027241  0.000543  0.005235
0.031242    0.025833  0.000487  0.004922
0.033085    0.027534  0.000545  0.005007
0.032539    0.027065  0.000514  0.004960
0.031214    0.025757  0.000496  0.004960
0.031400    0.026045  0.000475  0.004880
0.030927    0.025401  0.000532  0.004995
0.031312    0.025998  0.000474  0.004839
0.031789    0.026305  0.000497  0.004988
0.031681    0.026299  0.000518  0.004863
0.029155    0.023938  0.000507  0.004711
0.028076    0.022962  0.000447  0.004667
0.030467    0.025188  0.000475  0.004803
0.031895    0.026434  0.000499  0.004961
0.030741    0.025282  0.000505  0.004954
0.030523    0.025147  0.000490  0.004887
0.031197    0.025962  0.000495  0.004740
0.028096    0.022867  0.000463  0.004767
0.029270    0.023728  0.000469  0.005072
0.030056    0.024727  0.000470  0.004859
0.029694    0.024355  0.000457  0.004883
0.029383    0.024017  0.000482  0.004884
0.029807    0.024387  0.000506  0.004914
0.030971    0.025645  0.000495  0.004831
0.031964    0.026566  0.000506  0.004892
0.030898    0.025541  0.000504  0.004853
0.029556    0.024205  0.000474  0.004878
0.028333    0.023096  0.000453  0.004784
0.028542    0.023254  0.000438  0.004850
0.028839    0.023586  0.000491  0.004762
0.029530    0.024289  0.000471  0.004770
0.030640    0.025408  0.000454  0.004779
0.030001    0.024737  0.000458  0.004806
0.029234    0.023925  0.000518  0.004791
0.028570    0.023454  0.000488  0.004627
```
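For anyone comparing these numbers against the paper's, a small helper to regroup the flattened log into per-iteration rows may be useful. This is my own sketch, not part of the CSP repo: `parse_loss_log` is a hypothetical name, and it assumes the log is a flat whitespace-separated stream in the column order above (total loss, cls, regr_h, offset).

```python
# Sketch: regroup a flat whitespace-separated loss log into rows of four
# values (total loss, cls, regr_h, offset). Hypothetical helper, not from
# the CSP codebase.
def parse_loss_log(text):
    values = [float(tok) for tok in text.split()]
    # Drop any trailing partial row, then group into 4-tuples.
    usable = len(values) - len(values) % 4
    return [tuple(values[i:i + 4]) for i in range(0, usable, 4)]

sample = "0.146088 0.115322 0.009933 0.020833 0.098287 0.087741 0.002239 0.008307"
for total, cls_loss, regr_h, offset in parse_loss_log(sample):
    print(f"total={total:.6f} cls={cls_loss:.6f} "
          f"regr_h={regr_h:.6f} offset={offset:.6f}")
```

With the rows recovered, it is easy to plot each column separately and see whether the cls term alone accounts for the gap.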

Here are the versions of the relevant modules: Python 3.6, Keras 2.0.8, TensorFlow 1.14.0, py-OpenCV 3.4.2.
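As a side note, a quick way to gather the environment details for a bug report like this is a small version-reporting helper. This is a generic sketch, not part of the CSP repo; it only reports whichever of these packages happen to be importable.

```python
# Sketch: report Python and package versions for a bug report.
# Generic helper, unrelated to the CSP codebase itself.
import sys

def report_versions():
    lines = [f"Python: {sys.version.split()[0]}"]
    for name in ("keras", "tensorflow", "cv2"):
        try:
            mod = __import__(name)
            lines.append(f"{name}: {getattr(mod, '__version__', 'unknown')}")
        except ImportError:
            lines.append(f"{name}: not installed")
    return "\n".join(lines)

print(report_versions())
```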

If anyone knows what the reason might be, please let me know. Thanks!