chensong1995 / HybridPose

HybridPose: 6D Object Pose Estimation under Hybrid Representation (CVPR 2020)
MIT License

training hybridpose for a single object from linemod dataset #90

Closed · monajalal closed this issue 11 months ago

monajalal commented 11 months ago

Is this command correct for training HybridPose on a single object from the LineMOD dataset (here, ape)? If not, could you please provide instructions for training HybridPose on a single LineMOD object?

I am doubtful because it starts by printing `Testing...`.

(hybridpose) mona@ada:~/HybridPose$ LD_LIBRARY_PATH=lib/regressor:$LD_LIBRARY_PATH python src/train_core.py --object_name ape
number of model parameters: 12959563
Testing...
Loss: 6.4378
Epoch: [0][0/19]    Time: 0.572 (0.572) Sym: 10.7587 (10.7587)  Mask: 0.5649 (0.5649)   Pts: 0.3546 (0.3546)    Graph: 21.1528 (21.1528)    Total: 7.3022 (7.3022)
Epoch: [0][1/19]    Time: 0.183 (0.378) Sym: 10.4726 (10.6156)  Mask: 0.5510 (0.5580)   Pts: 0.3118 (0.3332)    Graph: 20.4260 (20.7894)    Total: 6.7586 (7.0304)
Epoch: [0][2/19]    Time: 0.132 (0.296) Sym: 10.2695 (10.5002)  Mask: 0.5586 (0.5582)   Pts: 0.3548 (0.3404)    Graph: 21.6556 (21.0781)    Total: 7.2992 (7.1200)
Epoch: [0][3/19]    Time: 0.173 (0.265) Sym: 11.5906 (10.7728)  Mask: 0.5551 (0.5574)   Pts: 0.2514 (0.3182)    Graph: 19.7165 (20.7377)    Total: 6.2000 (6.8900)
Epoch: [0][4/19]    Time: 0.222 (0.257) Sym: 7.9189 (10.2020)   Mask: 0.5550 (0.5569)   Pts: 0.2373 (0.3020)    Graph: 18.5854 (20.3073)    Total: 5.5789 (6.6278)
Epoch: [0][5/19]    Time: 0.250 (0.255) Sym: 10.3511 (10.2269)  Mask: 0.5424 (0.5545)   Pts: 0.2365 (0.2911)    Graph: 19.2703 (20.1344)    Total: 5.8692 (6.5013)
Epoch: [0][6/19]    Time: 0.277 (0.259) Sym: 9.6017 (10.1376)   Mask: 0.5385 (0.5522)   Pts: 0.2088 (0.2793)    Graph: 20.8593 (20.2380)    Total: 5.6727 (6.3830)
Epoch: [0][7/19]    Time: 0.225 (0.254) Sym: 12.4763 (10.4299)  Mask: 0.5183 (0.5480)   Pts: 0.2121 (0.2709)    Graph: 23.2340 (20.6125)    Total: 6.2099 (6.3613)
Epoch: [0][8/19]    Time: 0.137 (0.241) Sym: 9.9908 (10.3811)   Mask: 0.5213 (0.5450)   Pts: 0.1938 (0.2623)    Graph: 21.8367 (20.7485)    Total: 5.6416 (6.2814)
Epoch: [0][9/19]    Time: 0.203 (0.237) Sym: 10.1475 (10.3578)  Mask: 0.5050 (0.5410)   Pts: 0.2179 (0.2579)    Graph: 18.4582 (20.5195)    Total: 5.5444 (6.2077)
Epoch: [0][10/19]   Time: 0.267 (0.240) Sym: 10.7277 (10.3914)  Mask: 0.4962 (0.5369)   Pts: 0.2318 (0.2555)    Graph: 20.6083 (20.5276)    Total: 5.9474 (6.1840)
Epoch: [0][11/19]   Time: 0.216 (0.238) Sym: 10.6716 (10.4147)  Mask: 0.4945 (0.5334)   Pts: 0.2079 (0.2515)    Graph: 20.1640 (20.4973)    Total: 5.6572 (6.1401)
Epoch: [0][12/19]   Time: 0.190 (0.234) Sym: 8.2689 (10.2497)   Mask: 0.4896 (0.5300)   Pts: 0.1882 (0.2467)    Graph: 19.4374 (20.4157)    Total: 5.1420 (6.0633)
Epoch: [0][13/19]   Time: 0.155 (0.229) Sym: 11.5563 (10.3430)  Mask: 0.4858 (0.5269)   Pts: 0.2012 (0.2434)    Graph: 21.1966 (20.4715)    Total: 5.7733 (6.0426)
Epoch: [0][14/19]   Time: 0.181 (0.226) Sym: 9.4827 (10.2857)   Mask: 0.4796 (0.5237)   Pts: 0.1893 (0.2398)    Graph: 20.0233 (20.4416)    Total: 5.3236 (5.9947)
Epoch: [0][15/19]   Time: 0.191 (0.223) Sym: 10.3745 (10.2912)  Mask: 0.4747 (0.5207)   Pts: 0.1917 (0.2368)    Graph: 20.5782 (20.4502)    Total: 5.4869 (5.9629)
Epoch: [0][16/19]   Time: 0.158 (0.220) Sym: 10.1667 (10.2839)  Mask: 0.4728 (0.5178)   Pts: 0.1783 (0.2334)    Graph: 22.6935 (20.5821)    Total: 5.5415 (5.9381)
Epoch: [0][17/19]   Time: 0.240 (0.221) Sym: 9.5571 (10.2435)   Mask: 0.4582 (0.5145)   Pts: 0.1871 (0.2308)    Graph: 20.3408 (20.5687)    Total: 5.3185 (5.9037)
Epoch: [0][18/19]   Time: 0.126 (0.216) Sym: 9.7067 (10.2262)   Mask: 0.4603 (0.5128)   Pts: 0.1724 (0.2289)    Graph: 18.7729 (20.5108)    Total: 5.0324 (5.8756)
Epoch: [1][0/19]    Time: 0.198 (0.198) Sym: 10.3333 (10.3333)  Mask: 0.4550 (0.4550)   Pts: 0.1718 (0.1718)    Graph: 20.8354 (20.8354)    Total: 5.2897 (5.2897)
Epoch: [1][1/19]    Time: 0.234 (0.216) Sym: 8.6401 (9.4867)    Mask: 0.4459 (0.4505)   Pts: 0.1651 (0.1685)    Graph: 22.2583 (21.5469)    Total: 5.1870 (5.2384)
Epoch: [1][2/19]    Time: 0.227 (0.220) Sym: 10.5686 (9.8473)   Mask: 0.4388 (0.4466)   Pts: 0.1731 (0.1700)    Graph: 20.7408 (21.2782)    Total: 5.3007 (5.2592)
Epoch: [1][3/19]    Time: 0.223 (0.220) Sym: 12.1516 (10.4234)  Mask: 0.4363 (0.4440)   Pts: 0.1882 (0.1745)    Graph: 23.5865 (21.8552)    Total: 5.8917 (5.4173)
Epoch: [1][4/19]    Time: 0.136 (0.204) Sym: 11.4421 (10.6272)  Mask: 0.4430 (0.4438)   Pts: 0.1606 (0.1718)    Graph: 21.6162 (21.8074)    Total: 5.3553 (5.4049)
Epoch: [1][5/19]    Time: 0.256 (0.212) Sym: 11.2052 (10.7235)  Mask: 0.4236 (0.4404)   Pts: 0.1628 (0.1703)    Graph: 21.7234 (21.7934)    Total: 5.3448 (5.3949)
Epoch: [1][6/19]    Time: 0.177 (0.207) Sym: 11.1202 (10.7802)  Mask: 0.4167 (0.4370)   Pts: 0.1628 (0.1692)    Graph: 19.3867 (21.4496)    Total: 5.0954 (5.3521)
Epoch: [1][7/19]    Time: 0.141 (0.199) Sym: 12.6337 (11.0119)  Mask: 0.4218 (0.4351)   Pts: 0.1665 (0.1689)    Graph: 21.4015 (21.4436)    Total: 5.4901 (5.3693)
Epoch: [1][8/19]    Time: 0.206 (0.200) Sym: 10.4312 (10.9473)  Mask: 0.4093 (0.4323)   Pts: 0.1619 (0.1681)    Graph: 18.8918 (21.1601)    Total: 4.9605 (5.3239)
Epoch: [1][9/19]    Time: 0.121 (0.192) Sym: 11.8991 (11.0425)  Mask: 0.4211 (0.4312)   Pts: 0.1485 (0.1661)    Graph: 20.1577 (21.0598)    Total: 5.1121 (5.3027)
Epoch: [1][10/19]   Time: 0.170 (0.190) Sym: 11.1424 (11.0516)  Mask: 0.4067 (0.4289)   Pts: 0.1485 (0.1645)    Graph: 21.1836 (21.0711)    Total: 5.1244 (5.2865)
Epoch: [1][11/19]   Time: 0.200 (0.191) Sym: 10.0255 (10.9661)  Mask: 0.4028 (0.4268)   Pts: 0.1495 (0.1633)    Graph: 21.2908 (21.0894)    Total: 5.0296 (5.2651)
Epoch: [1][12/19]   Time: 0.215 (0.193) Sym: 11.2518 (10.9881)  Mask: 0.3952 (0.4243)   Pts: 0.1328 (0.1609)    Graph: 19.7814 (20.9888)    Total: 4.8262 (5.2314)
chensong1995 commented 11 months ago

Hi Mona,

Thanks for your question! The command is correct. I believe the `Testing...` output comes from the debugging logic you added in the other thread.

I hope this helps! Let me know if you have further concerns.

monajalal commented 11 months ago

Screenshot from 2023-12-13 13-48-49

It was part of the original code. Do you know why `Testing...` is printed if I am training?

chensong1995 commented 11 months ago

Hi Mona,

Thanks for the follow-up. The call to `print()` is indeed part of the original code. However, in the other thread I recommended adding `trainer.test(0)` in `train_core.py` before training, and that call is what produces the `Testing...` output.

I hope this helps! Let me know if you have further concerns.
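To illustrate the behavior discussed above, here is a minimal sketch of why `Testing...` appears before any epoch logs when `trainer.test(0)` is invoked ahead of the training loop. This is a simplified stand-in, not HybridPose's actual trainer: the class body, the placeholder loss value, and the log format are illustrative assumptions only.

```python
# Illustrative sketch only -- a simplified stand-in for HybridPose's trainer.
# The real implementation lives in the repository; everything here is assumed.
class Trainer:
    def test(self, epoch):
        # Evaluation pass: this is where the "Testing..." line gets printed.
        print('Testing...')
        loss = 6.4378  # placeholder loss value for illustration
        print('Loss: {:.4f}'.format(loss))
        return loss

    def train(self, epoch):
        # Normal per-epoch training log.
        print('Epoch: [{}][0/19]'.format(epoch))


trainer = Trainer()
trainer.test(0)            # debugging call added before training:
                           # prints "Testing..." first, before any epoch log
for epoch in range(2):
    trainer.train(epoch)   # the regular training loop follows
```

Removing the `trainer.test(0)` call (or moving it after the loop) would make the log start directly with the `Epoch: [0][0/19]` lines, matching the original training script's behavior.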