Closed qhykwsw closed 6 years ago
Hi, sorry to bother you again, but I ran into something strange that shouldn't happen. After making the change mentioned above, the code runs normally. However, when I finished my training process, I found that the comref results were better than the com results on the three datasets. Logically speaking, the results after com refinement should be better than without the refinement. What do you think?
Yes, the results with comref should be better than with com. We reported this improvement in our paper.
Oh, sorry, I wrote that wrongly. In fact, the comref results are worse than the com results on the three datasets, by nearly 10~20%.
That is strange, it should not be like that. Did you also train the comref networks?
Yeah, I trained the comref networks. For the ICVL dataset, I first ran main_icvl_com_refine.py and got net_ICVL_COM_AUGMENT.pkl. Then I changed "comref = None docom = False" to "comref = "./eval/NYU_COM_AUGMENT/net_NYU_COM_AUGMENT.pkl" docom = True"
Well, do the detection errors decrease while training the comref?
In the code you pasted, you mixed up NYU and ICVL; you need a separate comref network for each dataset.
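For illustration, a dataset-matched configuration for the ICVL run could be derived from the dataset name, so the NYU pickle can never end up in the ICVL script. This is only a sketch; the path layout is an assumption based on the file names quoted in this thread:

```python
# Hypothetical: the comref pickle must match the dataset being evaluated.
# For the ICVL scripts, point to the ICVL refinement network, not the NYU one.
dataset = "ICVL"  # assumed selector, for illustration only
comref = "./eval/{0}_COM_AUGMENT/net_{0}_COM_AUGMENT.pkl".format(dataset)
docom = True
```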
Oh, sorry, that was careless of me. I pasted the wrong code; in fact, I didn't mix up NYU and ICVL. And the detection errors do decrease during training.
OK, this is good then. Though I do not see why the joint error would not decrease with a better detection.
Sorry, after reading your code again I finally found the reason: I made another silly mistake. In fact, I didn't use the 'com' mode at all. The better result came from the 'groundtruth' mode and the worse result came from the 'comref' mode. This makes sense.
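For clarity, the three settings discussed in this thread map onto the two variables roughly as follows. This is a sketch based only on the snippets quoted above; the exact semantics of each flag combination are an assumption, not the authoritative deep-prior-pp configuration:

```python
# Hypothetical summary of the modes mentioned in this thread.
# 'comref' holds a path to a refinement-network pickle, or None.

# 'groundtruth' mode: crop around the ground-truth hand location:
comref, docom = None, False

# 'com' mode: use the detected center of mass directly:
comref, docom = None, True

# 'comref' mode: refine the detected center of mass with the trained net:
comref = "./eval/ICVL_COM_AUGMENT/net_ICVL_COM_AUGMENT.pkl"  # assumed path
docom = True
```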
Hello, I found that the code's default mode is "com". I want to use the "comref" mode, so I changed "comref = None docom = False" to "comref = "./eval/NYU_COM_AUGMENT/net_NYU_COM_AUGMENT.pkl" docom = True"
Then I got an error:

Traceback (most recent call last):
  File "main_nyu_posereg_embedding.py", line 55, in <module>
    Seq1 = di.loadSequence('train', shuffle=True, rng=rng, docom=docom)
  File "/home/hyqian/projects/deep-prior-pp/src/data/importers.py", line 975, in loadSequence
    self.loadRefineNetLazy(self.refineNet)
  File "/home/hyqian/projects/deep-prior-pp/src/data/importers.py", line 180, in loadRefineNetLazy
    numJoints=1, nDims=3)
  File "/home/hyqian/projects/deep-prior-pp/src/net/scalenet.py", line 132, in __init__
    raise NotImplementedError("not implemented")
NotImplementedError: not implemented
After tracing the source, I found the root cause of the problem on line 179 of importers.py:
"comrefNetParams = ScaleNetParams(type=5, nChan=1, wIn=128, hIn=128, batchSize=1, resizeFactor=2, numJoints=1, nDims=3)"
I changed "type=5" to "type=1". Did I do the right thing, or am I still missing something?
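The traceback arises from a type dispatch: the network architecture selected by the 'type' argument is not implemented for that configuration, so the constructor raises. A minimal, purely illustrative sketch of that pattern follows (this is not the actual scalenet.py code, and the set of implemented types here is made up for illustration):

```python
# Hypothetical simplification of how an unsupported 'type' value can
# surface as NotImplementedError in a constructor like ScaleNetParams.
IMPLEMENTED_TYPES = {1}  # assumption for illustration only

def make_params(type, **kwargs):
    if type not in IMPLEMENTED_TYPES:
        # mirrors scalenet.py line 132 in the traceback above
        raise NotImplementedError("not implemented")
    return dict(type=type, **kwargs)

# type=1 builds fine; type=5 would raise, as in the traceback
params = make_params(type=1, nChan=1, wIn=128, hIn=128,
                     batchSize=1, resizeFactor=2, numJoints=1, nDims=3)
```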