bbepoch / HoiTransformer

This is the code for HOI Transformer
Apache License 2.0

The test results on HICO-Det and V-COCO after training for 150 epochs are very low; the model doesn't seem to converge. #26

Closed truetone2022 closed 3 years ago

truetone2022 commented 3 years ago

I used 32 A100 GPUs with a batch size of 6 per GPU, a learning rate of 12e-4, and a backbone learning rate of 12e-5; all other settings are the same. How can I get the correct result? Any helpful advice is appreciated! [image]
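The numbers above look like an application of the linear learning-rate scaling rule: with 32 GPUs at batch size 6 per GPU, the total batch is 192, and if the reference setting were a total batch of 16 at lr 1e-4 (a DETR-style default, assumed here rather than taken from this repo), scaling linearly gives exactly the 12e-4 reported. A minimal sketch of that arithmetic, with the base values being assumptions:

```python
def scale_lr(base_lr, base_batch, new_batch):
    # Linear scaling rule: grow the learning rate in proportion
    # to the total (effective) batch size.
    return base_lr * new_batch / base_batch

# Assumed reference setting (not confirmed for this repo):
# lr 1e-4 at total batch size 16.
total_batch = 32 * 6  # 32 GPUs x 6 samples per GPU = 192
scaled = scale_lr(1e-4, 16, total_batch)
print(scaled)  # ~1.2e-3, i.e. the 12e-4 reported above
```

Whether the linear rule holds at a 12x scale-up is a separate question; large scale-ups often also need warmup, and transformer-based detectors can be sensitive to it, which may be one reason the maintainer suggests reproducing the default setting first.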

bbepoch commented 3 years ago

I suggest you first run experiments with the exact default settings, and only change them once you get the expected results. Otherwise, there may be an environment-related problem to track down.