kakaobrain / hotr

Official repository for HOTR: End-to-End Human-Object Interaction Detection with Transformers (CVPR'21, Oral Presentation)
Apache License 2.0

pretrained detr on hicodet #14

Closed SISTMrL closed 3 years ago

SISTMrL commented 3 years ago

Hello, have you tried fine-tuning the whole DETR architecture on the HICO-DET dataset? I would like to know the performance gap compared to DETR without fine-tuning.

bmsookim commented 3 years ago

@SISTMrL Our evaluation showed that pre-training the detector on the HICO-DET dataset and then fine-tuning it yields an mAP over 26, but we did not include it because the training procedure is unnecessarily inefficient (we obtained comparable performance by jointly training only the final FFN layer of the detector).
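
Roughly, fine-tuning only the final FFN of the detector can be sketched like this (treat it as a rough sketch rather than our exact training code; the module names `backbone`, `input_proj`, `query_embed`, `transformer`, `class_embed`, and `bbox_embed` are assumptions borrowed from DETR's naming):

```python
# Rough sketch, not the exact training code: freeze the shared DETR body
# and leave only its final FFN heads (class_embed / bbox_embed) trainable.
# Module names follow DETR's conventions and are assumptions here.
DETR_BODY_PREFIXES = ("backbone", "input_proj", "query_embed", "transformer")

def freeze_detector_except_ffn(model):
    for name, param in model.named_parameters():
        if name.startswith(DETR_BODY_PREFIXES):
            param.requires_grad = False  # detector body stays frozen
        # class_embed / bbox_embed and the HOTR-specific interaction
        # modules keep requires_grad=True and are trained jointly
```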

We can include the pre-trained weights in our repo, but that will be in October, since my resources and I are currently occupied with preparing our next paper! Because our detector follows the exact parameter names and structure of DETR, you can fine-tune it yourself and feed the weights via the --resume option!
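
For example, something along these lines should work for merging a fine-tuned DETR checkpoint into an HOTR checkpoint before resuming (the file paths and checkpoint key layout below are placeholders, so adjust them to your setup):

```python
import torch

# Rough sketch: since the detector keeps DETR's parameter names, weights
# from a DETR checkpoint fine-tuned on HICO-DET can be copied into an
# already-built HOTR model's state dict. Paths are placeholders.
detr_ckpt = torch.load("detr_finetuned_hicodet.pth", map_location="cpu")
hotr_state = model.state_dict()  # `model` is a constructed HOTR model

# Copy only the parameters whose names and shapes match the detector part.
matched = {k: v for k, v in detr_ckpt["model"].items()
           if k in hotr_state and v.shape == hotr_state[k].shape}
hotr_state.update(matched)
model.load_state_dict(hotr_state)

# Save the merged weights and pass the file to training via --resume.
torch.save({"model": hotr_state}, "hotr_with_finetuned_detr.pth")
```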

SISTMrL commented 3 years ago

@meliketoy thanks very much!

widedh commented 1 year ago

Hi,

Can I get the final models hico_q16.pth and vcoco_q16.pth, please?