NaJaeMin92 / FixBi

Official code for the CVPR 2021 paper "FixBi: Bridging Domain Spaces for Unsupervised Domain Adaptation"

Pretrained baseline weights #14

Open Hayoung93 opened 2 years ago

Hayoung93 commented 2 years ago

@NaJaeMin92 Hi, I read FixBi with interest and want to reproduce some of its results.

Can you provide the .pt files? https://github.com/NaJaeMin92/FixBi/blob/fdf370d803ac2d5293035104b76642c812ac00a7/src/utils.py#L36-L42 (net.pt, head.pt, classifier.pt)

You mentioned this repo for pretrained weights, but it seems the two repos have different network architectures (so the weights cannot be loaded even if I train using the mentioned repo).

Should I edit the network architecture of DANN to match this repo, train it on the Office-31 dataset, and use the result as the pretrained baseline weights?
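
To clarify what I mean, this is roughly the layout I would save (a minimal sketch, assuming src/utils.py loads three separate state dicts named net.pt, head.pt, and classifier.pt; the ResNet-50 backbone, bottleneck size, and module splits below are my guesses, not this repo's actual classes):

```python
# Hedged sketch only: the component definitions and file layout are assumptions
# that would need to be checked against src/utils.py (lines 36-42).
import torch
import torch.nn as nn
from torchvision import models

num_classes = 31  # Office-31

# Assumed baseline split: ResNet-50 feature extractor, bottleneck head, linear classifier.
resnet = models.resnet50(pretrained=True)
net = nn.Sequential(*list(resnet.children())[:-1])                    # backbone -> (N, 2048, 1, 1)
head = nn.Sequential(nn.Flatten(1), nn.Linear(2048, 256), nn.ReLU())  # bottleneck
classifier = nn.Linear(256, num_classes)

# ... train the DANN baseline on Office-31 here ...

# Save each component separately so the repo's loading code can pick them up.
torch.save(net.state_dict(), "net.pt")
torch.save(head.state_dict(), "head.pt")
torch.save(classifier.state_dict(), "classifier.pt")
```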

Thank you for your work and I will wait for your response.

Arsiuuu commented 1 year ago

Hi, have you found the pretrained models?

Hayoung93 commented 1 year ago

@Eureka-JTX Unfortunately, no. I excluded the weight-loading part and tried to train from scratch, but it failed (I assume the pretrained weights are necessary; note that the authors wrote "we start to train our networks with pretrained baseline weights" in the paper).

I could try training those weights myself, but I do not have much motivation for that, so I am currently trying to reproduce other domain adaptation papers' results instead.

Arsiuuu commented 1 year ago

@Hayoung93 Thanks for your patient answer! I have reproduced MSTN from DSBN (https://github.com/wgchang/DSBN), but I only obtained the weights of the encoder and discriminator rather than the head and classifier used in this repo, so my result is even lower than DSBN's.
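
In case it is useful, this is roughly how I tried to reuse the DSBN/MSTN encoder weights (a rough sketch: the checkpoint file name and dict layout are placeholders, and DSBN's parameter names do not line up with this repo, so the filtering below is mostly guesswork):

```python
# Hedged sketch: transfer the encoder weights that match, drop the discriminator,
# and leave head/classifier randomly initialized.
import torch
import torch.nn as nn
from torchvision import models

# Backbone built the same way as in the sketch above (shapes are assumptions).
resnet = models.resnet50(pretrained=False)
net = nn.Sequential(*list(resnet.children())[:-1])
head = nn.Sequential(nn.Flatten(1), nn.Linear(2048, 256), nn.ReLU())
classifier = nn.Linear(256, 31)

# Hypothetical DSBN/MSTN checkpoint; the real file name and key layout differ.
ckpt = torch.load("mstn_checkpoint.pth", map_location="cpu")
encoder_state = ckpt.get("model", ckpt)  # assumed checkpoint structure

# Keep only encoder entries whose names and shapes match this backbone;
# discriminator weights are simply dropped.
net_state = net.state_dict()
matched = {k: v for k, v in encoder_state.items()
           if k in net_state and v.shape == net_state[k].shape}
net_state.update(matched)
net.load_state_dict(net_state)

# head and classifier have no counterpart in the checkpoint, so they stay
# randomly initialized, which may explain why my results fall below DSBN's.
```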

xixi1998 commented 1 year ago

No pretrained model weights. So sad.

toshi2k2 commented 1 year ago

The results of this work seem non-reproducible.