Actually, GPSNet is not the best plain baseline in Table 1. That result is adopted directly from the paper "Bipartite Graph Network with Adaptive Message Passing for Unbiased Scene Graph Generation", where it uses a resampling augmentation. Sorry for carelessly forgetting to mark this; I will revise it later.
We only provide a non-resampling version in the current code (bash cmds/50/gpsnet/predcls/sup/train.sh), which gets around 66.91 R@100 and 16.04 mR@100 on the PREDCLS task. To get the resampling code, you can go through https://github.com/SHTUPLUS/PySGG for more details.
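For reference, the resampling there follows the repeat-factor idea from the BGNN paper: images containing rare predicates are duplicated in the training list. Below is a minimal sketch of that idea, not the PySGG implementation; the function name and inputs are hypothetical.

```python
import math
from collections import Counter

def repeat_factors(image_predicates, thresh=0.07):
    """Per-image repeat factors, repeat-factor-sampling style.

    image_predicates: list of sets, the predicate classes present in each image.
    thresh: frequency threshold; classes rarer than this get oversampled.
    """
    # Predicate frequency f(c) = fraction of training images containing class c.
    counts = Counter(c for preds in image_predicates for c in preds)
    num_images = len(image_predicates)
    freq = {c: n / num_images for c, n in counts.items()}

    # Class-level repeat factor: r(c) = max(1, sqrt(thresh / f(c))).
    r_class = {c: max(1.0, math.sqrt(thresh / f)) for c, f in freq.items()}

    # Image-level repeat factor: the max over the classes the image contains.
    return [max((r_class[c] for c in preds), default=1.0)
            for preds in image_predicates]
```

Each image then appears roughly r_i times per epoch, so tail predicates are seen more often, which is why the resampled GPSNet in BGNN's Table 1 beats the plain one.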
Dear sir, as you say, without any augmentation strategy, is the Transformer the best plain baseline across all of the PREDCLS, SGCLS, and SGDET tasks? Also, since the current code is the most primitive version of the baseline models, could I know whether the resampling strategy improves the results the way the reweighting strategy does? Thanks.
Considering F@100, the Transformer is the best. For this version of the code, the Transformer on PREDCLS can roughly get 65.7 R@100 and 69.6 mR@100. I did not try a resampling strategy in my work, but from Table 1 of "Bipartite Graph Network with Adaptive Message Passing for Unbiased Scene Graph Generation", GPSNet with resampling is better than the plain one.
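As for reweighting: it typically means scaling the predicate classification loss by inverse class frequency instead of duplicating samples. A minimal PyTorch sketch, assuming you already have per-predicate counts (the numbers below are made up, and this is not code from this repo):

```python
import torch
import torch.nn as nn

# Hypothetical per-predicate training counts (head classes are huge).
pred_counts = torch.tensor([500000.0, 120000.0, 3000.0, 150.0])

# Inverse-frequency weights, normalized so the mean weight is 1.
weights = pred_counts.sum() / (len(pred_counts) * pred_counts)

# The weighted loss up-weights tail predicates during training.
criterion = nn.CrossEntropyLoss(weight=weights)

logits = torch.randn(8, 4)            # 8 relation proposals, 4 classes
labels = torch.randint(0, 4, (8,))
loss = criterion(logits, labels)
```

Both strategies attack the long tail; resampling changes what the model sees, reweighting changes how much each sample counts.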
Received!
Thanks for your reply; it solved one of my big confusions!
Maybe the mR@100 is wrong? It seems too high.
Sorry, it is 19.6.
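For anyone else confused by the gap: R@100 averages over all ground-truth triplets, so head predicates dominate it, while mR@100 first computes recall per predicate class and then averages over classes, so the long tail pulls it down. A simplified sketch of the difference (illustration only, not the repo's evaluator):

```python
from collections import defaultdict

def recall_and_mean_recall(gt_labels, hit_flags):
    """gt_labels: predicate class of each GT triplet; hit_flags: whether that
    triplet was recalled within the top-K predictions."""
    # R@K: one average over all triplets, dominated by frequent predicates.
    overall = sum(hit_flags) / len(hit_flags)

    # mR@K: per-predicate recalls averaged, so rare classes count equally.
    per_class = defaultdict(list)
    for label, hit in zip(gt_labels, hit_flags):
        per_class[label].append(hit)
    mean_recall = sum(sum(v) / len(v) for v in per_class.values()) / len(per_class)
    return overall, mean_recall
```

Since a few head predicates dominate the triplet count, a model can score around 65 R@100 while mR@100 sits near 20, which matches the corrected number above.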
Dear sir, I see in Table 1 of your paper that, among all the plain baselines, GPSNet's performance is the best. I just hope to reproduce this accuracy. Can you tell me what I should do, and could I get your log file for this model, if it is still there? Thanks.