Closed nihirv closed 1 year ago
Sorry for the bug. The `weight` parameter is only used by the improved GGD version (with `--debias decompose` and `base_model_sfce`). You can fix this by deleting `weight` in `base_model.py`, as in our latest update, or by adding `weight=1` as a default. It has nothing to do with `loss = F.binary_cross_entropy_with_logits(logits, y_gradient)`.
Hi,

If I run the code with the following command:

`python main.py --dataset cpv2 --mode gge_iter --debias gradient --topq 1 --topv -1 --qvp 5 --output []`

(i.e. `--mode gge_iter` and `--debias gradient`), then I get an error on line 81 of `base_model.py`: `loss = self.debias_loss_fn(None, logits, ref_logits, labels, weight)`

To fix this, I've added `weight` as a parameter of `GreedyGradient` in `vqa_debias_loss_functions.py` and multiplied the output of the BCE loss by `weight`: `loss = F.binary_cross_entropy_with_logits(logits, y_gradient) * weight`

I couldn't find this weighting/scaling factor in the paper, so please let me know whether that's correct.