corochann closed this pull request 5 years ago.
Codecov Report

Merging #285 into master will increase coverage by 0.44%. The diff coverage is 100%.

@@            Coverage Diff             @@
##           master     #285      +/-   ##
==========================================
+ Coverage   83.24%   83.69%   +0.44%
==========================================
  Files         147      165      +18
  Lines        7092     7643     +551
==========================================
+ Hits         5904     6397     +493
- Misses       1188     1246      +58
This PR is for the case when one wants to call backward on an output_var other than the loss. For example, to compute the gradient of the output rather than the loss, eval_fun needs to be written as follows:
def eval_fun(x, adj, t):
    pred = predictor(x, adj)
    pred_summed = F.sum(pred)
    return pred_summed
But it is easy to forget the F.sum and return pred directly, which is multi-dimensional along the batch axis, so backward does not work.
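To illustrate why summing first is safe: the derivative of F.sum(pred) with respect to each element of pred is 1, so backpropagating from the summed scalar seeds every output element with gradient 1. A minimal pure-NumPy sketch of this (the predictor here is a hypothetical toy linear model, not Chainer code):

```python
import numpy as np

def predictor(x, w):
    # Hypothetical toy model: one scalar prediction per batch row.
    return x @ w

x = np.array([[1.0, 2.0],
              [3.0, 4.0]])      # batch of 2 samples
w = np.array([0.5, -1.0])

pred = predictor(x, w)          # shape (2,): multi-dimensional along the batch axis
total = pred.sum()              # scalar: safe to start backward from

# Seeding every element of `pred` with gradient 1 (what backward from the
# summed scalar does) gives d(total)/d(w) = sum over the batch of x rows.
grad_w = x.sum(axis=0)
print(grad_w)                   # [4. 6.]
```

Starting backward from the multi-dimensional pred instead would require choosing a gradient seed per batch element, which is why a scalar output is needed.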
TODO: test
Test added, please check @mottodora.