Closed orausch closed 2 years ago
Merging #114 (34d6184) into master (febd2dc) will increase coverage by 0.35%. The diff coverage is 96.24%.
@@            Coverage Diff             @@
##           master     #114      +/-   ##
==========================================
+ Coverage   68.32%   68.67%   +0.35%
==========================================
  Files          60       65       +5
  Lines        6894     7232     +338
==========================================
+ Hits         4710     4966     +256
- Misses       2184     2266      +82
| Impacted Files | Coverage Δ | |
|---|---|---|
| ...l/onnx/op_implementations/cudnn_implementations.py | 88.94% <66.67%> (+0.27%) | :arrow_up: |
| daceml/util/utils.py | 71.85% <83.33%> (+0.42%) | :arrow_up: |
| daceml/autodiff/backward_pass_generator.py | 92.55% <91.67%> (+0.13%) | :arrow_up: |
| daceml/autodiff/analysis.py | 92.86% <92.86%> (ø) | |
| daceml/autodiff/library/python_frontend.py | 94.94% <94.94%> (ø) | |
| daceml/autodiff/library/library.py | 98.58% <98.58%> (ø) | |
| daceml/autodiff/__init__.py | 100.00% <100.00%> (ø) | |
| daceml/autodiff/implementations/dace_nodes.py | 97.56% <100.00%> (+0.06%) | :arrow_up: |
| daceml/autodiff/library/__init__.py | 100.00% <100.00%> (ø) | |
| daceml/autodiff/library/torch_integration.py | 100.00% <100.00%> (ø) | |
| ... and 11 more | | |
This adds support for the `torch.autograd.backward` function, as well as access to the `.grad` attribute of an array. Calls to `torch.autograd.backward` insert a new `BackwardPass` library node. Calls to `.grad` allocate gradient buffers and write to them via these library nodes.
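The PR's original example code is not preserved here. As a hedged sketch of the PyTorch semantics this PR mirrors (plain PyTorch, not the daceml-parsed version, whose exact frontend API is not shown in this comment), a program using `torch.autograd.backward` and `.grad` looks like:

```python
import torch

# A scalar loss computed from a tensor that tracks gradients.
x = torch.tensor([2.0, 3.0], requires_grad=True)
loss = (x * x).sum()

# torch.autograd.backward runs the backward pass; in the daceml frontend
# described above, this call would insert a BackwardPass library node.
torch.autograd.backward(loss)

# Reading .grad accesses the gradient buffer written by the backward pass;
# d/dx of sum(x^2) is 2x.
print(x.grad)  # tensor([4., 6.])
```

In the daceml version, the same structure would be captured as a dataflow graph, with the `.grad` reads wired to the gradient buffers that the `BackwardPass` node populates.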