jonasrauber / eagerpy

PyTorch, TensorFlow, JAX and NumPy — all of them natively using the same code
https://eagerpy.jonasrauber.de
MIT License

Correctly compute gradients for torch backend #55

Closed zimmerrol closed 2 years ago

zimmerrol commented 2 years ago

Currently, computing gradients with value_and_grad for torch tensors stores the gradients in the model's parameters as a side effect. This can become an issue for adversarial training. This PR fixes that.
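The side effect described above can be illustrated with plain PyTorch. This is a hedged sketch, not the eagerpy implementation: calling `loss.backward()` accumulates gradients into every parameter's `.grad`, whereas `torch.autograd.grad` returns only the gradient with respect to the input and leaves the model's parameters untouched.

```python
import torch

# A minimal model and an input we want the gradient for
# (e.g. to craft an adversarial example).
model = torch.nn.Linear(3, 1)
x = torch.randn(1, 3, requires_grad=True)

# loss.backward() also populates .grad on the model's parameters,
# polluting gradient state that adversarial training relies on.
loss = model(x).sum()
loss.backward()
assert model.weight.grad is not None

# torch.autograd.grad computes only the input gradient and does not
# touch the parameters' .grad attributes.
model.zero_grad()
(input_grad,) = torch.autograd.grad(model(x).sum(), x)
assert input_grad.shape == x.shape
```

For a linear layer, the gradient of the summed output with respect to `x` is just the weight matrix, which makes the behavior easy to verify.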

This PR also bumps the required versions of tensorflow and black to resolve incompatibilities with packages they depend on.

codecov[bot] commented 2 years ago

Codecov Report

Merging #55 (724f64b) into master (352220e) will decrease coverage by 0.00%. The diff coverage is 100.00%.

@@            Coverage Diff             @@
##           master      #55      +/-   ##
==========================================
- Coverage   99.89%   99.89%   -0.01%     
==========================================
  Files          16       16              
  Lines        1900     1899       -1     
==========================================
- Hits         1898     1897       -1     
  Misses          2        2              
Impacted Files              Coverage Δ
eagerpy/tensor/pytorch.py   100.00% <100.00%> (ø)


zimmerrol commented 2 years ago

@jonasrauber Can you please review this PR?