Lightning-AI / pytorch-lightning

Pretrain, finetune ANY AI model of ANY size on multiple GPUs, TPUs with zero code changes.
https://lightning.ai
Apache License 2.0

Add support for forward mode automatic differentiation #12422

Closed · tchaton closed this issue 1 year ago

tchaton commented 2 years ago

🚀 Feature

Motivation

From the Gradients without Backpropagation paper (code is coming) and the Forward AD support in PyTorch / functorch, it seems a significant optimization could be achieved without actually performing the backward pass.
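
For context, this is roughly what a forward-mode (Jacobian-vector product) evaluation looks like with functorch's `jvp`. This is a minimal sketch, assuming functorch is installed; the function `f`, the input `x`, and the direction `v` are illustrative, not anything from this issue:

```python
import torch
from functorch import jvp

def f(x):
    # Toy scalar objective used only for illustration.
    return torch.sin(x).sum()

x = torch.randn(4)        # primal point
v = torch.randn_like(x)   # tangent (perturbation) direction

# A single forward pass returns both f(x) and the Jacobian-vector product,
# i.e. the directional derivative of f at x along v -- no backward pass.
out, directional_derivative = jvp(f, (x,), (v,))
```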

From the paper:

We implement a forward-mode AD system in Python and base this on PyTorch tensors in order to enable a fair comparison with a typical backpropagation pipeline in PyTorch, which is widely used by the ML community. We release our implementation publicly. Our forward-mode AD engine is implemented from scratch using operator overloading and non-differentiable PyTorch tensors (requires_grad=False) as a building block. This means that our forward AD implementation does not use PyTorch's reverse-mode implementation (called "autograd") and computation graph. We produce the backpropagation results in experiments using PyTorch's existing reverse-mode code (requires_grad=True and .backward()) as usual.
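
As a rough illustration of the forward-gradient idea (not the paper's own engine, which avoids autograd entirely), here is a sketch using PyTorch's built-in forward-mode AD API, `torch.autograd.forward_ad`. The objective `loss_fn`, the parameter shape, and the learning rate are illustrative assumptions:

```python
import torch
import torch.autograd.forward_ad as fwAD

def loss_fn(w):
    # Toy objective; `w` stands in for a model's flattened parameters.
    return (w ** 2).sum()

w = torch.randn(5)

# Sample a random tangent (perturbation) direction v.
v = torch.randn_like(w)

# A single forward pass with dual numbers: no backward graph is built.
with fwAD.dual_level():
    dual_w = fwAD.make_dual(w, v)
    loss = fwAD.unpack_dual(loss_fn(dual_w))
    # .tangent is the JVP, i.e. the directional derivative (grad(loss) . v),
    # a scalar here because the loss is a scalar.
    directional_derivative = loss.tangent

# Forward gradient: (grad(loss) . v) * v is an unbiased estimate of the true
# gradient when v has zero mean and identity covariance (per the paper).
forward_grad = directional_derivative * v

# A plain SGD-style update using the forward gradient, with no .backward().
lr = 0.1
w = w - lr * forward_grad
```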

Pitch

Alternatives

Additional context



cc @borda @rohitgr7 @akihironitta

carmocca commented 1 year ago

There's nothing to action here. Closing