Motivation

Use torch.amax in the ReduceMax forward so that it supports reducing over multiple dimensions at once. The difference between max/min and amax/amin is:

- amax/amin supports reducing on multiple dimensions;
- amax/amin does not return indices;
- amax/amin evenly distributes gradient between equal values, while max(dim)/min(dim) propagates gradient only to a single index in the source tensor (demonstrated below).
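A minimal sketch of that last point, assuming a recent PyTorch build; the amax gradient split follows from the documented behaviour, while the exact index that max picks among ties is not guaranteed:

```python
import torch

# Two tied maxima along dim 1.
x = torch.tensor([[1.0, 3.0, 3.0]], requires_grad=True)
torch.amax(x, dim=1).sum().backward()
print(x.grad)   # amax shares the gradient: tensor([[0.0000, 0.5000, 0.5000]])

y = torch.tensor([[1.0, 3.0, 3.0]], requires_grad=True)
torch.max(y, dim=1).values.sum().backward()
print(y.grad)   # max routes the full gradient to a single tied position, e.g. tensor([[0., 1., 0.]])
```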
More details about torch.amax can be found at: https://pytorch.org/docs/stable/generated/torch.amax.html#torch-amax

Modification
ppq/executor/op/torch/default.py
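For illustration only, a forward along these lines would call torch.amax; the function name, signature, and default-axes handling below are assumptions, not the actual code in ppq/executor/op/torch/default.py:

```python
import torch
from typing import Optional, Sequence

def reduce_max_forward(value: torch.Tensor,
                       axes: Optional[Sequence[int]] = None,
                       keepdim: bool = False) -> torch.Tensor:
    # Hypothetical ReduceMax forward; name and signature are illustrative only.
    if axes is None:
        # ONNX ReduceMax reduces over every axis when none are specified.
        axes = range(value.ndim)
    # torch.amax reduces over several dims at once and returns only the values,
    # whereas torch.max(dim=...) accepts a single dim and also returns indices.
    return torch.amax(value, dim=tuple(axes), keepdim=keepdim)
```

For example, reduce_max_forward(torch.randn(2, 3, 4), axes=(1, 2)) returns a tensor of shape (2,).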