Closed · iclementine closed 2 months ago
Known Issue: Backward pass cannot be tracked by coverage.
PR Category
Operator
Type of Change
Bug Fix
Description
Fix `layer_norm_backward`: save `mean` and `rstd` in float32 when the inputs are half-precision floating-point dtypes, to avoid numerical instability or errors in the backward pass.
NOTE: ATen's implementation also saves `mean` and `rstd` in fp32 in these cases.
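To illustrate the idea, here is a minimal NumPy sketch (not the actual kernel in this PR, and the function name `layer_norm_fwd` is hypothetical): the forward pass accumulates statistics in float32 even when the input is float16, and the saved `mean`/`rstd` stay float32 for the backward pass, matching ATen's behavior.

```python
import numpy as np

def layer_norm_fwd(x, weight, bias, eps=1e-5):
    # Illustrative sketch only, not the real implementation.
    # Accumulate in float32 even when x is half precision, and keep
    # mean/rstd in float32 for the backward pass (as ATen does).
    x32 = x.astype(np.float32)
    mean = x32.mean(axis=-1, keepdims=True)   # saved in float32
    var = x32.var(axis=-1, keepdims=True)
    rstd = 1.0 / np.sqrt(var + eps)           # saved in float32
    y = (x32 - mean) * rstd * weight + bias
    # Output is cast back to the input dtype; saved stats are not.
    return y.astype(x.dtype), mean, rstd

x = np.random.randn(4, 8).astype(np.float16)
w = np.ones(8, dtype=np.float16)
b = np.zeros(8, dtype=np.float16)
y, mean, rstd = layer_norm_fwd(x, w, b)
```

Saving `mean` and `rstd` in the half-precision input dtype instead would lose precision exactly where the backward formulas are most sensitive, which is the instability this fix addresses.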
Issue
Progress
Performance