Xilinx / brevitas

Brevitas: neural network quantization in PyTorch
https://xilinx.github.io/brevitas/

QAT references #293

Open volcacius opened 3 years ago

volcacius commented 3 years ago

Papers:

volcacius commented 3 years ago

Soft rounding:

# Forward pass
import torch

def soft_round(x, alpha):
    # Differentiable surrogate for round(): behaves like the identity as
    # alpha -> 0 and like hard rounding as alpha -> inf.
    # alpha is expected to be a tensor (torch.tanh does not accept Python scalars).
    x_floor = torch.floor(x)
    r = x - x_floor - 0.5
    out = x_floor + 0.5 * torch.tanh(alpha * r) / torch.tanh(alpha * 0.5) + 0.5
    return out
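
A quick sanity check (a sketch, not from the original comment): with a small alpha the output stays close to the input, while a large alpha pushes it toward torch.round, so the same forward can interpolate between no rounding and hard rounding.

# Sketch: behaviour of soft_round for small vs. large alpha
# (alpha passed as a tensor, matching the forward above).
x = torch.tensor([0.2, 0.7, 1.4, -0.6])

print(soft_round(x, torch.tensor(1.0)))   # stays close to x
print(soft_round(x, torch.tensor(20.0)))  # approaches torch.round(x)
print(torch.round(x))                     # hard rounding for reference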