Xilinx / brevitas

Brevitas: neural network quantization in PyTorch
https://xilinx.github.io/brevitas/

QAT references #293

Open volcacius opened 3 years ago

volcacius commented 3 years ago

Papers:

volcacius commented 3 years ago

Soft rounding:

# Forward pass of soft rounding: a differentiable surrogate for hard rounding
import torch

def soft_round(x, alpha):
    # Accept either a Python float or a tensor for the temperature alpha
    alpha = torch.as_tensor(alpha, dtype=x.dtype, device=x.device)
    x_floor = torch.floor(x)
    # Offset from the midpoint of the unit interval containing x, in [-0.5, 0.5]
    r = x - x_floor - 0.5
    # Rescaled tanh keeps the endpoints fixed; alpha controls the sharpness
    out = x_floor + 0.5 * torch.tanh(alpha * r) / torch.tanh(alpha * 0.5) + 0.5
    return out
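
For context, a small usage sketch (the scale value and the call pattern below are illustrative, not Brevitas API): as alpha -> 0 soft_round approaches the identity, and as alpha grows it approaches hard rounding, so the soft rounding literature typically anneals alpha upward during training.

import torch

x = torch.tensor([-1.2, -0.4, 0.3, 0.7, 1.6])

# Small alpha: approximately the identity (smooth and differentiable everywhere)
print(soft_round(x, alpha=0.1))

# Large alpha: close to hard rounding away from the half-integer midpoints
print(soft_round(x, alpha=25.0))
print(torch.round(x))

# Illustrative use inside a uniform quantizer: round in the integer domain, rescale back
scale = 0.1  # hypothetical scale factor
x_q = soft_round(x / scale, alpha=10.0) * scale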