lucidrains / x-transformers

A simple but complete full-attention transformer with a set of promising experimental features from various papers
MIT License

Was it a clerical error? ScaleNorm.g is initialized from dim ** -0.5. I think it should be dim ** 0.5 #246

Closed · junphine closed this 3 months ago

junphine commented 4 months ago

```python
class ScaleNorm(nn.Module):
    def __init__(self, dim, eps = 1e-5):
        super().__init__()
        self.eps = eps
        self.g = nn.Parameter(torch.ones(1) * (dim ** -0.5))  # the questioned init
```
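
For context (restating the original definition, not code from this repo): the ScaleNorm paper (Nguyen & Salazar, "Transformers without Tears") normalizes to the unit sphere and rescales by a single learned scalar g initialized to the square root of the model dimension:

$$\mathrm{ScaleNorm}(x) = g \cdot \frac{x}{\lVert x \rVert_2}, \qquad g \leftarrow \sqrt{d} \ \text{at init}$$

Since $\mathrm{RMS}(x) = \lVert x \rVert_2 / \sqrt{d}$, initializing g to $\sqrt{d}$ makes ScaleNorm match RMSNorm at initialization, whereas $d^{-1/2}$ shrinks the output by a factor of d.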

lucidrains commented 3 months ago

@junphine hey, thank you for catching this! indeed the sign of the exponent was not correct

it should be identical to rmsnorm, except the learned scale is a single parameter rather than a vector of the model dimension
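
A minimal sketch of what the corrected module could look like (an assumption based on the standard ScaleNorm formulation, not necessarily the repository's exact code):

```python
import torch
from torch import nn

class ScaleNorm(nn.Module):
    def __init__(self, dim, eps = 1e-5):
        super().__init__()
        self.eps = eps
        # single learned scalar, initialized to sqrt(dim), not 1 / sqrt(dim)
        self.g = nn.Parameter(torch.ones(1) * (dim ** 0.5))

    def forward(self, x):
        # normalize by the L2 norm over the feature dimension, then rescale by g
        norm = torch.norm(x, dim = -1, keepdim = True)
        return x / norm.clamp(min = self.eps) * self.g
```

With g initialized to sqrt(dim), x / ||x|| * g equals x / RMS(x), i.e. RMSNorm at init with a single scalar gain instead of a per-dimension one.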