xiaohu2015 / nngen

Apache License 2.0

Position embedding? I cannot understand the code #1

Open henbucuoshanghai opened 5 months ago

henbucuoshanghai commented 5 months ago

half = dim // 2
freqs = torch.exp(
    -math.log(max_period) * torch.arange(start=0, end=half, dtype=torch.float32) / half
).to(device=timesteps.device)
args = timesteps[:, None].float() * freqs[None]
embedding = torch.cat([torch.cos(args), torch.sin(args)], dim=-1)
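For reference, here is the snippet wrapped into a self-contained, runnable form. The function name `timestep_embedding` and the `max_period=10000` default are assumptions based on the diffusion codebases this pattern commonly appears in (e.g., OpenAI's guided-diffusion), not necessarily the exact signature used here:

```python
import math
import torch

def timestep_embedding(timesteps, dim, max_period=10000):
    # Assumed wrapper; names follow common diffusion codebases.
    half = dim // 2
    # Geometric progression of frequencies: max_period ** (-i / half) for i = 0..half-1.
    freqs = torch.exp(
        -math.log(max_period) * torch.arange(start=0, end=half, dtype=torch.float32) / half
    ).to(device=timesteps.device)
    # Outer product of timesteps and frequencies: shape (batch, half).
    args = timesteps[:, None].float() * freqs[None]
    # Concatenate the cosine and sine parts: shape (batch, dim) for even dim.
    return torch.cat([torch.cos(args), torch.sin(args)], dim=-1)

emb = timestep_embedding(torch.tensor([0, 1, 10]), dim=128)
print(emb.shape)  # torch.Size([3, 128])
```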

henbucuoshanghai commented 5 months ago

Can you explain it? Thanks.

xiaohu2015 commented 5 months ago

You can refer to the Transformer paper.
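The reference is to the sinusoidal position encoding from "Attention Is All You Need": PE(pos, 2i) = sin(pos / 10000^(2i/d_model)) and PE(pos, 2i+1) = cos(pos / 10000^(2i/d_model)). A minimal sketch of that encoding for comparison (variable names here are illustrative; the Transformer interleaves sin and cos per dimension pair, whereas the snippet above concatenates cos and sin halves):

```python
import torch

def transformer_position_encoding(num_positions, d_model):
    # Sinusoidal encoding from "Attention Is All You Need" (illustrative names).
    position = torch.arange(num_positions, dtype=torch.float32)[:, None]  # (P, 1)
    i = torch.arange(0, d_model, 2, dtype=torch.float32)[None]            # (1, d_model/2)
    angle = position / (10000 ** (i / d_model))                           # (P, d_model/2)
    pe = torch.zeros(num_positions, d_model)
    pe[:, 0::2] = torch.sin(angle)  # even indices: sin
    pe[:, 1::2] = torch.cos(angle)  # odd indices: cos
    return pe

print(transformer_position_encoding(50, 128).shape)  # torch.Size([50, 128])
```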

henbucuoshanghai commented 5 months ago

Where is the 10000?
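In the snippet the 10000 is hidden inside `max_period`: `exp(-log(max_period) * i / half)` is just `max_period ** (-i / half)`, and in the diffusion codebases this helper pattern comes from, `max_period` defaults to 10000, matching the constant in the Transformer formula. A quick numerical check (assuming that default):

```python
import math
import torch

half = 64
max_period = 10000  # assumed default, as in common diffusion timestep-embedding helpers
i = torch.arange(half, dtype=torch.float32)

freqs_exp_form = torch.exp(-math.log(max_period) * i / half)  # form used in the snippet
freqs_pow_form = max_period ** (-i / half)                    # 10000 ** (-i / half)

print(torch.allclose(freqs_exp_form, freqs_pow_form))  # True
```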