xiaohu2015 / nngen


Position embedding? I cannot understand the code #1

Open henbucuoshanghai opened 7 months ago

henbucuoshanghai commented 7 months ago

```python
half = dim // 2
freqs = torch.exp(
    -math.log(max_period) * torch.arange(start=0, end=half, dtype=torch.float32) / half
).to(device=timesteps.device)
args = timesteps[:, None].float() * freqs[None]
embedding = torch.cat([torch.cos(args), torch.sin(args)], dim=-1)
```
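For context, the snippet reads `dim`, `max_period`, and `timesteps` from its surrounding function. Below is a minimal self-contained sketch of that function; the name, signature, and the `max_period=10000` default are assumptions based on the common diffusion-model implementation, not taken verbatim from this repo:

```python
import math
import torch

def timestep_embedding(timesteps, dim, max_period=10000):
    """Sinusoidal embeddings for a 1-D tensor of timesteps: shape [N] -> [N, dim] (dim assumed even)."""
    half = dim // 2
    # geometric progression of frequencies: freqs[i] = 1 / max_period^(i / half)
    freqs = torch.exp(
        -math.log(max_period) * torch.arange(start=0, end=half, dtype=torch.float32) / half
    ).to(device=timesteps.device)
    # outer product: one row of phase angles per timestep, shape [N, half]
    args = timesteps[:, None].float() * freqs[None]
    # concatenate cos and sin halves to get the final [N, dim] embedding
    return torch.cat([torch.cos(args), torch.sin(args)], dim=-1)

# usage: embed 4 timesteps into 128-dimensional vectors
emb = timestep_embedding(torch.tensor([0, 1, 10, 100]), dim=128)
print(emb.shape)  # torch.Size([4, 128])
```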

henbucuoshanghai commented 7 months ago

Can you explain it? Thanks.

xiaohu2015 commented 7 months ago

You can refer to the Transformer paper.
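For reference, the sinusoidal position-embedding formula from the Transformer paper (Vaswani et al., 2017, "Attention Is All You Need"):

$$
PE_{(pos,\,2i)} = \sin\!\left(\frac{pos}{10000^{2i/d_{\text{model}}}}\right), \qquad
PE_{(pos,\,2i+1)} = \cos\!\left(\frac{pos}{10000^{2i/d_{\text{model}}}}\right)
$$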

henbucuoshanghai commented 7 months ago

Where is the 10000?
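For what it's worth, a small check (the values here are illustrative assumptions, not from the repo) showing that `max_period` in the snippet plays the role of the 10000 constant in the paper's formula, since `freqs[i] = exp(-log(max_period) * i / half) = 1 / max_period^(2i / dim)`:

```python
import math
import torch

# assumed values for illustration: dim corresponds to d_model, max_period to the paper's 10000
dim, max_period = 8, 10000
half = dim // 2

# frequencies as computed in the snippet
freqs = torch.exp(
    -math.log(max_period) * torch.arange(start=0, end=half, dtype=torch.float32) / half
)
# the Transformer paper's term 1 / 10000^(2i / d_model), for i = 0..half-1
paper = torch.tensor([1.0 / max_period ** (2 * i / dim) for i in range(half)])

print(torch.allclose(freqs, paper))  # True: max_period is the 10000 from the paper
```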