In class `ReformerLM` there are two dimension parameters, `emb_dim` and `dim`. I see that you use `emb_dim = default(emb_dim, dim)`, so `emb_dim` falls back to `dim` when it isn't given. Are they meant to be equal by default? When I try to make them different, an error occurs. I think `self.norm = nn.LayerNorm(emb_dim)` causes the problem, since it is applied to the Reformer output, whose last dimension is `dim`.
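For reference, here is a minimal sketch of what I ran. The constructor arguments (`num_tokens`, `emb_dim`, `dim`, `depth`, `max_seq_len`) reflect my understanding of the `ReformerLM` signature, and the exact values are just illustrative:

```python
import torch
from reformer_pytorch import ReformerLM

# Minimal repro sketch: set emb_dim != dim and run a forward pass.
# If self.norm = nn.LayerNorm(emb_dim) is applied to the Reformer output
# (last dimension = dim), this should raise a shape-mismatch RuntimeError.
model = ReformerLM(
    num_tokens=20000,
    emb_dim=128,     # token embedding dimension
    dim=512,         # model (attention) dimension, deliberately different
    depth=1,
    max_seq_len=1024,
)

x = torch.randint(0, 20000, (1, 1024))  # dummy token ids
out = model(x)  # fails here when emb_dim != dim
```

With `emb_dim=512` (or leaving it unset so `default(emb_dim, dim)` picks `dim`), the same forward pass runs fine, which is why I suspect the `LayerNorm` dimension rather than the embedding itself.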