lucidrains / local-attention

An implementation of local windowed attention for language modeling
MIT License

Include einops in setup script #8

Closed: Mindful closed this issue 3 years ago

Mindful commented 3 years ago

After recent updates, the package depends on einops but doesn't list it as a requirement in the setup script.
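For context, declaring the dependency would be a one-line addition to `install_requires` in the package's `setup.py`. The sketch below is illustrative only — the surrounding metadata (version, other requirements) is hypothetical and not the repo's actual setup script:

```python
from setuptools import setup, find_packages

# Minimal sketch of a setup.py declaring einops as a runtime dependency.
# All metadata here is illustrative; only the einops line reflects the fix
# this issue requests -- pip would then install einops automatically.
setup(
    name="local-attention",
    packages=find_packages(),
    install_requires=[
        "torch",
        "einops",  # the dependency this issue asks to declare
    ],
)
```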

lucidrains commented 3 years ago

@Mindful oops, thanks for this Josh! (I actually just removed einops as a dependency anyways)