Closed: frances720 closed this issue 4 days ago
I have a PR for it, but when I ran git push --set-upstream origin frances/llama31_rope, it returned a 403 error.
@frances720 Sorry for the late reply! It looks like you are trying to push your branch directly to the TransformerLens repo. You need to push the branch to your fork and open the PR from there. If you need help with this, you can reach me on the Slack channel. Let me know if you need an invite!
This has been resolved in a recent release.
Proposal
Add support for frequency-based RoPE (Rotary Position Embedding) smoothing in the TransformerLens library to match Llama 3.1’s architecture.
Motivation
Llama 3.1 applies frequency-dependent smoothing to its rotary position embeddings, rescaling the low-frequency components so the model can handle much longer contexts (up to 128K tokens). The current version of TransformerLens does not implement this step, so Llama 3.1 models loaded into it do not compute positional embeddings faithfully, limiting proper analysis.
Pitch
Implement Llama 3.1's frequency-based RoPE smoothing when computing rotary embeddings. This would make TransformerLens's positional encoding match the official model and make it a more faithful tool for analyzing long-sequence tasks. A sketch of the smoothing step follows.
Alternatives
Continue using TransformerLens with standard RoPE, but the resulting positional embeddings would not match Llama 3.1's, so analyses would diverge from the real model.
Checklist