TransformerLensOrg / TransformerLens

A library for mechanistic interpretability of GPT-style language models
https://transformerlensorg.github.io/TransformerLens/
MIT License

[Proposal] Add frequency-based RoPE support for Llama 3.1 models #719

Closed: frances720 closed this issue 4 days ago

frances720 commented 2 months ago

Proposal

Add support for frequency-based RoPE (Rotary Position Embedding) smoothing in the TransformerLens library to match Llama 3.1’s architecture.

Motivation

Llama 3.1 applies frequency-dependent smoothing to its rotary position embeddings, rescaling the low-frequency components so the model can extend its context window well beyond the original pretraining length. The current version of TransformerLens does not implement this scaling, so the positional embeddings it computes for Llama 3.1 diverge from the reference model, limiting the ability to properly analyze Llama 3.1 models.

Pitch

Implement frequency-based RoPE smoothing so that TransformerLens reproduces Llama 3.1's positional encoding exactly. This would make TransformerLens fully compatible with Llama 3.1 and a more reliable tool for analyzing long-sequence behavior.
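
For concreteness, here is a minimal sketch of the frequency-dependent scaling, using the published Llama 3.1 `rope_scaling` parameters as defaults (the function name is illustrative, not an existing TransformerLens API):

```python
import math

import torch


def llama31_scaled_inv_freq(
    head_dim: int = 128,               # per-head dimension in Llama 3.1 8B
    base: float = 500_000.0,           # Llama 3.1 rope_theta
    factor: float = 8.0,               # rope_scaling "factor"
    low_freq_factor: float = 1.0,      # rope_scaling "low_freq_factor"
    high_freq_factor: float = 4.0,     # rope_scaling "high_freq_factor"
    original_context_len: int = 8192,  # pretraining context length
) -> torch.Tensor:
    """Return the frequency-smoothed RoPE inverse frequencies."""
    # Standard RoPE inverse frequencies: base^(-2i/d) for each rotary pair.
    inv_freq = 1.0 / (base ** (torch.arange(0, head_dim, 2).float() / head_dim))
    wavelen = 2 * math.pi / inv_freq

    low_freq_wavelen = original_context_len / low_freq_factor
    high_freq_wavelen = original_context_len / high_freq_factor

    # Short wavelengths (high frequencies) pass through unchanged;
    # long wavelengths (low frequencies) are slowed down by `factor`.
    scaled = torch.where(wavelen > low_freq_wavelen, inv_freq / factor, inv_freq)

    # Wavelengths in between are blended linearly between the two regimes,
    # which is the "smoothing" this issue is about.
    smooth = (original_context_len / wavelen - low_freq_factor) / (
        high_freq_factor - low_freq_factor
    )
    blended = (1 - smooth) * inv_freq / factor + smooth * inv_freq
    is_mid = (wavelen >= high_freq_wavelen) & (wavelen <= low_freq_wavelen)
    return torch.where(is_mid, blended, scaled)
```

With these defaults, components with wavelengths under 2048 positions are unchanged, components with wavelengths over 8192 positions are divided by 8, and the band in between is interpolated, which is what lets the model extrapolate from its 8K pretraining context.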

Alternatives

Continue using TransformerLens with standard RoPE. This works, but the computed rotary embeddings would not match Llama 3.1's, so activations and analyses would quietly diverge from the reference model.



frances720 commented 2 months ago

I have a PR ready, but when I ran `git push --set-upstream origin frances/llama31_rope` it returned a 403.

bryce13950 commented 1 month ago

@frances720 Sorry for the late reply! It looks like you're trying to push your branch directly to the TransformerLens repo, which you don't have write access to (hence the 403). You need to push the branch to your fork and open the PR from there. If you need help with this, you can reach me on the Slack channel. Let me know if you need an invite!
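
In case it helps, the usual fork workflow looks roughly like this (the remote name `fork` and `<your-user>` are placeholders):

```sh
# Add your fork as a remote (replace <your-user> with your GitHub username)
git remote add fork git@github.com:<your-user>/TransformerLens.git

# Push the feature branch to the fork rather than the upstream repo
git push --set-upstream fork frances/llama31_rope
```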

bryce13950 commented 4 days ago

This has been resolved in a recent release.