jbloomAus / SAELens
Training Sparse Autoencoders on Language Models
https://jbloomaus.github.io/SAELens/
MIT License · 184 stars · 65 forks
Issues
#196 · Expanding SAE evals · JoshEngels · opened 2 days ago · 0 comments
#195 · feat: new saes for gemma-2b-it and feature splitting on gpt2-small-la… · jbloomAus · closed 3 days ago · 0 comments
#194 · fix hook z loader · jbloomAus · closed 4 days ago · 0 comments
#193 · [Bug Report] SAE.from_pretrained errors out in Hooked_SAE_Transformer_Demo.ipynb · babcockt18 · closed 4 days ago · 2 comments
#192 · How to train SAEs on my own model? · likangk · opened 1 week ago · 1 comment
#191 · How to reproduce jbloom/Gemma-2b-Residual-Stream-SAEs · zdaiot · opened 1 week ago · 5 comments
#190 · [Proposal] Add note that Windows unsupported · MClarke1991 · opened 1 week ago · 1 comment
#189 · Performance improvements + using multiple GPUs · jbloomAus · closed 1 week ago · 1 comment
#188 · feat: Support Gated-SAEs · curt-tigges · closed 4 days ago · 1 comment
#187 · fix: allow settings trust_remote_code for new huggingface version · chanind · closed 1 week ago · 4 comments
#186 · [Bug Report] ActivationsStore fails for models without a tokenizer · chanind · opened 1 week ago · 0 comments
#185 · [Proposal] Allow loading TransformerLens models saved locally · chanind · opened 1 week ago · 0 comments
#184 · fix: allow setting trust_remote_code for new huggingface version · chanind · closed 1 week ago · 1 comment
#183 · [WIP] feat: add mlp transcoders · dtch1997 · opened 1 week ago · 4 comments
#182 · [Proposal] Add MLP transcoders · dtch1997 · opened 1 week ago · 1 comment
#181 · feat: harmize activation store and pretokenize runner · chanind · opened 2 weeks ago · 5 comments
#180 · [Bug Report] docs: Slack link is broken · domdomegg · closed 1 week ago · 2 comments
#179 · add expected perf for pretrained · jbloomAus · closed 2 weeks ago · 1 comment
#178 · Adding Mistral SAEs · JoshEngels · closed 2 weeks ago · 5 comments
#177 · attempt to train sae for othello-gpt model · thijmennijdam · closed 1 week ago · 2 comments
#176 · feat: updating training docs and standardizing pretokenize runner arch · chanind · closed 2 weeks ago · 3 comments
#175 · SAE steering vector tutorial · NelsonG-C · closed 2 weeks ago · 1 comment
#174 · [Bug Report] make check-ci fails with pyright 1.1.366 · jettjaniak · opened 3 weeks ago · 1 comment
#173 · Use latest versions for packages in colab tutorials · NelsonG-C · closed 3 weeks ago · 0 comments
#172 · Fix pip install in HookedSAETransformer Demo · ckkissane · closed 3 weeks ago · 2 comments
#171 · fix: progress bar updates · RoganInglis · closed 2 weeks ago · 3 comments
#170 · feat: activation norm scaling factor folding · jbloomAus · closed 4 weeks ago · 1 comment
#169 · fix: share config defaulting between hf and local loading · jbloomAus · closed 1 month ago · 1 comment
#168 · [Bug Report] Saved Models Cannot be Loaded · 4gatepylon · closed 1 month ago · 5 comments
#167 · feat: add w_dec_norm folding · jbloomAus · closed 1 month ago · 1 comment
#166 · Fixed typo in Hooked_SAE_Transformer_Demo.ipynb colab badge · ianand · closed 1 month ago · 1 comment
#165 · Fix hook z training reshape bug · jbloomAus · closed 1 month ago · 0 comments
#164 · [Bug Report] Cannot train on z: interfaces disagree on shape? · 4gatepylon · closed 1 month ago · 3 comments
#163 · [Bug Report] evals.py ablates all heads when it needs to ablate only one · shehper · opened 1 month ago · 3 comments
#162 · Quality of Life Refactor of SAE Lens adding SAE Analysis with HookedSAETransformer and some other breaking changes · jbloomAus · closed 1 month ago · 2 comments
#161 · [Bug Report] Cannot find `train_batch_size` in `LanguageModelSAERunnerConfig.__init__()` · MClarke1991 · closed 1 month ago · 2 comments
#160 · HookedSAETransformer · ckkissane · closed 1 month ago · 2 comments
#159 · Move activation store to cpu · tomMcGrath · closed 1 month ago · 5 comments
#158 · Refactor training · jbloomAus · closed 1 month ago · 1 comment
#157 · Enable autocast for LM activation creation · tomMcGrath · closed 1 month ago · 1 comment
#156 · remove resuming ability, keep resume config but complain if true · jbloomAus · closed 1 month ago · 1 comment
#155 · Remove sae parallel training, simplify code · jbloomAus · closed 1 month ago · 1 comment
#154 · Add notebook to transfer W&B models to HF · tomMcGrath · closed 1 month ago · 1 comment
#153 · Ansible: dev only mode · hijohnnylin · closed 1 month ago · 1 comment
#152 · add gemma-2b bootleg saes · jbloomAus · closed 1 month ago · 0 comments
#151 · [Proposal] tidy normalisation code · tomMcGrath · opened 1 month ago · 0 comments
#150 · Fix normalisation · tomMcGrath · closed 1 month ago · 2 comments
#149 · [Proposal] We should make a more comprehensive / detailed contribution guide · jbloomAus · opened 1 month ago · 0 comments
#148 · Pretokenize runner · chanind · closed 1 month ago · 2 comments
#147 · fix GPT2 sweep settings to use correct dataset · tomMcGrath · closed 1 month ago · 2 comments