-
# Questions
Dear authors of SAE Lens,
Thanks for your amazing work. I saw on Neuronpedia and Hugging Face that the Llama-3-8B-IT SAE published by Julius Han is currently available. However,…
-
-
Trying to reproduce the chess SAE training:
```
python circuits/sae_training/chess_sae_trainer.py --save_dir=/tmp/sae_debug
```
After modifying this line to pass the `meta.pkl` from `circuits/reso…
-
Our current approach embeds datasets using Sentence Transformers, which gives us one embedding per "chunk" of text (so whether we pass in 500 tokens of text or 100 tokens of text, we always get one embedding). S…
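The fixed-size-per-chunk behaviour described above comes from pooling the per-token embeddings. A minimal NumPy sketch of the pooling idea (this illustrates the concept, not the actual Sentence Transformers internals; the 384 dimension is an assumption matching common MiniLM-style models):

```python
import numpy as np

def embed_chunk(token_embeddings: np.ndarray) -> np.ndarray:
    """Collapse a (num_tokens, dim) matrix of per-token embeddings into a
    single fixed-size chunk embedding via mean pooling, mirroring why a
    100-token chunk and a 500-token chunk both yield one embedding."""
    return token_embeddings.mean(axis=0)

rng = np.random.default_rng(0)
short_chunk = rng.normal(size=(100, 384))  # 100 tokens of text
long_chunk = rng.normal(size=(500, 384))   # 500 tokens of text

# Both chunks produce one embedding of identical dimensionality.
assert embed_chunk(short_chunk).shape == (384,)
assert embed_chunk(long_chunk).shape == (384,)
```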
-
Just curious if anyone is thinking about implementing a training pipeline for JumpReLU SAEs! They have a couple of properties which are really desirable for something I'm working on.
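For context, the activation rule that distinguishes JumpReLU SAEs is simple to state: a pre-activation passes through unchanged if it exceeds a learned per-feature threshold, and is zeroed otherwise. A forward-only NumPy sketch (training the threshold needs straight-through estimators, which this sketch deliberately omits; names are illustrative):

```python
import numpy as np

def jumprelu(pre_acts: np.ndarray, threshold: np.ndarray) -> np.ndarray:
    """JumpReLU: keep a pre-activation as-is where it exceeds its
    learned per-feature threshold, otherwise output exactly zero."""
    return np.where(pre_acts > threshold, pre_acts, 0.0)

pre = np.array([-0.5, 0.2, 0.8, 1.5])
theta = np.full(4, 0.3)  # one threshold per feature
acts = jumprelu(pre, theta)
# Only the entries above the 0.3 threshold survive: [0, 0, 0.8, 1.5]
assert np.allclose(acts, [0.0, 0.0, 0.8, 1.5])
```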
-
It was observed today that a HiSuite TV had trouble connecting
Symptoms:
- No WiFi auth-failure event
- If wrong PSK is set on TV, then it will send an auth failure
- Tcpdump on a monitor interf…
-
Hi, while executing:
`torchrun --nproc_per_node gpu -m sae meta-llama/Meta-Llama-3-8B --distribute_modules --batch_size 1 --layers 24 25 --grad_acc_steps 8 --ctx_len 2048 --k 192 --load_in_8bit --mic…
-
While trying to run the ```https://github.com/EleutherAI/sae-auto-interp/blob/main/demo/simulate.py``` script, I got this error: ImportError: cannot import name 'FeatureLoader' from 'sae_auto_interp.fea…
-
When I call `Sae.decode()`, it raises an error:
Traceback (most recent call last):
File "", line 21, in triton_sparse_dense_matmul_kernel
KeyError: ('2-.-0-.-0-d82511111ad128294e9d31a6ac68423…
-
AssertionError: If encoder isn't an AutoEncoder, it should have weights 'W_enc', 'W_dec', 'b_enc', 'b_dec'
Gated SAEs do not have `b_enc`, so `AutoEncoder` seems unsuitable for gated SAEs.
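The mismatch makes sense given the gated parameterisation: instead of one encoder bias `b_enc`, a gated SAE splits encoding into a gating path (which features fire) and a magnitude path (how strongly), each with its own bias. A minimal NumPy sketch of this idea, following the gated-SAE parameterisation with shared encoder weights rescaled by `r_mag`; all attribute names here are illustrative, not the library's actual API:

```python
import numpy as np

def gated_sae_encode(x, W_enc, b_dec, b_gate, b_mag, r_mag):
    """Gated SAE encoder sketch: the gate decides which features are
    active, the magnitude path decides their values. Note there is no
    single 'b_enc' -- only b_gate and b_mag -- which is why an
    assertion expecting 'b_enc' fails for gated SAEs."""
    pre = (x - b_dec) @ W_enc               # shared pre-activations
    gate = (pre + b_gate) > 0               # binary gate per feature
    mag = np.maximum(pre * np.exp(r_mag) + b_mag, 0.0)  # magnitudes
    return gate * mag                       # zero where gate is closed

d_in, d_sae = 4, 8
rng = np.random.default_rng(0)
feats = gated_sae_encode(
    rng.normal(size=(d_in,)),
    rng.normal(size=(d_in, d_sae)),
    np.zeros(d_in), np.zeros(d_sae), np.zeros(d_sae), np.zeros(d_sae),
)
assert feats.shape == (d_sae,)
assert (feats >= 0).all()  # gated + ReLU'd magnitudes are non-negative
```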