jbloomAus / SAELens

Training Sparse Autoencoders on Language Models
https://jbloomaus.github.io/SAELens/
MIT License

Fix pip install in HookedSAETransformer Demo #172

Closed ckkissane closed 3 months ago

ckkissane commented 3 months ago

Description

The current HookedSAETransformer demo is broken in Colab.

It was pip installing from the old TransformerLens branch rather than from SAELens. This PR fixes that, and also updates some of the language in the demo to match SAELens (e.g. HookedSAE -> SAE).
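For context, the change is to the notebook's install cell: a sketch of the old vs. fixed command (the old branch URL shown is illustrative, not the exact one in the notebook; SAELens is published on PyPI as `sae-lens`):

```shell
# Before (broken): installed from an old TransformerLens branch
# (URL is illustrative only):
# pip install git+https://github.com/TransformerLensOrg/TransformerLens@some-old-branch

# After (fixed): install SAELens itself from PyPI
pip install sae-lens
```

Installing the PyPI package keeps the Colab demo pinned to released SAELens code rather than a stale development branch.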

Type of change

Please delete options that are not relevant.

Checklist:

You have tested formatting, typing and unit tests (acceptance tests not currently in use)

Performance Check.

If you have implemented a training change, please indicate precisely how performance changes with respect to the following metrics:

Please link to wandb dashboards with a control and test group.

codecov[bot] commented 3 months ago

Codecov Report

All modified and coverable lines are covered by tests :white_check_mark:

Project coverage is 59.11%. Comparing base (43c93e2) to head (a4bb5f6). Report is 1 commit behind head on main.

Additional details and impacted files

```diff
@@           Coverage Diff           @@
##             main     #172   +/-   ##
=======================================
  Coverage   59.11%   59.11%
=======================================
  Files          25       25
  Lines        2595     2595
  Branches      439      439
=======================================
  Hits         1534     1534
  Misses        984      984
  Partials       77       77
```

:umbrella: View full report in Codecov by Sentry.

jbloomAus commented 3 months ago

Ahh sorry! I thought I'd fixed that!