jbloomAus / SAELens

Training Sparse Autoencoders on Language Models
https://jbloomaus.github.io/SAELens/
MIT License

fix: share config defaulting between hf and local loading #169

Closed jbloomAus closed 4 months ago

jbloomAus commented 4 months ago

Description

We temporarily had a situation where SAE.load_from_pretrained wasn't using the defaulting system we've got for from_pretrained (SAEs on HuggingFace). This refactor just moves the defaults (a temporary hack) into a common function shared by both loading paths.

Fixes #168
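A minimal sketch of the refactor described above: both loading paths (HuggingFace `from_pretrained` and local `load_from_pretrained`) funnel their raw config dicts through one shared defaulting helper, so the two paths can't drift apart. The helper name and the specific default keys below are illustrative assumptions, not the actual SAELens API.

```python
# Hypothetical sketch, assuming a shared defaulting helper; the function
# names and default keys are illustrative, not taken from SAELens itself.
from typing import Any


def handle_config_defaulting(cfg: dict[str, Any]) -> dict[str, Any]:
    """Fill in keys missing from older saved SAE configs (assumed defaults)."""
    defaults = {
        "normalize_activations": "none",
        "dtype": "float32",
        "device": "cpu",
    }
    # Explicit values in the saved config win over the defaults.
    return {**defaults, **cfg}


def load_config_from_hf(raw_cfg: dict[str, Any]) -> dict[str, Any]:
    # from_pretrained path (SAEs hosted on HuggingFace).
    return handle_config_defaulting(raw_cfg)


def load_config_from_disk(raw_cfg: dict[str, Any]) -> dict[str, Any]:
    # load_from_pretrained path (local checkpoints) -- now uses
    # the same helper instead of its own copy of the defaults.
    return handle_config_defaulting(raw_cfg)
```

With this shape, fixing or extending a default in `handle_config_defaulting` applies to both entry points at once, which is the point of the refactor.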


codecov[bot] commented 4 months ago

Codecov Report

Attention: Patch coverage is 74.07407%, with 7 lines in your changes missing coverage. Please review.

Project coverage is 58.54%. Comparing base (f1908a3) to head (902fa49). Report is 1 commit behind head on main.

Files Patch % Lines
sae_lens/toolkit/pretrained_sae_loaders.py 74.07% 6 Missing and 1 partial :warning:
Additional details and impacted files

```diff
@@            Coverage Diff             @@
##             main     #169      +/-   ##
==========================================
+ Coverage   56.35%   58.54%   +2.18%
==========================================
  Files          25       25
  Lines        2603     2603
  Branches      440      440
==========================================
+ Hits         1467     1524      +57
+ Misses       1061     1002      -59
- Partials       75       77       +2
```

:umbrella: View full report in Codecov by Sentry.