jbloomAus / SAELens

Training Sparse Autoencoders on Language Models
https://jbloomaus.github.io/SAELens/
MIT License

add expected perf for pretrained #179

Closed jbloomAus closed 3 months ago

jbloomAus commented 3 months ago

Description

pretrained.yaml now stores the expected L0 / variance explained for each pretrained SAE, checked by a test in tests/benchmark. This is a first step towards having pre-trained model loading in CI. We'll likely need a GPU runner that can cache downloaded SAEs to do this properly.
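As a rough illustration of the kind of check such a benchmark test could perform, the sketch below computes L0 (mean number of active features per example) and variance explained for a reconstruction, then compares them against stored expected values within a tolerance. The field names (`expected_l0`, `expected_var_explained`) and the tolerance are assumptions for illustration, not SAELens's actual schema.

```python
import numpy as np

def l0(feature_acts: np.ndarray) -> float:
    """Mean number of nonzero feature activations per example."""
    return float((feature_acts != 0).sum(axis=-1).mean())

def variance_explained(original: np.ndarray, reconstruction: np.ndarray) -> float:
    """1 - (residual sum of squares / total variance of the original)."""
    residual = ((original - reconstruction) ** 2).sum()
    total = ((original - original.mean(axis=0)) ** 2).sum()
    return float(1.0 - residual / total)

# Toy data standing in for SAE feature activations and reconstructions.
rng = np.random.default_rng(0)
acts = rng.normal(size=(8, 16))
feats = np.where(rng.random((8, 64)) < 0.1, rng.random((8, 64)), 0.0)
recon = acts + rng.normal(scale=0.05, size=acts.shape)

# Hypothetical expected values, as they might be stored in the YAML.
expected = {
    "expected_l0": l0(feats),
    "expected_var_explained": variance_explained(acts, recon),
}

# The benchmark check: recomputed metrics must match stored ones.
assert abs(l0(feats) - expected["expected_l0"]) < 0.1
assert abs(variance_explained(acts, recon) - expected["expected_var_explained"]) < 0.01
```

A tolerance-based comparison like this guards against silent regressions in SAE loading (e.g. mismatched weights or normalization) without requiring bit-exact reproducibility across hardware.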

codecov[bot] commented 3 months ago

Codecov Report

All modified and coverable lines are covered by tests :white_check_mark:

Project coverage is 59.34%. Comparing base (03f071b) to head (125926e). Report is 2 commits behind head on main.

Additional details and impacted files

```diff
@@            Coverage Diff             @@
##             main     #179      +/-   ##
==========================================
+ Coverage   59.25%   59.34%   +0.09%
==========================================
  Files          25       25
  Lines        2604     2610       +6
  Branches      440      440
==========================================
+ Hits         1543     1549       +6
  Misses        984      984
  Partials       77       77
```
