EleutherAI / elk

Keeping language models honest by directly eliciting knowledge encoded in their activations.
MIT License

remove duplicate normalization #274

Closed lauritowal closed 1 year ago

lauritowal commented 1 year ago

We already call self.norm(x) in fit, so running it again in forward normalizes the activations twice. I think that's a mistake, so I've removed the call from forward.
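A minimal sketch of the pattern being fixed (hypothetical `Probe` class and names, not the actual elk code): `fit` normalizes the activations once before optimizing, so a second `self.norm(x)` inside `forward` would normalize the same tensors twice during training.

```python
import torch
from torch import nn


class Probe(nn.Module):
    """Hypothetical probe illustrating the duplicate-normalization bug."""

    def __init__(self, dim: int):
        super().__init__()
        self.norm = nn.LayerNorm(dim, elementwise_affine=False)
        self.linear = nn.Linear(dim, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Fixed version: no self.norm(x) here. Before the fix, forward
        # re-normalized x even though fit had already done so.
        return self.linear(x).squeeze(-1)

    def fit(self, x: torch.Tensor, y: torch.Tensor, steps: int = 100) -> None:
        x = self.norm(x)  # normalization happens exactly once, here
        opt = torch.optim.Adam(self.parameters(), lr=1e-2)
        loss_fn = nn.BCEWithLogitsLoss()
        for _ in range(steps):
            opt.zero_grad()
            loss = loss_fn(self(x), y)
            loss.backward()
            opt.step()
```

One consequence of this fix is that inference-time callers of `forward` are now responsible for passing activations that have already been normalized, since `forward` no longer does it for them.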