AnswerDotAI / bert24

Apache License 2.0

Add missing bias config options to attention Linear layers #79

Closed · warner-benjamin closed this 3 months ago

warner-benjamin commented 3 months ago

A handful of attention layers did not surface the `attn_qkv_bias` and `attn_out_bias` config options to their `Linear` layers.
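
For context, here is a minimal sketch of the kind of wiring the fix restores. The class name, `config.hidden_size`, and the `Wqkv`/`Wo` attribute names are illustrative assumptions, not the repo's actual code; only `attn_qkv_bias` and `attn_out_bias` come from the PR description.

```python
import torch.nn as nn
import torch.nn.functional as F


class AttentionSketch(nn.Module):
    """Illustrative attention block passing the bias config options through.

    `attn_qkv_bias` and `attn_out_bias` are the options named in the PR;
    everything else here is a hypothetical stand-in.
    """

    def __init__(self, config):
        super().__init__()
        hidden = config.hidden_size
        # Before the fix, layers like these omitted the bias argument, so
        # nn.Linear silently fell back to its default of bias=True.
        self.Wqkv = nn.Linear(hidden, 3 * hidden, bias=config.attn_qkv_bias)
        self.Wo = nn.Linear(hidden, hidden, bias=config.attn_out_bias)

    def forward(self, x):
        # Packed Q, K, V projection followed by scaled dot-product attention.
        q, k, v = self.Wqkv(x).chunk(3, dim=-1)
        return self.Wo(F.scaled_dot_product_attention(q, k, v))
```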

NohTow commented 3 months ago

LGTM! Thankfully this only affects the padded versions (since the default of `nn.Linear` is `bias=True`).
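
For reference, this is the default NohTow is pointing to: an `nn.Linear` constructed without an explicit `bias` argument still has a bias, so a config option that is never passed through is silently ignored rather than the bias being dropped.

```python
import torch.nn as nn

layer = nn.Linear(8, 8)         # bias argument not passed
print(layer.bias is not None)   # True: nn.Linear defaults to bias=True
```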