chengchingwen / Transformers.jl

Julia Implementation of Transformer models
MIT License

fix typos #130

Closed · ymtoo closed this 1 year ago

ymtoo commented 1 year ago

Fixes the following errors in the Layers.SelfAttention and Layers.CrossAttention convenience constructors:

julia> Layers.SelfAttention(8, 512)
ERROR: UndefVarError: hidden_state not defined
Stacktrace:
 [1] Transformers.Layers.SelfAttention(head::Int64, hidden_size::Int64; dropout::Nothing, return_score::Bool, causal::Bool)
   @ Transformers.Layers ~/.julia/packages/Transformers/nIgPX/src/layers/layer.jl:285
 [2] Transformers.Layers.SelfAttention(head::Int64, hidden_size::Int64)
   @ Transformers.Layers ~/.julia/packages/Transformers/nIgPX/src/layers/layer.jl:284
 [3] top-level scope
   @ REPL[73]:1
 [4] top-level scope
   @ ~/.julia/packages/CUDA/BbliS/src/initialization.jl:52

julia> Layers.CrossAttention(8, 512)
ERROR: UndefVarError: hidden_state not defined
Stacktrace:
 [1] Transformers.Layers.CrossAttention(head::Int64, hidden_size::Int64; dropout::Nothing, return_score::Bool)
   @ Transformers.Layers ~/.julia/packages/Transformers/nIgPX/src/layers/layer.jl:319
 [2] Transformers.Layers.CrossAttention(head::Int64, hidden_size::Int64)
   @ Transformers.Layers ~/.julia/packages/Transformers/nIgPX/src/layers/layer.jl:318
 [3] top-level scope
   @ REPL[74]:1
 [4] top-level scope
   @ ~/.julia/packages/CUDA/BbliS/src/initialization.jl:52
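
Both traces point at the same kind of typo in src/layers/layer.jl: each constructor declares a hidden_size argument, but its body apparently refers to an undefined hidden_state variable. A minimal sketch of the bug and the fix, assuming that is the shape of the problem (the struct and function names below are illustrative, not the package's exact code):

```julia
# Simplified stand-in for the layer type; the real Transformers.jl
# SelfAttention/CrossAttention layers are more involved.
struct Attention
    head::Int
    hidden_size::Int
end

# Buggy convenience constructor: `hidden_state` is never defined, so
# Julia throws `UndefVarError: hidden_state not defined` at call time.
attention_buggy(head::Int, hidden_size::Int) = Attention(head, hidden_state)

# Fixed constructor: refer to the argument that was actually declared.
attention_fixed(head::Int, hidden_size::Int) = Attention(head, hidden_size)

attention_fixed(8, 512)  # returns Attention(8, 512), no error
```

With the typos corrected, Layers.SelfAttention(8, 512) and Layers.CrossAttention(8, 512) should construct their layers instead of throwing.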
chengchingwen commented 1 year ago

Thanks!