chengchingwen / Transformers.jl

Julia Implementation of Transformer models
MIT License

README example failing #131

Closed · ViralBShah closed this issue 1 year ago

ViralBShah commented 1 year ago

I'm using Julia 1.8 and just the released version of Transformers.jl:

julia> @assert reshape(decode(textencoder, sample.token), :) == [
           "[CLS]", "peter", "piper", "picked", "a", "peck", "of", "pick", "##led", "peppers", "[SEP]",
           "fuzzy", "wu", "##zzy",  "was", "a", "bear", "[SEP]"
       ]
ERROR: AssertionError: reshape(decode(textencoder, sample.token), :) == ["[CLS]", "peter", "piper", "picked", "a", "peck", "of", "pick", "##led", "peppers", "[SEP]", "fuzzy", "wu", "##zzy", "was", "a", "bear", "[SEP]"]
Stacktrace:
 [1] top-level scope
   @ REPL[11]:1
 [2] top-level scope
   @ ~/.julia/packages/CUDA/BbliS/src/initialization.jl:52
chengchingwen commented 1 year ago

Sorry, I pasted the expected output of a different model. The correct model for that README example should be hgf"bert-base-uncased".
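
For reference, a minimal sketch of the corrected README snippet, assuming the hgf string macro and encode/decode API from Transformers.HuggingFace in the released Transformers.jl at the time (the model name is the only change suggested above; the surrounding calls follow the README's own example and would download the model on first use):

```julia
using Transformers
using Transformers.HuggingFace

# Load the uncased tokenizer/model pair instead of the cased one,
# so the lowercase tokens in the expected output ("peter", "piper", ...) match.
textencoder, bert_model = hgf"bert-base-uncased"

text1 = "Peter Piper picked a peck of pickled peppers"
text2 = "Fuzzy Wuzzy was a bear"

# One batch containing a pair of sentences, as in the README example.
sample = encode(textencoder, [[text1, text2]])

# With the uncased model the decoded tokens should match the README's list.
reshape(decode(textencoder, sample.token), :)
```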