chengchingwen / Transformers.jl

Julia Implementation of Transformer models

Bug(?) while displaying a huggingface model #123

Closed. atantos closed this issue 1 year ago

atantos commented 1 year ago

Hi! I am trying to download and use a Hugging Face pre-trained model, and I get the following error related to print_tree(). I suspect this is a bug.

julia> model = hgf"nlpaueb/bert-base-greek-uncased-v1"
(BertTextEncoder(
├─ TextTokenizer(MatchTokenization(WordPieceTokenization(bert_uncased_tokenizer, WordPiece(vocab_size=35000, unk=[UNK], max_char=200)), 5 patterns)),
├─ vocab = Vocab{String, SizedArray}(size = 35000, unk = [UNK], unki = 101),
├─ startsym = [CLS],
├─ endsym = [SEP],
├─ padsym = [PAD],
├─ trunc = 512,
└─ process = Pipelines:
  ╰─ target[tok] := TextEncodeBase.nestedcall(string_getvalue, source)
  ╰─ target[tok] := Transformers.Basic.grouping_sentence(target.tok)
  ╰─ target[tok_segment] := SequenceTemplate{String}([CLS]:<type=1> Input:<type=1> [SEP]:<type=1> (Input:<type=2> [SEP]:<type=2>)<type+=1>...)(target.tok)
  ╰─ target[tok] := TextEncodeBase.nestedcall(first, target.tok_segment)
  ╰─ target[segment] := TextEncodeBase.nestedcall(last, target.tok_segment)
  ╰─ target[trunc_tok] := TextEncodeBase.trunc_and_pad(512, [PAD], tail, tail)(target.tok)
  ╰─ target[trunc_len] := TextEncodeBase.nestedmaxlength(target.trunc_tok)
  ╰─ target[lpad] := false
  ╰─ target[mask] := Transformers.Basic.getmask(target.tok, target.trunc_len, target.lpad)
  ╰─ target[tok] := TextEncodeBase.nested2batch(target.trunc_tok)
  ╰─ target[segment] := TextEncodeBase.trunc_and_pad(512, 1, tail, tail)(target.segment)
  ╰─ target[segment] := TextEncodeBase.nested2batch(target.segment)
  ╰─ target[input] := (NamedTuple{(:tok, :segment)} ∘ tuple)(target.tok, target.segment)
  ╰─ target := (target.input, target.mask)
), Error showing value of type Tuple{Transformers.BidirectionalEncoder.BertTextEncoder{Transformers.Basic.TextTokenizer{TextEncodeBase.MatchTokenization{Transformers.BidirectionalEncoder.WordPieceTokenization{Transformers.BidirectionalEncoder.BertUnCasedPreTokenization}}}, TextEncodeBase.Vocab{String, StaticArraysCore.SizedVector{35000, String, Vector{String}}}, FuncPipelines.Pipelines{Tuple{FuncPipelines.Pipeline{:tok, FuncPipelines.ApplyN{1, Base.Fix1{typeof(TextEncodeBase.nestedcall), typeof(Transformers.Basic.string_getvalue)}}}, FuncPipelines.Pipeline{:tok, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:tok, typeof(Transformers.Basic.grouping_sentence)}}}, FuncPipelines.Pipeline{:tok_segment, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:tok, TextEncodeBase.SequenceTemplate{String, Tuple{TextEncodeBase.ConstTerm{String}, TextEncodeBase.InputTerm{String}, TextEncodeBase.ConstTerm{String}, TextEncodeBase.RepeatedTerm{String, Tuple{TextEncodeBase.InputTerm{String}, TextEncodeBase.ConstTerm{String}}}}}}}}, FuncPipelines.Pipeline{:tok, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:tok_segment, Base.Fix1{typeof(TextEncodeBase.nestedcall), typeof(first)}}}}, FuncPipelines.Pipeline{:segment, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:tok_segment, Base.Fix1{typeof(TextEncodeBase.nestedcall), typeof(last)}}}}, FuncPipelines.Pipeline{:trunc_tok, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:tok, FuncPipelines.FixRest{typeof(TextEncodeBase.trunc_and_pad), Tuple{Int64, String, Symbol, Symbol}}}}}, FuncPipelines.Pipeline{:trunc_len, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:trunc_tok, typeof(TextEncodeBase.nestedmaxlength)}}}, FuncPipelines.PipeVar{:lpad, FuncPipelines.ApplyN{0, FuncPipelines.Identity{Bool}}}, FuncPipelines.Pipeline{:mask, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{(:tok, :trunc_len, :lpad), typeof(getmask)}}}, FuncPipelines.Pipeline{:tok, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:trunc_tok, typeof(TextEncodeBase.nested2batch)}}}, FuncPipelines.Pipeline{:segment, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:segment, FuncPipelines.FixRest{typeof(TextEncodeBase.trunc_and_pad), Tuple{Int64, Int64, Symbol, Symbol}}}}}, FuncPipelines.Pipeline{:segment, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:segment, typeof(TextEncodeBase.nested2batch)}}}, FuncPipelines.Pipeline{:input, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{(:tok, :segment), ComposedFunction{Type{NamedTuple{(:tok, :segment)}}, typeof(tuple)}}}}, FuncPipelines.PipeGet{(:input, :mask)}}}}, Transformers.HuggingFace.HGFBertModel{Transformers.HuggingFace.HGFBertEmbeddings{Transformers.HuggingFace.FakeTHEmbedding{Matrix{Float32}}, Transformers.HuggingFace.FakeTHEmbedding{Matrix{Float32}}, Transformers.HuggingFace.FakeTHEmbedding{Matrix{Float32}}, Transformers.HuggingFace.FakeTHLayerNorm{Vector{Float32}}}, Transformers.HuggingFace.HGFBertEncoder{12, Transformers.HuggingFace.FakeTHModuleList{12, NTuple{12, Transformers.HuggingFace.HGFBertLayer{Nothing, Transformers.HuggingFace.HGFBertAttention{Transformers.HuggingFace.HGFBertSelfAttention{Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}, Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}, Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}}, Transformers.HuggingFace.HGFBertSelfOutput{Transformers.HuggingFace.FakeTHLayerNorm{Vector{Float32}}, Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}}}, 
Transformers.HuggingFace.HGFBertIntermediate{typeof(NNlib.gelu), Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}}, Transformers.HuggingFace.HGFBertOutput{Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}, Transformers.HuggingFace.FakeTHLayerNorm{Vector{Float32}}}}}}}, Transformers.HuggingFace.HGFBertPooler{Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}}}}:
ERROR: MethodError: no method matching print_tree(::typeof(Transformers.HuggingFace._printnode), ::IOContext{Base.TTY}, ::Transformers.HuggingFace.HGFBertModel{Transformers.HuggingFace.HGFBertEmbeddings{Transformers.HuggingFace.FakeTHEmbedding{Matrix{Float32}}, Transformers.HuggingFace.FakeTHEmbedding{Matrix{Float32}}, Transformers.HuggingFace.FakeTHEmbedding{Matrix{Float32}}, Transformers.HuggingFace.FakeTHLayerNorm{Vector{Float32}}}, Transformers.HuggingFace.HGFBertEncoder{12, Transformers.HuggingFace.FakeTHModuleList{12, NTuple{12, Transformers.HuggingFace.HGFBertLayer{Nothing, Transformers.HuggingFace.HGFBertAttention{Transformers.HuggingFace.HGFBertSelfAttention{Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}, Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}, Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}}, Transformers.HuggingFace.HGFBertSelfOutput{Transformers.HuggingFace.FakeTHLayerNorm{Vector{Float32}}, Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}}}, Transformers.HuggingFace.HGFBertIntermediate{typeof(NNlib.gelu), Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}}, Transformers.HuggingFace.HGFBertOutput{Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}, Transformers.HuggingFace.FakeTHLayerNorm{Vector{Float32}}}}}}}, Transformers.HuggingFace.HGFBertPooler{Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}}}, ::Int64)
Closest candidates are:
  print_tree(::Function, ::IO, ::Any; maxdepth, indicate_truncation, charset, printkeys, depth, prefix) at ~/.julia/packages/AbstractTrees/x9S7q/src/printing.jl:188
  print_tree(::IO, ::Any; kw...) at ~/.julia/packages/AbstractTrees/x9S7q/src/printing.jl:269
  print_tree(::Any; kw...) at ~/.julia/packages/AbstractTrees/x9S7q/src/printing.jl:270
Stacktrace:
  [1] show(io::IOContext{Base.TTY}, x::Transformers.HuggingFace.HGFBertModel{Transformers.HuggingFace.HGFBertEmbeddings{Transformers.HuggingFace.FakeTHEmbedding{Matrix{Float32}}, Transformers.HuggingFace.FakeTHEmbedding{Matrix{Float32}}, Transformers.HuggingFace.FakeTHEmbedding{Matrix{Float32}}, Transformers.HuggingFace.FakeTHLayerNorm{Vector{Float32}}}, Transformers.HuggingFace.HGFBertEncoder{12, Transformers.HuggingFace.FakeTHModuleList{12, NTuple{12, Transformers.HuggingFace.HGFBertLayer{Nothing, Transformers.HuggingFace.HGFBertAttention{Transformers.HuggingFace.HGFBertSelfAttention{Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}, Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}, Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}}, Transformers.HuggingFace.HGFBertSelfOutput{Transformers.HuggingFace.FakeTHLayerNorm{Vector{Float32}}, Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}}}, Transformers.HuggingFace.HGFBertIntermediate{typeof(NNlib.gelu), Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}}, Transformers.HuggingFace.HGFBertOutput{Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}, Transformers.HuggingFace.FakeTHLayerNorm{Vector{Float32}}}}}}}, Transformers.HuggingFace.HGFBertPooler{Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}}}; depth::Int64)
    @ Transformers.HuggingFace ~/.julia/packages/Transformers/A1N7i/src/huggingface/models/base.jl:144
  [2] show(io::IOContext{Base.TTY}, x::Transformers.HuggingFace.HGFBertModel{Transformers.HuggingFace.HGFBertEmbeddings{Transformers.HuggingFace.FakeTHEmbedding{Matrix{Float32}}, Transformers.HuggingFace.FakeTHEmbedding{Matrix{Float32}}, Transformers.HuggingFace.FakeTHEmbedding{Matrix{Float32}}, Transformers.HuggingFace.FakeTHLayerNorm{Vector{Float32}}}, Transformers.HuggingFace.HGFBertEncoder{12, Transformers.HuggingFace.FakeTHModuleList{12, NTuple{12, Transformers.HuggingFace.HGFBertLayer{Nothing, Transformers.HuggingFace.HGFBertAttention{Transformers.HuggingFace.HGFBertSelfAttention{Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}, Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}, Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}}, Transformers.HuggingFace.HGFBertSelfOutput{Transformers.HuggingFace.FakeTHLayerNorm{Vector{Float32}}, Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}}}, Transformers.HuggingFace.HGFBertIntermediate{typeof(NNlib.gelu), Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}}, Transformers.HuggingFace.HGFBertOutput{Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}, Transformers.HuggingFace.FakeTHLayerNorm{Vector{Float32}}}}}}}, Transformers.HuggingFace.HGFBertPooler{Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}}})
    @ Transformers.HuggingFace ~/.julia/packages/Transformers/A1N7i/src/huggingface/models/base.jl:143
  [3] show_delim_array(io::IOContext{Base.TTY}, itr::Tuple{Transformers.BidirectionalEncoder.BertTextEncoder{Transformers.Basic.TextTokenizer{TextEncodeBase.MatchTokenization{Transformers.BidirectionalEncoder.WordPieceTokenization{Transformers.BidirectionalEncoder.BertUnCasedPreTokenization}}}, TextEncodeBase.Vocab{String, StaticArraysCore.SizedVector{35000, String, Vector{String}}}, FuncPipelines.Pipelines{Tuple{FuncPipelines.Pipeline{:tok, FuncPipelines.ApplyN{1, Base.Fix1{typeof(TextEncodeBase.nestedcall), typeof(Transformers.Basic.string_getvalue)}}}, FuncPipelines.Pipeline{:tok, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:tok, typeof(Transformers.Basic.grouping_sentence)}}}, FuncPipelines.Pipeline{:tok_segment, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:tok, TextEncodeBase.SequenceTemplate{String, Tuple{TextEncodeBase.ConstTerm{String}, TextEncodeBase.InputTerm{String}, TextEncodeBase.ConstTerm{String}, TextEncodeBase.RepeatedTerm{String, Tuple{TextEncodeBase.InputTerm{String}, TextEncodeBase.ConstTerm{String}}}}}}}}, FuncPipelines.Pipeline{:tok, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:tok_segment, Base.Fix1{typeof(TextEncodeBase.nestedcall), typeof(first)}}}}, FuncPipelines.Pipeline{:segment, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:tok_segment, Base.Fix1{typeof(TextEncodeBase.nestedcall), typeof(last)}}}}, FuncPipelines.Pipeline{:trunc_tok, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:tok, FuncPipelines.FixRest{typeof(TextEncodeBase.trunc_and_pad), Tuple{Int64, String, Symbol, Symbol}}}}}, FuncPipelines.Pipeline{:trunc_len, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:trunc_tok, typeof(TextEncodeBase.nestedmaxlength)}}}, FuncPipelines.PipeVar{:lpad, FuncPipelines.ApplyN{0, FuncPipelines.Identity{Bool}}}, FuncPipelines.Pipeline{:mask, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{(:tok, :trunc_len, :lpad), typeof(getmask)}}}, FuncPipelines.Pipeline{:tok, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:trunc_tok, typeof(TextEncodeBase.nested2batch)}}}, FuncPipelines.Pipeline{:segment, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:segment, FuncPipelines.FixRest{typeof(TextEncodeBase.trunc_and_pad), Tuple{Int64, Int64, Symbol, Symbol}}}}}, FuncPipelines.Pipeline{:segment, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:segment, typeof(TextEncodeBase.nested2batch)}}}, FuncPipelines.Pipeline{:input, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{(:tok, :segment), ComposedFunction{Type{NamedTuple{(:tok, :segment)}}, typeof(tuple)}}}}, FuncPipelines.PipeGet{(:input, :mask)}}}}, Transformers.HuggingFace.HGFBertModel{Transformers.HuggingFace.HGFBertEmbeddings{Transformers.HuggingFace.FakeTHEmbedding{Matrix{Float32}}, Transformers.HuggingFace.FakeTHEmbedding{Matrix{Float32}}, Transformers.HuggingFace.FakeTHEmbedding{Matrix{Float32}}, Transformers.HuggingFace.FakeTHLayerNorm{Vector{Float32}}}, Transformers.HuggingFace.HGFBertEncoder{12, Transformers.HuggingFace.FakeTHModuleList{12, NTuple{12, Transformers.HuggingFace.HGFBertLayer{Nothing, Transformers.HuggingFace.HGFBertAttention{Transformers.HuggingFace.HGFBertSelfAttention{Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}, Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}, Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}}, Transformers.HuggingFace.HGFBertSelfOutput{Transformers.HuggingFace.FakeTHLayerNorm{Vector{Float32}}, Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}}}, 
Transformers.HuggingFace.HGFBertIntermediate{typeof(NNlib.gelu), Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}}, Transformers.HuggingFace.HGFBertOutput{Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}, Transformers.HuggingFace.FakeTHLayerNorm{Vector{Float32}}}}}}}, Transformers.HuggingFace.HGFBertPooler{Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}}}}, op::Char, delim::Char, cl::Char, delim_one::Bool, i1::Int64, n::Int64)
    @ Base ./show.jl:1244
  [4] show_delim_array
    @ ./show.jl:1229 [inlined]
  [5] show
    @ ./show.jl:1262 [inlined]
  [6] show(io::IOContext{Base.TTY}, #unused#::MIME{Symbol("text/plain")}, x::Tuple{Transformers.BidirectionalEncoder.BertTextEncoder{Transformers.Basic.TextTokenizer{TextEncodeBase.MatchTokenization{Transformers.BidirectionalEncoder.WordPieceTokenization{Transformers.BidirectionalEncoder.BertUnCasedPreTokenization}}}, TextEncodeBase.Vocab{String, StaticArraysCore.SizedVector{35000, String, Vector{String}}}, FuncPipelines.Pipelines{Tuple{FuncPipelines.Pipeline{:tok, FuncPipelines.ApplyN{1, Base.Fix1{typeof(TextEncodeBase.nestedcall), typeof(Transformers.Basic.string_getvalue)}}}, FuncPipelines.Pipeline{:tok, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:tok, typeof(Transformers.Basic.grouping_sentence)}}}, FuncPipelines.Pipeline{:tok_segment, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:tok, TextEncodeBase.SequenceTemplate{String, Tuple{TextEncodeBase.ConstTerm{String}, TextEncodeBase.InputTerm{String}, TextEncodeBase.ConstTerm{String}, TextEncodeBase.RepeatedTerm{String, Tuple{TextEncodeBase.InputTerm{String}, TextEncodeBase.ConstTerm{String}}}}}}}}, FuncPipelines.Pipeline{:tok, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:tok_segment, Base.Fix1{typeof(TextEncodeBase.nestedcall), typeof(first)}}}}, FuncPipelines.Pipeline{:segment, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:tok_segment, Base.Fix1{typeof(TextEncodeBase.nestedcall), typeof(last)}}}}, FuncPipelines.Pipeline{:trunc_tok, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:tok, FuncPipelines.FixRest{typeof(TextEncodeBase.trunc_and_pad), Tuple{Int64, String, Symbol, Symbol}}}}}, FuncPipelines.Pipeline{:trunc_len, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:trunc_tok, typeof(TextEncodeBase.nestedmaxlength)}}}, FuncPipelines.PipeVar{:lpad, FuncPipelines.ApplyN{0, FuncPipelines.Identity{Bool}}}, FuncPipelines.Pipeline{:mask, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{(:tok, :trunc_len, :lpad), typeof(getmask)}}}, FuncPipelines.Pipeline{:tok, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:trunc_tok, typeof(TextEncodeBase.nested2batch)}}}, FuncPipelines.Pipeline{:segment, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:segment, FuncPipelines.FixRest{typeof(TextEncodeBase.trunc_and_pad), Tuple{Int64, Int64, Symbol, Symbol}}}}}, FuncPipelines.Pipeline{:segment, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:segment, typeof(TextEncodeBase.nested2batch)}}}, FuncPipelines.Pipeline{:input, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{(:tok, :segment), ComposedFunction{Type{NamedTuple{(:tok, :segment)}}, typeof(tuple)}}}}, FuncPipelines.PipeGet{(:input, :mask)}}}}, Transformers.HuggingFace.HGFBertModel{Transformers.HuggingFace.HGFBertEmbeddings{Transformers.HuggingFace.FakeTHEmbedding{Matrix{Float32}}, Transformers.HuggingFace.FakeTHEmbedding{Matrix{Float32}}, Transformers.HuggingFace.FakeTHEmbedding{Matrix{Float32}}, Transformers.HuggingFace.FakeTHLayerNorm{Vector{Float32}}}, Transformers.HuggingFace.HGFBertEncoder{12, Transformers.HuggingFace.FakeTHModuleList{12, NTuple{12, Transformers.HuggingFace.HGFBertLayer{Nothing, Transformers.HuggingFace.HGFBertAttention{Transformers.HuggingFace.HGFBertSelfAttention{Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}, Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}, Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}}, Transformers.HuggingFace.HGFBertSelfOutput{Transformers.HuggingFace.FakeTHLayerNorm{Vector{Float32}}, Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}}}, 
Transformers.HuggingFace.HGFBertIntermediate{typeof(NNlib.gelu), Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}}, Transformers.HuggingFace.HGFBertOutput{Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}, Transformers.HuggingFace.FakeTHLayerNorm{Vector{Float32}}}}}}}, Transformers.HuggingFace.HGFBertPooler{Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}}}})
    @ Base.Multimedia ./multimedia.jl:47
  [7] display(d::REPL.REPLDisplay{REPL.LineEditREPL}, mime::MIME{Symbol("text/plain")}, x::Tuple{Transformers.BidirectionalEncoder.BertTextEncoder{Transformers.Basic.TextTokenizer{TextEncodeBase.MatchTokenization{Transformers.BidirectionalEncoder.WordPieceTokenization{Transformers.BidirectionalEncoder.BertUnCasedPreTokenization}}}, TextEncodeBase.Vocab{String, StaticArraysCore.SizedVector{35000, String, Vector{String}}}, FuncPipelines.Pipelines{Tuple{FuncPipelines.Pipeline{:tok, FuncPipelines.ApplyN{1, Base.Fix1{typeof(TextEncodeBase.nestedcall), typeof(Transformers.Basic.string_getvalue)}}}, FuncPipelines.Pipeline{:tok, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:tok, typeof(Transformers.Basic.grouping_sentence)}}}, FuncPipelines.Pipeline{:tok_segment, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:tok, TextEncodeBase.SequenceTemplate{String, Tuple{TextEncodeBase.ConstTerm{String}, TextEncodeBase.InputTerm{String}, TextEncodeBase.ConstTerm{String}, TextEncodeBase.RepeatedTerm{String, Tuple{TextEncodeBase.InputTerm{String}, TextEncodeBase.ConstTerm{String}}}}}}}}, FuncPipelines.Pipeline{:tok, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:tok_segment, Base.Fix1{typeof(TextEncodeBase.nestedcall), typeof(first)}}}}, FuncPipelines.Pipeline{:segment, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:tok_segment, Base.Fix1{typeof(TextEncodeBase.nestedcall), typeof(last)}}}}, FuncPipelines.Pipeline{:trunc_tok, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:tok, FuncPipelines.FixRest{typeof(TextEncodeBase.trunc_and_pad), Tuple{Int64, String, Symbol, Symbol}}}}}, FuncPipelines.Pipeline{:trunc_len, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:trunc_tok, typeof(TextEncodeBase.nestedmaxlength)}}}, FuncPipelines.PipeVar{:lpad, FuncPipelines.ApplyN{0, FuncPipelines.Identity{Bool}}}, FuncPipelines.Pipeline{:mask, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{(:tok, :trunc_len, :lpad), typeof(getmask)}}}, FuncPipelines.Pipeline{:tok, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:trunc_tok, typeof(TextEncodeBase.nested2batch)}}}, FuncPipelines.Pipeline{:segment, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:segment, FuncPipelines.FixRest{typeof(TextEncodeBase.trunc_and_pad), Tuple{Int64, Int64, Symbol, Symbol}}}}}, FuncPipelines.Pipeline{:segment, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{:segment, typeof(TextEncodeBase.nested2batch)}}}, FuncPipelines.Pipeline{:input, FuncPipelines.ApplyN{2, FuncPipelines.ApplySyms{(:tok, :segment), ComposedFunction{Type{NamedTuple{(:tok, :segment)}}, typeof(tuple)}}}}, FuncPipelines.PipeGet{(:input, :mask)}}}}, Transformers.HuggingFace.HGFBertModel{Transformers.HuggingFace.HGFBertEmbeddings{Transformers.HuggingFace.FakeTHEmbedding{Matrix{Float32}}, Transformers.HuggingFace.FakeTHEmbedding{Matrix{Float32}}, Transformers.HuggingFace.FakeTHEmbedding{Matrix{Float32}}, Transformers.HuggingFace.FakeTHLayerNorm{Vector{Float32}}}, Transformers.HuggingFace.HGFBertEncoder{12, Transformers.HuggingFace.FakeTHModuleList{12, NTuple{12, Transformers.HuggingFace.HGFBertLayer{Nothing, Transformers.HuggingFace.HGFBertAttention{Transformers.HuggingFace.HGFBertSelfAttention{Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}, Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}, Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}}, Transformers.HuggingFace.HGFBertSelfOutput{Transformers.HuggingFace.FakeTHLayerNorm{Vector{Float32}}, Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}}}, 
Transformers.HuggingFace.HGFBertIntermediate{typeof(NNlib.gelu), Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}}, Transformers.HuggingFace.HGFBertOutput{Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}, Transformers.HuggingFace.FakeTHLayerNorm{Vector{Float32}}}}}}}, Transformers.HuggingFace.HGFBertPooler{Transformers.HuggingFace.FakeTHLinear{Matrix{Float32}, Vector{Float32}}}}})
    @ OhMyREPL ~/.julia/packages/OhMyREPL/oDZvT/src/output_prompt_overwrite.jl:8
  [8] display(d::REPL.REPLDisplay, x::Any)
    @ REPL /Applications/Julia-1.8.app/Contents/Resources/julia/share/julia/stdlib/v1.8/REPL/src/REPL.jl:272
  [9] display(x::Any)
    @ Base.Multimedia ./multimedia.jl:328
 [10] #invokelatest#2
    @ ./essentials.jl:729 [inlined]
 [11] invokelatest
    @ ./essentials.jl:726 [inlined]
 [12] print_response(errio::IO, response::Any, show_value::Bool, have_color::Bool, specialdisplay::Union{Nothing, AbstractDisplay})
    @ REPL /Applications/Julia-1.8.app/Contents/Resources/julia/share/julia/stdlib/v1.8/REPL/src/REPL.jl:296
 [13] (::REPL.var"#45#46"{REPL.LineEditREPL, Pair{Any, Bool}, Bool, Bool})(io::Any)
    @ REPL /Applications/Julia-1.8.app/Contents/Resources/julia/share/julia/stdlib/v1.8/REPL/src/REPL.jl:278
 [14] with_repl_linfo(f::Any, repl::REPL.LineEditREPL)
    @ REPL /Applications/Julia-1.8.app/Contents/Resources/julia/share/julia/stdlib/v1.8/REPL/src/REPL.jl:521
 [15] print_response(repl::REPL.AbstractREPL, response::Any, show_value::Bool, have_color::Bool)
    @ REPL /Applications/Julia-1.8.app/Contents/Resources/julia/share/julia/stdlib/v1.8/REPL/src/REPL.jl:276
 [16] (::REPL.var"#do_respond#66"{Bool, Bool, REPL.var"#77#87"{REPL.LineEditREPL, REPL.REPLHistoryProvider}, REPL.LineEditREPL, REPL.LineEdit.Prompt})(s::REPL.LineEdit.MIState, buf::Any, ok::Bool)
    @ REPL /Applications/Julia-1.8.app/Contents/Resources/julia/share/julia/stdlib/v1.8/REPL/src/REPL.jl:857
 [17] (::VSCodeServer.var"#98#101"{REPL.var"#do_respond#66"{Bool, Bool, REPL.var"#77#87"{REPL.LineEditREPL, REPL.REPLHistoryProvider}, REPL.LineEditREPL, REPL.LineEdit.Prompt}})(mi::REPL.LineEdit.MIState, buf::IOBuffer, ok::Bool)
    @ VSCodeServer ~/.vscode/extensions/julialang.language-julia-1.38.2/scripts/packages/VSCodeServer/src/repl.jl:122
 [18] #invokelatest#2
    @ ./essentials.jl:729 [inlined]
 [19] invokelatest
    @ ./essentials.jl:726 [inlined]
 [20] run_interface(terminal::REPL.Terminals.TextTerminal, m::REPL.LineEdit.ModalInterface, s::REPL.LineEdit.MIState)
    @ REPL.LineEdit /Applications/Julia-1.8.app/Contents/Resources/julia/share/julia/stdlib/v1.8/REPL/src/LineEdit.jl:2510
 [21] run_frontend(repl::REPL.LineEditREPL, backend::REPL.REPLBackendRef)
    @ REPL /Applications/Julia-1.8.app/Contents/Resources/julia/share/julia/stdlib/v1.8/REPL/src/REPL.jl:1248
 [22] (::REPL.var"#49#54"{REPL.LineEditREPL, REPL.REPLBackendRef})()
    @ REPL ./task.jl:484
chengchingwen commented 1 year ago

This is an error in displaying the model. The download finished correctly, so you can still use the model.
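
For example, here is a minimal sketch of how you could keep working with the returned objects while avoiding the broken show (the sample sentence and variable names are placeholders, and I'm assuming the TextEncodeBase encode entry point, since the returned BertTextEncoder is a TextEncodeBase encoder):

using Transformers, Transformers.HuggingFace
using TextEncodeBase: encode  # assumed import for encoding text with the returned encoder

# A trailing semicolon suppresses REPL display, so the failing show method is never called.
textenc, bert_model = hgf"nlpaueb/bert-base-greek-uncased-v1";

# Encode a sample sentence with the text encoder.
input = encode(textenc, "Καλημέρα κόσμε")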

However, I cannot reproduce the error. What versions of Julia, Transformers.jl, and AbstractTrees.jl are you using? (You can find them by typing ]st -m in the REPL.)

atantos commented 1 year ago

Thanks for the quick response!

Julia: 1.8.0, Transformers.jl: 0.1.25, AbstractTrees.jl: 0.4.3

I just started experimenting with the Hugging Face hub through Transformers.jl, and the error message gave the impression that the model was not there. As a suggestion, a more informative error message might help other new users of the package.

Thanks, again.

chengchingwen commented 1 year ago

OK, it looks like AbstractTrees 0.4 is too new for Transformers. I'll need to update the display functions.

If you want to bypass the issue quickly, you could downgrade AbstractTrees to 0.3 (or just ignore the error, since it is only a display bug).
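
For instance, a rough sketch of the downgrade using the package manager (any release in the 0.3 series should work):

using Pkg
Pkg.add(name = "AbstractTrees", version = "0.3")  # resolve back to the latest 0.3.x release

# or, equivalently, in the Pkg REPL (press ]):
# pkg> add AbstractTrees@0.3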

chengchingwen commented 1 year ago

This should be fixed on the master branch.
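
If you want to try it before a new release is tagged, here is a quick sketch of installing the development version (this tracks the repository's master branch):

using Pkg
Pkg.add(PackageSpec(name = "Transformers", rev = "master"))  # unreleased development version

# or in the Pkg REPL:
# pkg> add Transformers#master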

atantos commented 1 year ago

Great!