chengchingwen / Transformers.jl

Julia Implementation of Transformer models
MIT License

Problem precompiling v0.1.14 on Julia 1.7.2 #90

Closed: robertfeldt closed this issue 2 years ago

robertfeldt commented 2 years ago

I can add the package without problems:

(@v1.7) pkg> add Transformers
    Updating registry at `~/.julia/registries/General.toml`
   Resolving package versions...
    Updating `~/.julia/environments/v1.7/Project.toml`
  [21ca0261] + Transformers v0.1.14
    Updating `~/.julia/environments/v1.7/Manifest.toml`
  [a4280ba5] + BytePairEncoding v0.2.0
  [bb354801] + Fetch v0.1.3
  [fbb45041] + Pickle v0.2.10
  [13d12f88] + PrimitiveOneHot v0.1.1
  [5e0ebb24] + Strided v1.2.2
  [21ca0261] + Transformers v0.1.14
  [9d95972d] + TupleTools v1.3.0

but then there is an ArgumentError during precompilation:

julia> using Transformers
[ Info: Precompiling Transformers [21ca0261-441d-5938-ace7-c90938fde4d4]
WARNING: Method definition (::Type{Strided.StridedView{T, N, A, F} where F<:Union{typeof(Base.adjoint), typeof(Base.conj), typeof(Base.identity), typeof(Base.transpose)} where A<:(DenseArray{T, N} where N where T) where N where T})(Base.PermutedDimsArrays.PermutedDimsArray{T, N, perm, iperm, AA} where AA<:(AbstractArray{T, N} where N where T) where iperm) where {T, N, perm} in module Strided at /Users/feldt/.julia/packages/Strided/Af7gm/src/stridedview.jl:35 overwritten in module Torch at /Users/feldt/.julia/packages/Pickle/Ro6BR/src/torch/torch_save.jl:37.
  ** incremental compilation may be fatally broken for this module **

ERROR: LoadError: ArgumentError: Unsupported keyword argument 'config'
Stacktrace:
  [1] var"@cuda"(__source__::LineNumberNode, __module__::Module, ex::Vararg{Any})
    @ CUDA ~/.julia/packages/CUDA/fAEDi/src/compiler/execution.jl:47
  [2] include(mod::Module, _path::String)
    @ Base ./Base.jl:418
  [3] include(x::String)
    @ Transformers.HuggingFace ~/.julia/packages/Transformers/V363g/src/huggingface/HuggingFace.jl:1
...

This is on a MacBook, so there is no CUDA. Maybe it doesn't make sense to use Transformers on this machine? Any input or advice is welcome.

julia> versioninfo()
Julia Version 1.7.2
Commit bf53498635 (2022-02-06 15:21 UTC)
Platform Info:
  OS: macOS (arm64-apple-darwin21.2.0)
  CPU: Apple M1 Max
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-12.0.1 (ORCJIT, cyclone)
Environment:
  JULIA_NUM_THREADS = 8
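
For reference, here is a sketch of how one could check the resolved Transformers version and whether CUDA sees a usable device on a machine like this (output shown for a machine without an NVIDIA GPU; CUDA.jl itself loads without one):

(@v1.7) pkg> status Transformers
      Status `~/.julia/environments/v1.7/Project.toml`
  [21ca0261] Transformers v0.1.14

julia> using CUDA            # CUDA.jl loads even without a GPU

julia> CUDA.functional()     # reports whether a usable CUDA device was found
false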
chengchingwen commented 2 years ago

You would need Transformers.jl version 0.1.15 at least. The CUDA package doesn't require CUDA devices. The error comes from an API change in their macro, and it is fixed in the 0.1.15 release.
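
For anyone hitting this, a minimal sketch of the upgrade in the Pkg REPL (the @version syntax assumes 0.1.15 is available in the registry; a plain `up Transformers` also works):

(@v1.7) pkg> add Transformers@0.1.15
   Resolving package versions...

julia> using Transformers    # should now precompile without the @cuda 'config' error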

robertfeldt commented 2 years ago

Thanks, things seem to work well after I installed Transformers#master instead. It's a great package.
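
For completeness, a sketch of the workaround used here, plus how one could switch back to registered releases once a new version is tagged:

(@v1.7) pkg> add Transformers#master    # track the development branch

(@v1.7) pkg> free Transformers          # later: return to registry-tracked releases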