chengchingwen / Transformers.jl

Julia Implementation of Transformer models
MIT License
526 stars · 75 forks

Can't compile package with Flux v0.14.23 #201

Open andreyz4k opened 1 week ago

andreyz4k commented 1 week ago

Flux recently changed how it handles GPU backends, so the GPU_BACKEND variable no longer exists. Transformers.jl relies on it and therefore crashes during precompilation.
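One defensive pattern for this kind of upstream removal is to probe for the binding instead of assuming it exists. This is a hypothetical sketch, not code from either package; `FakeFlux` stands in for Flux so the snippet is self-contained, and the `"CUDA"` fallback value is my assumption:

```julia
# `isdefined(mod, :name)` checks whether a module has a binding without
# triggering an UndefVarError, so precompilation survives on both old
# and new Flux versions.
module FakeFlux end  # stand-in for a Flux build without GPU_BACKEND

backend = isdefined(FakeFlux, :GPU_BACKEND) ? FakeFlux.GPU_BACKEND : "CUDA"
println(backend)  # prints CUDA
```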

PkgPrecompileError: The following 2 direct dependencies failed to precompile:

solver

Failed to precompile solver [5b87f58b-a6cb-43fc-b459-8e7b9fbb6b24] to "/Users/andreyz4k/.julia/compiled/v1.11/solver/jl_vAIjpw".
WARNING: could not import Flux.GPU_BACKEND into Transformers
ERROR: LoadError: UndefVarError: `GPU_BACKEND` not defined in `Transformers`
Stacktrace:
  [1] top-level scope
    @ none:1
  [2] eval(m::Module, e::Any)
    @ Core ./boot.jl:430
  [3] var"@static"(__source__::LineNumberNode, __module__::Module, ex::Any)
    @ Base ./osutils.jl:19
  [4] #macroexpand#74
    @ ./expr.jl:125 [inlined]
  [5] macroexpand
    @ ./expr.jl:123 [inlined]
  [6] docm(source::LineNumberNode, mod::Module, meta::Any, ex::Any, define::Bool)
    @ Base.Docs ./docs/Docs.jl:581
  [7] (::DocStringExtensions.var"#35#36"{typeof(DocStringExtensions.template_hook)})(::LineNumberNode, ::Vararg{Any})
    @ DocStringExtensions ~/.julia/packages/DocStringExtensions/JVu77/src/templates.jl:11
  [8] var"@doc"(::LineNumberNode, ::Module, ::String, ::Vararg{Any})
    @ Core ./boot.jl:646
  [9] include(mod::Module, _path::String)
    @ Base ./Base.jl:557
 [10] include(x::String)
    @ Transformers ~/.julia/packages/Transformers/qH1VW/src/Transformers.jl:1
 [11] top-level scope
    @ ~/.julia/packages/Transformers/qH1VW/src/Transformers.jl:15
 [12] include
    @ ./Base.jl:557 [inlined]
 [13] include_package_for_output(pkg::Base.PkgId, input::String, depot_path::Vector{String}, dl_load_path::Vector{String}, load_path::Vector{String}, concrete_deps::Vector{Pair{Base.PkgId, UInt128}}, source::String)
    @ Base ./loading.jl:2806
 [14] top-level scope
    @ stdin:4
in expression starting at /Users/andreyz4k/.julia/packages/Transformers/qH1VW/src/device.jl:7
in expression starting at /Users/andreyz4k/.julia/packages/Transformers/qH1VW/src/device.jl:7
in expression starting at /Users/andreyz4k/.julia/packages/Transformers/qH1VW/src/Transformers.jl:1
in expression starting at stdin:4
ERROR: LoadError: Failed to precompile Transformers [21ca0261-441d-5938-ace7-c90938fde4d4] to "/Users/andreyz4k/.julia/compiled/v1.11/Transformers/jl_CFD7w7".
Stacktrace:
  [1] error(s::String)
    @ Base ./error.jl:35
  [2] compilecache(pkg::Base.PkgId, path::String, internal_stderr::IO, internal_stdout::IO, keep_loaded_modules::Bool; flags::Cmd, cacheflags::Base.CacheFlags, reasons::Dict{String, Int64})
    @ Base ./loading.jl:3089
  [3] (::Base.var"#1081#1082"{Base.PkgId})()
    @ Base ./loading.jl:2477
  [4] mkpidlock(f::Base.var"#1081#1082"{Base.PkgId}, at::String, pid::Int32; kwopts::@Kwargs{stale_age::Int64, wait::Bool})
    @ FileWatching.Pidfile ~/.julia/juliaup/julia-1.11.0+0.aarch64.apple.darwin14/share/julia/stdlib/v1.11/FileWatching/src/pidfile.jl:95
  [5] #mkpidlock#6
    @ ~/.julia/juliaup/julia-1.11.0+0.aarch64.apple.darwin14/share/julia/stdlib/v1.11/FileWatching/src/pidfile.jl:90 [inlined]
  [6] trymkpidlock(::Function, ::Vararg{Any}; kwargs::@Kwargs{stale_age::Int64})
    @ FileWatching.Pidfile ~/.julia/juliaup/julia-1.11.0+0.aarch64.apple.darwin14/share/julia/stdlib/v1.11/FileWatching/src/pidfile.jl:116
  [7] #invokelatest#2
    @ ./essentials.jl:1056 [inlined]
  [8] invokelatest
    @ ./essentials.jl:1051 [inlined]
  [9] maybe_cachefile_lock(f::Base.var"#1081#1082"{Base.PkgId}, pkg::Base.PkgId, srcpath::String; stale_age::Int64)
    @ Base ./loading.jl:3613
 [10] maybe_cachefile_lock
    @ ./loading.jl:3610 [inlined]
 [11] _require(pkg::Base.PkgId, env::String)
    @ Base ./loading.jl:2473
 [12] __require_prelocked(uuidkey::Base.PkgId, env::String)
    @ Base ./loading.jl:2300
 [13] #invoke_in_world#3
    @ ./essentials.jl:1088 [inlined]
 [14] invoke_in_world
    @ ./essentials.jl:1085 [inlined]
 [15] _require_prelocked(uuidkey::Base.PkgId, env::String)
    @ Base ./loading.jl:2287
 [16] macro expansion
    @ ./loading.jl:2226 [inlined]
 [17] macro expansion
    @ ./lock.jl:273 [inlined]
 [18] __require(into::Module, mod::Symbol)
    @ Base ./loading.jl:2183
 [19] #invoke_in_world#3
    @ ./essentials.jl:1088 [inlined]
 [20] invoke_in_world
    @ ./essentials.jl:1085 [inlined]
 [21] require(into::Module, mod::Symbol)
    @ Base ./loading.jl:2176
 [22] include(mod::Module, _path::String)
    @ Base ./Base.jl:557
 [23] include(x::String)
    @ solver ~/expcoder/src/solver.jl:1
 [24] top-level scope
    @ ~/expcoder/src/guiding_models/guiding_models.jl:5
 [25] include(mod::Module, _path::String)
    @ Base ./Base.jl:557
 [26] include(x::String)
    @ solver ~/expcoder/src/solver.jl:1
 [27] top-level scope
    @ ~/expcoder/src/solver.jl:12
 [28] include
    @ ./Base.jl:557 [inlined]
 [29] include_package_for_output(pkg::Base.PkgId, input::String, depot_path::Vector{String}, dl_load_path::Vector{String}, load_path::Vector{String}, concrete_deps::Vector{Pair{Base.PkgId, UInt128}}, source::Nothing)
    @ Base ./loading.jl:2806
 [30] top-level scope
    @ stdin:4
in expression starting at /Users/andreyz4k/expcoder/src/guiding_models/nn_model.jl:5
in expression starting at /Users/andreyz4k/expcoder/src/guiding_models/guiding_models.jl:5
in expression starting at /Users/andreyz4k/expcoder/src/solver.jl:1
in expression starting at stdin:4
TransformersCUDAExt

Failed to precompile TransformersCUDAExt [c9e57b6e-d29f-5f36-92db-f84550281eca] to "/Users/andreyz4k/.julia/compiled/v1.11/TransformersCUDAExt/jl_GxBYj7".
WARNING: could not import Flux.GPU_BACKEND into Transformers
ERROR: LoadError: UndefVarError: `GPU_BACKEND` not defined in `Transformers`
Stacktrace:
  [1] top-level scope
    @ none:1
  [2] eval(m::Module, e::Any)
    @ Core ./boot.jl:430
  [3] var"@static"(__source__::LineNumberNode, __module__::Module, ex::Any)
    @ Base ./osutils.jl:19
  [4] #macroexpand#74
    @ ./expr.jl:125 [inlined]
  [5] macroexpand
    @ ./expr.jl:123 [inlined]
  [6] docm(source::LineNumberNode, mod::Module, meta::Any, ex::Any, define::Bool)
    @ Base.Docs ./docs/Docs.jl:581
  [7] (::DocStringExtensions.var"#35#36"{typeof(DocStringExtensions.template_hook)})(::LineNumberNode, ::Vararg{Any})
    @ DocStringExtensions ~/.julia/packages/DocStringExtensions/JVu77/src/templates.jl:11
  [8] var"@doc"(::LineNumberNode, ::Module, ::String, ::Vararg{Any})
    @ Core ./boot.jl:646
  [9] include(mod::Module, _path::String)
    @ Base ./Base.jl:557
 [10] include(x::String)
    @ Transformers ~/.julia/packages/Transformers/qH1VW/src/Transformers.jl:1
 [11] top-level scope
    @ ~/.julia/packages/Transformers/qH1VW/src/Transformers.jl:15
 [12] include
    @ ./Base.jl:557 [inlined]
 [13] include_package_for_output(pkg::Base.PkgId, input::String, depot_path::Vector{String}, dl_load_path::Vector{String}, load_path::Vector{String}, concrete_deps::Vector{Pair{Base.PkgId, UInt128}}, source::String)
    @ Base ./loading.jl:2806
 [14] top-level scope
    @ stdin:4
in expression starting at /Users/andreyz4k/.julia/packages/Transformers/qH1VW/src/device.jl:7
in expression starting at /Users/andreyz4k/.julia/packages/Transformers/qH1VW/src/device.jl:7
in expression starting at /Users/andreyz4k/.julia/packages/Transformers/qH1VW/src/Transformers.jl:1
in expression starting at stdin:4
ERROR: LoadError: Failed to precompile Transformers [21ca0261-441d-5938-ace7-c90938fde4d4] to "/Users/andreyz4k/.julia/compiled/v1.11/Transformers/jl_AcpnRX".
Stacktrace:
  [1] error(s::String)
    @ Base ./error.jl:35
  [2] compilecache(pkg::Base.PkgId, path::String, internal_stderr::IO, internal_stdout::IO, keep_loaded_modules::Bool; flags::Cmd, cacheflags::Base.CacheFlags, reasons::Dict{String, Int64})
    @ Base ./loading.jl:3089
  [3] (::Base.var"#1081#1082"{Base.PkgId})()
    @ Base ./loading.jl:2477
  [4] mkpidlock(f::Base.var"#1081#1082"{Base.PkgId}, at::String, pid::Int32; kwopts::@Kwargs{stale_age::Int64, wait::Bool})
    @ FileWatching.Pidfile ~/.julia/juliaup/julia-1.11.0+0.aarch64.apple.darwin14/share/julia/stdlib/v1.11/FileWatching/src/pidfile.jl:95
  [5] #mkpidlock#6
    @ ~/.julia/juliaup/julia-1.11.0+0.aarch64.apple.darwin14/share/julia/stdlib/v1.11/FileWatching/src/pidfile.jl:90 [inlined]
  [6] trymkpidlock(::Function, ::Vararg{Any}; kwargs::@Kwargs{stale_age::Int64})
    @ FileWatching.Pidfile ~/.julia/juliaup/julia-1.11.0+0.aarch64.apple.darwin14/share/julia/stdlib/v1.11/FileWatching/src/pidfile.jl:116
  [7] #invokelatest#2
    @ ./essentials.jl:1056 [inlined]
  [8] invokelatest
    @ ./essentials.jl:1051 [inlined]
  [9] maybe_cachefile_lock(f::Base.var"#1081#1082"{Base.PkgId}, pkg::Base.PkgId, srcpath::String; stale_age::Int64)
    @ Base ./loading.jl:3613
 [10] maybe_cachefile_lock
    @ ./loading.jl:3610 [inlined]
 [11] _require(pkg::Base.PkgId, env::String)
    @ Base ./loading.jl:2473
 [12] __require_prelocked(uuidkey::Base.PkgId, env::String)
    @ Base ./loading.jl:2300
 [13] #invoke_in_world#3
    @ ./essentials.jl:1088 [inlined]
 [14] invoke_in_world
    @ ./essentials.jl:1085 [inlined]
 [15] _require_prelocked(uuidkey::Base.PkgId, env::String)
    @ Base ./loading.jl:2287
 [16] macro expansion
    @ ./loading.jl:2226 [inlined]
 [17] macro expansion
    @ ./lock.jl:273 [inlined]
 [18] __require(into::Module, mod::Symbol)
    @ Base ./loading.jl:2183
 [19] #invoke_in_world#3
    @ ./essentials.jl:1088 [inlined]
 [20] invoke_in_world
    @ ./essentials.jl:1085 [inlined]
 [21] require(into::Module, mod::Symbol)
    @ Base ./loading.jl:2176
 [22] include
    @ ./Base.jl:557 [inlined]
 [23] include_package_for_output(pkg::Base.PkgId, input::String, depot_path::Vector{String}, dl_load_path::Vector{String}, load_path::Vector{String}, concrete_deps::Vector{Pair{Base.PkgId, UInt128}}, source::Nothing)
    @ Base ./loading.jl:2806
 [24] top-level scope
    @ stdin:4
in expression starting at /Users/andreyz4k/.julia/packages/Transformers/qH1VW/ext/TransformersCUDAExt/TransformersCUDAExt.jl:1
in expression starting at stdin:4
CarloLucibello commented 1 week ago

We will soon release a patch in Flux addressing the issue: https://github.com/FluxML/Flux.jl/pull/2511

andreyz4k commented 1 week ago

@CarloLucibello the issue persists with 0.14.24 as well, because FluxCPUAdaptor and other adaptors were also removed.
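Until Transformers.jl is fixed, an affected project could pin Flux below the breaking release in its Project.toml compat section. A sketch, with the version range inferred from the versions mentioned in this thread:

```toml
[compat]
Flux = "0.13, 0.14.0 - 0.14.22"
```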

CarloLucibello commented 1 week ago

FluxCPUAdaptor was an internal type, so Transformers.jl shouldn't have relied on it. The proper solution here is for @chengchingwen or anyone else to fix Transformers.jl.

That said, we could easily provide a deprecation path in Flux by defining FluxCPUAdaptor = CPUDevice until that is done.
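The suggested deprecation path could look roughly like the following. This is a hypothetical sketch: a local struct stands in for CPUDevice so the snippet runs standalone, whereas in Flux the real type comes from its device machinery:

```julia
# Back-compat shim: keep the removed name alive as an alias of its
# replacement, so downstream code referencing FluxCPUAdaptor still loads.
struct CPUDevice end  # stand-in for the actual replacement type

const FluxCPUAdaptor = CPUDevice  # old name now points at the new type

println(FluxCPUAdaptor === CPUDevice)  # prints true
```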

chengchingwen commented 1 week ago

@CarloLucibello Personally, I think the device functionality changes should be considered breaking. See also https://github.com/FluxML/Flux.jl/issues/2513

CarloLucibello commented 1 week ago

https://github.com/FluxML/Flux.jl/issues/2513 is indeed a bug, but I don't think the adaptors were ever publicly exposed.

chengchingwen commented 1 week ago

For the FluxAdaptor types, yes. But the device functionality changes in Flux affect many aspects, and MLDataDevice.jl is a different implementation that might introduce inconsistencies. That's just my two cents, though.