andreyz4k opened 1 week ago
We will soon release a patch in Flux addressing the issue: https://github.com/FluxML/Flux.jl/pull/2511
@CarloLucibello the issue persists with 0.14.24 as well, because `FluxCPUAdaptor` and others were also deleted.
`FluxCPUAdaptor` was an internal type, so Transformers.jl shouldn't have relied on it. The proper solution here is for @chengchingwen or anyone else to fix Transformers.jl.
That said, we could easily provide a deprecation path in Flux by defining `FluxCPUAdaptor = CPUDevice` until that is done.
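A minimal sketch of what such a deprecation path could look like inside Flux, assuming the replacement type is `CPUDevice` from MLDataDevices.jl (the exact form of the eventual patch is not specified here):

```julia
# Hypothetical deprecation shim (sketch, not the actual Flux patch):
# alias the removed internal adaptor name to the new device type so
# downstream packages keep working, with a deprecation warning on use.
using MLDataDevices: CPUDevice

# Re-binds the old name and warns callers to migrate to CPUDevice.
Base.@deprecate_binding FluxCPUAdaptor CPUDevice
```

With this in place, code such as `Adapt.adapt(FluxCPUAdaptor(), x)` in Transformers.jl would keep resolving (via `CPUDevice`) while emitting a deprecation warning, buying time for a proper fix downstream.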
@CarloLucibello Personally, I think the device functionality changes should be considered breaking. See also https://github.com/FluxML/Flux.jl/issues/2513
https://github.com/FluxML/Flux.jl/issues/2513 is indeed a bug, but the adaptors were never publicly exposed, I think.
For the FluxAdaptor types, yes. But the device functionality as a whole touches many aspects of Flux, not to mention that MLDataDevices.jl is a different implementation that might introduce inconsistencies. That's just my two cents, though.
Flux recently changed the way it handles GPU backends, so there is no `GPU_BACKEND` variable anymore. Transformers.jl relies on it, so it crashes during precompilation.
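To illustrate the breakage, here is a hedged sketch of the old pattern versus a runtime alternative. It assumes Flux now re-exports `gpu_device` from MLDataDevices.jl; the exact migration path for Transformers.jl is not decided in this thread:

```julia
using Flux

# Old (removed): packages branched on a backend string at load time, e.g.
#     if Flux.GPU_BACKEND == "CUDA"
#         ...
#     end
# This now throws an UndefVarError during precompilation.

# Possible replacement: query the device at runtime instead.
device = Flux.gpu_device()          # falls back to a CPU device if no GPU is found
model = device(Flux.Dense(2 => 2))  # move a model to the selected device
```

Because `gpu_device()` resolves the backend at runtime rather than via a compile-time constant, a dependent package no longer needs to read a global like `GPU_BACKEND` during its own precompilation.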