I think I messed with my installation of AlphaZero.jl, and I can't recall exactly how this happened. When launching a training, I get the following error:

UndefVarError: #flatten not defined

If I Google UndefVarError: #flatten not defined, I find in the release notes of the Flux.jl package that its recent version 0.13.14 fixed this issue: https://github.com/FluxML/Flux.jl/releases.
However, this version seems to be incompatible with AlphaZero.jl. Indeed, if I modify the Project.toml file of my customized version of AlphaZero.jl in order to update the Flux package to version 0.13.14, I get the following error:
ERROR: Unsatisfiable requirements detected for package cuDNN [02a925ec]:
 cuDNN [02a925ec] log:
 ├─possible versions are: 1.0.0-1.0.2 or uninstalled
 └─restricted by compatibility requirements with CUDA [052768ef] to versions: uninstalled — no versions left
   └─CUDA [052768ef] log:
     ├─possible versions are: 0.1.0-4.2.0 or uninstalled
     └─restricted to versions 3 by AlphaZero [7a1cc850], leaving only versions 3.0.0-3.13.1
       └─AlphaZero [7a1cc850] log:
         ├─possible versions are: 0.5.4 or uninstalled
         └─AlphaZero [7a1cc850] is fixed to version 0.5.4
Maybe this is because that version of Flux requires cuDNN, and hence a CUDA version that is incompatible with AlphaZero's restriction of CUDA to version 3.x?
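For reference, the change I made by hand in Project.toml was just to raise the Flux compat bound so that 0.13.14 is allowed. Doing the equivalent from the Julia REPL would look roughly like the sketch below (it assumes Julia 1.8 or newer for Pkg.compat, and the path is only a placeholder):

using Pkg

# Activate the local, customized AlphaZero.jl checkout
# ("path/to/AlphaZero.jl" is just a placeholder).
Pkg.activate("path/to/AlphaZero.jl")

# Loosen the [compat] entry for Flux in Project.toml, then re-resolve;
# re-resolving is what triggers the error shown above.
Pkg.compat("Flux", "0.13.14")
Pkg.update("Flux")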
I don't know how to move forward.