jackn11 closed this issue 2 years ago.
@chengchingwen The issue may be caused by this commit: https://github.com/chengchingwen/Transformers.jl/commit/b1b1f9f54cbaa9a3e38318962ecaf0a2439f7db1#diff-09a5eabfb1fc24cdcb25cfa836f64c5387fb27458249ae1a2e48b9ace4a66ab4
Or perhaps that commit is the fix, but it has not yet been merged into main?
Note: I tested that commit and it allowed me to run `using Transformers`, but it then broke while training the BERT model.
I left a comment on your commit at the link below describing the fix that should be implemented. The changes need to be made on lines 43 and 46 of extend3d.jl. Please see https://github.com/chengchingwen/Transformers.jl/commit/b1b1f9f54cbaa9a3e38318962ecaf0a2439f7db1
You can also downgrade Flux to v0.13.3; the error was introduced by Flux v0.13.4.
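If you need that workaround, a minimal sketch of pinning Flux from the Julia REPL using the standard Pkg API (adjust the active environment as needed):

```julia
using Pkg
# Force the last known-good Flux release and keep the resolver from upgrading it.
Pkg.add(name = "Flux", version = "0.13.3")
Pkg.pin("Flux")
```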
Fixed in 0.1.18
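For anyone landing here later, a minimal sketch of picking up the fix, assuming Transformers.jl is already in the active environment and nothing pins it to an older release:

```julia
using Pkg
Pkg.update("Transformers")   # should resolve to Transformers v0.1.18 or later
Pkg.status("Transformers")   # confirm the resolved version
```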
I have tried several times deleting the Transformers package in /packages, making a new virtual environment (`activate newenv`), and adding CUDA and Transformers, but every time I run `using Transformers` I get the output shown below.
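For reference, a minimal sketch of those steps written with the Pkg API (`newenv` is only an example environment name); the failing call is the final `using Transformers`:

```julia
using Pkg
Pkg.activate("newenv")              # create/switch to a fresh environment
Pkg.add(["CUDA", "Transformers"])   # add the two packages mentioned above
using Transformers                  # this is the call that produces the error output
```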