Closed matutenun closed 4 years ago
Seems like you just opened a few issues, so let's pin this down. I copy-pasted and ran the code just fine here, so it's most likely an installation issue on your end. Are you on DiffEqFlux v1.10.1? Are you on Julia v1.4?
Yes, thanks! I was using a previous version of DiffEqFlux. I updated everything and the example now runs. A minor remaining issue: after it finishes I get the message/error below, "scalar getindex is disallowed":
```
... loss: 0.223: 100%|█████████████████████████████████████████| Time: 0:04:52
0.22293958f0

scalar getindex is disallowed
Stacktrace:
 [1] error(::String) at .\error.jl:33
 [2] assertscalar(::String) at C:\Users\matias\.julia\packages\GPUArrays\OXvxB\src\host\indexing.jl:41
 [3] getindex(::CuArray{Float32,1,Nothing}, ::Int64) at C:\Users\matias\.julia\packages\GPUArrays\OXvxB\src\host\indexing.jl:96
 [4] iterate at .\abstractarray.jl:913 [inlined]
 [5] iterate at .\abstractarray.jl:911 [inlined]
 [6] iterate at .\iterators.jl:641 [inlined]
 [7] iterate at .\iterators.jl:639 [inlined]
 [8] iterate at .\generator.jl:44 [inlined]
 [9] collect(::Base.Generator{Base.Iterators.Take{CuArray{Float32,1,Nothing}},Optim.var"#2#4"}) at .\array.jl:665
 [10] show(::IOContext{Base.GenericIOBuffer{Array{UInt8,1}}}, ::Optim.MultivariateOptimizationResults{ADAM,Float64,CuArray{Float32,1,Nothing},Float64,Float32,Nothing,Bool}) at C:\Users\matias\.julia\packages\Optim\UkDyx\src\types.jl:242
 [11] show at .\multimedia.jl:47 [inlined]
 [12] limitstringmime(::MIME{Symbol("text/plain")}, ::Optim.MultivariateOptimizationResults{ADAM,Float64,CuArray{Float32,1,Nothing},Float64,Float32,Nothing,Bool}) at C:\Users\matias\.julia\packages\IJulia\DrVMH\src\inline.jl:43
 [13] display_mimestring(::MIME{Symbol("text/plain")}, ::Optim.MultivariateOptimizationResults{ADAM,Float64,CuArray{Float32,1,Nothing},Float64,Float32,Nothing,Bool}) at C:\Users\matias\.julia\packages\IJulia\DrVMH\src\display.jl:67
 [14] display_dict(::Optim.MultivariateOptimizationResults{ADAM,Float64,CuArray{Float32,1,Nothing},Float64,Float32,Nothing,Bool}) at C:\Users\matias\.julia\packages\IJulia\DrVMH\src\display.jl:96
 [15] #invokelatest#1 at .\essentials.jl:712 [inlined]
 [16] invokelatest at .\essentials.jl:711 [inlined]
 [17] execute_request(::ZMQ.Socket, ::IJulia.Msg) at C:\Users\matias\.julia\packages\IJulia\DrVMH\src\execute_request.jl:112
 [18] #invokelatest#1 at .\essentials.jl:712 [inlined]
 [19] invokelatest at .\essentials.jl:711 [inlined]
 [20] eventloop(::ZMQ.Socket) at C:\Users\matias\.julia\packages\IJulia\DrVMH\src\eventloop.jl:8
 [21] (::IJulia.var"#15#18")() at .\task.jl:358
```
This is the same error as before, but the optimization does finish converging, and I can use the optimized parameters via `result_neuralode.minimizer`.
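For reference, a minimal sketch of the pattern described above, assuming the tutorial's `loss_neuralode` and `result_neuralode` names: since the error only fires when the result object is *printed*, suppressing display avoids it entirely.

```julia
# Sketch, not verbatim from the example: a trailing semicolon stops the REPL /
# notebook from calling `show` on the result, which is what triggers the
# "scalar getindex is disallowed" error.
result_neuralode = DiffEqFlux.sciml_train(loss_neuralode, p, ADAM(0.05),
                                          maxiters = 300);

p_trained  = result_neuralode.minimizer   # still a CuArray; no scalar indexing
final_loss = result_neuralode.minimum
```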
Is that in a new REPL session? Can you share your package status (i.e. `]st`)? I'm curious which versions of the GPU packages you're on, since I know it works on the latest ones (just tested).
It was from JupyterLab. Typing `]st`:
```
Status `C:\Users\matias\.julia\environments\v1.4\Project.toml`
  [fbb218c0] BSON v0.2.6
  [6e4b80f9] BenchmarkTools v0.5.0
  [336ed68f] CSV v0.6.2
  [3895d2a7] CUDAapi v4.0.0
  [c5f51814] CUDAdrv v6.3.0
  [be33ccc6] CUDAnative v3.1.0
  [5ae59095] Colors v0.12.0
  [3a865a2d] CuArrays v2.2.0
  [2445eb08] DataDrivenDiffEq v0.3.1
  [a93c6f00] DataFrames v0.21.0
  [aae7a2af] DiffEqFlux v1.10.1
  [41bf760c] DiffEqSensitivity v6.14.2
  [0c46a032] DifferentialEquations v6.14.0
  [b4f34e82] Distances v0.8.2
  [31c24e10] Distributions v0.23.2
  [ced4e74d] DistributionsAD v0.5.2
  [5789e2e9] FileIO v1.3.0
  [587475ba] Flux v0.10.4
  [f6369f11] ForwardDiff v0.10.10
  [0c68f7d7] GPUArrays v3.3.0
  [7073ff75] IJulia v1.21.2
  [6218d12a] ImageMagick v1.1.5
  [4e3cecfd] ImageShow v0.2.3
  [916415d5] Images v0.22.2
  [c601a237] Interact v0.10.3
  [18b7da76] JuliaAcademyData v0.1.0 #master (https://github.com/JuliaComputing/JuliaAcademyData.jl)
  [e5e0dc1b] Juno v0.8.1
  [961ee093] ModelingToolkit v3.3.0
  [429524aa] Optim v0.20.6
  [1dea7af3] OrdinaryDiffEq v5.38.1
  [65888b18] ParameterizedFunctions v5.3.0
  [58dd65bb] Plotly v0.3.0
  [91a5bcdd] Plots v1.2.5
  [d330b81b] PyPlot v2.9.0
  [1fd47b50] QuadGK v2.3.1
  [90137ffa] StaticArrays v0.12.3
  [6fc51010] Surrogates v1.1.2
  [6aa5eb33] TaylorSeries v0.10.3
  [0f1e0344] WebIO v0.8.14
  [e88e6eb3] Zygote v0.4.20
```
Could you completely restart and try again? I'm curious if Jupyter Lab is doing some kind of caching that blocks the update until you're in a completely new session.
From the REPL I get something similar after the convergence finishes:
```
Status: failure (reached maximum number of iterations)

Candidate solution
Error showing value of type Optim.MultivariateOptimizationResults{ADAM,Float64,CuArray{Float32,1,Nothing},Float64,Float32,Nothing,Bool}:
ERROR: scalar getindex is disallowed
Stacktrace:
 [1] error(::String) at .\error.jl:33
 [2] assertscalar(::String) at C:\Users\matias\.julia\packages\GPUArrays\OXvxB\src\host\indexing.jl:41
 [3] getindex(::CuArray{Float32,1,Nothing}, ::Int64) at C:\Users\matias\.julia\packages\GPUArrays\OXvxB\src\host\indexing.jl:96
 [4] iterate at .\abstractarray.jl:913 [inlined]
 [5] iterate at .\abstractarray.jl:911 [inlined]
 [6] iterate at .\iterators.jl:641 [inlined]
 [7] iterate at .\iterators.jl:639 [inlined]
 [8] iterate at .\generator.jl:44 [inlined]
 [9] collect(::Base.Generator{Base.Iterators.Take{CuArray{Float32,1,Nothing}},Optim.var"#2#4"}) at .\array.jl:665
 [10] show(::IOContext{REPL.Terminals.TTYTerminal}, ::Optim.MultivariateOptimizationResults{ADAM,Float64,CuArray{Float32,1,Nothing},Float64,Float32,Nothing,Bool}) at C:\Users\matias\.julia\packages\Optim\UkDyx\src\types.jl:242
 [11] show(::IOContext{REPL.Terminals.TTYTerminal}, ::MIME{Symbol("text/plain")}, ::Optim.MultivariateOptimizationResults{ADAM,Float64,CuArray{Float32,1,Nothing},Float64,Float32,Nothing,Bool}) at .\multimedia.jl:47
 [12] display(::REPL.REPLDisplay, ::MIME{Symbol("text/plain")}, ::Any) at D:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.4\REPL\src\REPL.jl:137
 [13] display(::REPL.REPLDisplay, ::Any) at D:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.4\REPL\src\REPL.jl:141
 [14] display(::Any) at .\multimedia.jl:323
 [15] (::Media.var"#15#16"{Optim.MultivariateOptimizationResults{ADAM,Float64,CuArray{Float32,1,Nothing},Float64,Float32,Nothing,Bool}})() at C:\Users\matias\.julia\packages\Media\ItEPc\src\compat.jl:28
 [16] hookless(::Media.var"#15#16"{Optim.MultivariateOptimizationResults{ADAM,Float64,CuArray{Float32,1,Nothing},Float64,Float32,Nothing,Bool}}) at C:\Users\matias\.julia\packages\Media\ItEPc\src\compat.jl:14
 [17] render(::Media.NoDisplay, ::Optim.MultivariateOptimizationResults{ADAM,Float64,CuArray{Float32,1,Nothing},Float64,Float32,Nothing,Bool}) at C:\Users\matias\.julia\packages\Media\ItEPc\src\compat.jl:27
 [18] render(::Optim.MultivariateOptimizationResults{ADAM,Float64,CuArray{Float32,1,Nothing},Float64,Float32,Nothing,Bool}) at C:\Users\matias\.julia\packages\Media\ItEPc\src\system.jl:160
 [19] display(::Media.DisplayHook, ::Optim.MultivariateOptimizationResults{ADAM,Float64,CuArray{Float32,1,Nothing},Float64,Float32,Nothing,Bool}) at C:\Users\matias\.julia\packages\Media\ItEPc\src\compat.jl:9
 [20] display(::Any) at .\multimedia.jl:323
 [21] #invokelatest#1 at .\essentials.jl:712 [inlined]
 [22] invokelatest at .\essentials.jl:711 [inlined]
 [23] print_response(::IO, ::Any, ::Bool, ::Bool, ::Any) at D:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.4\REPL\src\REPL.jl:161
 [24] print_response(::REPL.AbstractREPL, ::Any, ::Bool, ::Bool) at D:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.4\REPL\src\REPL.jl:146
 [25] (::REPL.var"#do_respond#38"{Bool,REPL.var"#48#57"{REPL.LineEditREPL,REPL.REPLHistoryProvider},REPL.LineEditREPL,REPL.LineEdit.Prompt})(::Any, ::Any, ::Any) at D:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.4\REPL\src\REPL.jl:729
 [26] #invokelatest#1 at .\essentials.jl:712 [inlined]
 [27] invokelatest at .\essentials.jl:711 [inlined]
 [28] run_interface(::REPL.Terminals.TextTerminal, ::REPL.LineEdit.ModalInterface, ::REPL.LineEdit.MIState) at D:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.4\REPL\src\LineEdit.jl:2354
 [29] run_frontend(::REPL.LineEditREPL, ::REPL.REPLBackendRef) at D:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.4\REPL\src\REPL.jl:1055
 [30] run_repl(::REPL.AbstractREPL, ::Any) at D:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.4\REPL\src\REPL.jl:206
 [31] (::Base.var"#764#766"{Bool,Bool,Bool,Bool})(::Module) at .\client.jl:383
 [32] #invokelatest#1 at .\essentials.jl:712 [inlined]
 [33] invokelatest at .\essentials.jl:711 [inlined]
 [34] run_main_repl(::Bool, ::Bool, ::Bool, ::Bool, ::Bool) at .\client.jl:367
 [35] exec_options(::Base.JLOptions) at .\client.jl:305
 [36] _start() at .\client.jl:484
```
Okay cool. I expect Optim will have issues showing the result right now (though you can still index into it). We should also make it not count reaching the maximum number of iterations as a failure: for an ML-style optimizer like the ones from Flux, that means it worked.
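As a hedged workaround sketch (my assumption, not from this thread): CuArrays can temporarily re-enable scalar indexing so that `show` succeeds, at the cost of very slow elementwise GPU reads while printing.

```julia
using CuArrays

# Assumption: scalar indexing is only needed for display here, so briefly
# re-enabling it does not affect the performance-critical training code.
CuArrays.allowscalar(true)
display(result_neuralode)      # `show` can now iterate over the CuArray
CuArrays.allowscalar(false)    # restore the safety check afterwards
```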
Indeed, if I use BFGS it does not work. The error:
```
scalar getindex is disallowed
Stacktrace:
 [1] error(::String) at .\error.jl:33
 [2] assertscalar(::String) at C:\Users\matias\.julia\packages\GPUArrays\OXvxB\src\host\indexing.jl:41
 [3] getindex(::CuArray{Float32,1,Nothing}, ::Int64) at C:\Users\matias\.julia\packages\GPUArrays\OXvxB\src\host\indexing.jl:96
 [4] iterate at .\abstractarray.jl:913 [inlined]
 [5] iterate at .\abstractarray.jl:911 [inlined]
 [6] generic_normInf at D:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.4\LinearAlgebra\src\generic.jl:445 [inlined]
 [7] normInf at D:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.4\LinearAlgebra\src\generic.jl:536 [inlined]
 [8] norm(::CuArray{Float32,1,Nothing}, ::Float64) at D:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.4\LinearAlgebra\src\generic.jl:611
 [9] initial_state(::BFGS{LineSearches.InitialStatic{Float64},LineSearches.HagerZhang{Float64,Base.RefValue{Bool}},Nothing,Float64,Flat}, ::Optim.Options{Float64,DiffEqFlux.var"#_cb#46"{var"#15#17",Base.Iterators.Cycle{Tuple{DiffEqFlux.NullData}}}}, ::TwiceDifferentiable{Float32,CuArray{Float32,1,Nothing},Array{Float32,2},CuArray{Float32,1,Nothing}}, ::CuArray{Float32,1,Nothing}) at C:\Users\matias\.julia\packages\Optim\UkDyx\src\multivariate\solvers\first_order\bfgs.jl:74
 [10] optimize(::TwiceDifferentiable{Float32,CuArray{Float32,1,Nothing},Array{Float32,2},CuArray{Float32,1,Nothing}}, ::CuArray{Float32,1,Nothing}, ::BFGS{LineSearches.InitialStatic{Float64},LineSearches.HagerZhang{Float64,Base.RefValue{Bool}},Nothing,Float64,Flat}, ::Optim.Options{Float64,DiffEqFlux.var"#_cb#46"{var"#15#17",Base.Iterators.Cycle{Tuple{DiffEqFlux.NullData}}}}) at C:\Users\matias\.julia\packages\Optim\UkDyx\src\multivariate\optimize\optimize.jl:33
 [11] sciml_train(::Function, ::CuArray{Float32,1,Nothing}, ::BFGS{LineSearches.InitialStatic{Float64},LineSearches.HagerZhang{Float64,Base.RefValue{Bool}},Nothing,Float64,Flat}, ::Base.Iterators.Cycle{Tuple{DiffEqFlux.NullData}}; cb::Function, maxiters::Int64, diffmode::DiffEqFlux.ZygoteDiffMode, kwargs::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}) at C:\Users\matias\.julia\packages\DiffEqFlux\9CzLb\src\train.jl:269
```
Yes that's a known upstream issue: https://github.com/SciML/DiffEqFlux.jl/issues/238
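Until that upstream issue is resolved, one possible workaround (an assumption on my part, not something confirmed in this thread) is to run the BFGS refinement on a CPU copy of the parameters, since Optim's BFGS internals rely on scalar operations like `norm(..., Inf)`:

```julia
# Hypothetical sketch: move ADAM-trained parameters off the GPU and let BFGS
# refine them with a CPU-based loss (`loss_cpu` is assumed to be a CPU
# counterpart of the GPU loss, not something defined in the example).
p_gpu = result_neuralode.minimizer
p_cpu = Array(p_gpu)                     # CuArray -> plain Array copy
result_bfgs = DiffEqFlux.sciml_train(loss_cpu, p_cpu, BFGS())
```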
What I did notice is that if I benchmark the same example on GPU vs. CPU, it is two orders of magnitude faster on the CPU.

With GPU:

```
samples: 1 evals/sample: 1
```

With CPU:

```
samples: 1 evals/sample: 1
```
Yes, matrices of size 10 or 20 are not good for GPUs. That example is a demonstration of the GPU workflow, but not really a tutorial where the GPU shines; you want much larger neural networks for that.
In fact, you can probably get another order of magnitude of speed by hyper-optimizing for the CPU here with some StaticDense stuff.
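A sketch of how such a GPU-vs-CPU comparison could be timed with BenchmarkTools (the `loss_gpu`/`loss_cpu`/`p_gpu`/`p_cpu` names are illustrative, not from the example):

```julia
using BenchmarkTools

# Interpolate arguments with `$` so setup cost isn't part of the measurement;
# keep `maxiters` small so each benchmark sample finishes quickly.
@btime DiffEqFlux.sciml_train($loss_gpu, $p_gpu, ADAM(0.05), maxiters = 100)
@btime DiffEqFlux.sciml_train($loss_cpu, $p_cpu, ADAM(0.05), maxiters = 100)
```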
Cool, thanks for the answers. Awesome set of tools; I will keep playing with them and learning to use them.
For more questions, and just general chatter on the topics, feel free to join the Julia Slack: https://slackinvite.julialang.org/ . We're in the #diffeq-bridged and the #machine-learning channels.
I have been playing with your awesome tools, but I can't make this one work with the GPU. The example is from https://diffeqflux.sciml.ai/dev/GPUs/#Neural-ODE-Example-1
When running it I get: