SciML / ReservoirComputing.jl

Reservoir computing utilities for scientific machine learning (SciML)
https://docs.sciml.ai/ReservoirComputing/stable/
MIT License

RecursiveArrayTools v3 compatibility #188

Closed njk7062 closed 10 months ago

njk7062 commented 10 months ago

I am receiving a MethodError when attempting to run the most basic example provided with ReservoirComputing.jl. The MethodError occurs when invoking ESN:


```julia
julia> esn = ESN(input_data;
           reservoir = RandSparseReservoir(res_size, radius = 1.2, sparsity = 6 / res_size),
           input_layer = WeightedLayer(),
           nla_type = NLAT2())
ERROR: MethodError: no method matching size(::RecursiveArrayTools.DiffEqArray{Float64, 2, Vector{Vector{Float64}}, Vector{Float64}, Vector{Float64}, ODEFunction{true, SciMLBase.AutoSpecialize,
```

Full details, including package versions and terminal I/O, are provided below. I look forward to your feedback and guidance in resolving this issue!

```
nick@localhost-live:~$ julia
(Julia 1.9.4 startup banner omitted)
```

```julia
julia> versioninfo()
Julia Version 1.9.4
Commit 8e5136fa297 (2023-11-14 08:46 UTC)
Build Info:
  Official https://julialang.org/ release
Platform Info:
  OS: Linux (x86_64-linux-gnu)
  CPU: 12 × AMD Ryzen 5 1600 Six-Core Processor
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-14.0.6 (ORCJIT, znver1)
  Threads: 1 on 12 virtual cores
```

```julia
(@v1.9) pkg> status
Status `~/.julia/environments/v1.9/Project.toml`
  [1dea7af3] OrdinaryDiffEq v6.63.0
  [91a5bcdd] Plots v1.39.0
  [731186ca] RecursiveArrayTools v3.1.0
  [7c2d2b1e] ReservoirComputing v0.9.5
```

```julia
julia> using ReservoirComputing, OrdinaryDiffEq
```

```julia
julia> # lorenz system parameters
julia> u0 = [1.0, 0.0, 0.0]
3-element Vector{Float64}:
 1.0
 0.0
 0.0

julia> tspan = (0.0, 200.0)
(0.0, 200.0)

julia> p = [10.0, 28.0, 8 / 3]
3-element Vector{Float64}:
 10.0
 28.0
  2.6666666666666665
```

```julia
julia> # define lorenz system
julia> function lorenz(du, u, p, t)
           du[1] = p[1] * (u[2] - u[1])
           du[2] = u[1] * (p[2] - u[3]) - u[2]
           du[3] = u[1] * u[2] - p[3] * u[3]
       end
lorenz (generic function with 1 method)
```

```julia
julia> # solve and take data
julia> prob = ODEProblem(lorenz, u0, tspan, p)
ODEProblem with uType Vector{Float64} and tType Float64. In-place: true
timespan: (0.0, 200.0)
u0: 3-element Vector{Float64}:
 1.0
 0.0
 0.0
```

```julia
julia> data = solve(prob, ABM54(), dt = 0.02)
retcode: Success
Interpolation: 3rd order Hermite
t: 10001-element Vector{Float64}:
   0.0
   0.02
   0.04
   ⋮
 199.9800000000285
 200.0
u: 10001-element Vector{Vector{Float64}}:
 [1.0, 0.0, 0.0]
 [0.8680136241599999, 0.5116114289171798, 0.00464843405046073]
 [0.846922029853721, 0.9721624694997062, 0.016723754482887952]
 ⋮
 [-4.486956988649644, -0.26529770021033905, 28.320947150047015]
 [-3.722463627032463, -0.2914952957436106, 26.87140831430775]
```

```julia
julia> shift = 300
300

julia> train_len = 5000
5000

julia> predict_len = 1250
1250
```
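As an aside on the slicing that follows (a standalone sketch using only the constants above, no packages): the target range is the input range shifted forward by one step, and the test range begins exactly where the target range ends:

```julia
# One-step-ahead split for generative prediction
shift, train_len, predict_len = 300, 5000, 1250

input_idx  = shift:(shift + train_len - 1)                              # 300:5299
target_idx = (shift + 1):(shift + train_len)                            # 301:5300
test_idx   = (shift + train_len):(shift + train_len + predict_len - 1)  # 5300:6549

@assert target_idx == input_idx .+ 1         # target leads input by one step
@assert first(test_idx) == last(target_idx)  # test picks up at the final target step
```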

```julia
julia> # one step ahead for generative prediction
julia> input_data = data[:, shift:(shift + train_len - 1)]
t: 5000-element Vector{Float64}:
   5.9799999999999605
   5.99999999999996
   ⋮
 105.95999999999606
u: 5000-element Vector{Vector{Float64}}:
 [-9.903289241214775, -9.018088588651105, 29.81517777140388]
 [-9.691839850827593, -8.473011482849701, 29.93605690750639]
 ⋮
 [5.3690197189757916, 1.862732065250123, 28.165385874059083]
```

```julia
julia> target_data = data[:, (shift + 1):(shift + train_len)]
t: 5000-element Vector{Float64}:
   5.99999999999996
   6.01999999999996
   ⋮
 105.97999999999605
u: 5000-element Vector{Vector{Float64}}:
 [-9.691839850827593, -8.473011482849701, 29.93605690750639]
 [-9.420486660590333, -7.939065463718113, 29.90843259211291]
 ⋮
 [4.732497640190652, 1.8723908301054109, 26.88478061861816]
```

```julia
julia> test = data[:, (shift + train_len):(shift + train_len + predict_len - 1)]
t: 1250-element Vector{Float64}:
 105.97999999999605
 105.99999999999605
   ⋮
 130.9599999999932
u: 1250-element Vector{Vector{Float64}}:
 [4.732497640190652, 1.8723908301054109, 26.88478061861816]
 [4.22349538680031, 1.987820915806937, 25.655613923905776]
 ⋮
 [10.003421378477109, 12.695609055900354, 25.43743234339266]
```

```julia
julia> res_size = 300
300
```

```julia
julia> esn = ESN(input_data;
           reservoir = RandSparseReservoir(res_size, radius = 1.2, sparsity = 6 / res_size),
           input_layer = WeightedLayer(),
           nla_type = NLAT2())
ERROR: MethodError: no method matching size(::RecursiveArrayTools.DiffEqArray{Float64, 2, Vector{Vector{Float64}}, Vector{Float64}, Vector{Float64}, ODEFunction{…}}, ::Int64)

Closest candidates are:
  size(::Union{LinearAlgebra.QR, LinearAlgebra.QRCompactWY, LinearAlgebra.QRPivoted}, ::Integer)
   @ LinearAlgebra ~/.julia/juliaup/julia-1.9.4+0.x64.linux.gnu/share/julia/stdlib/v1.9/LinearAlgebra/src/qr.jl:581
  size(::Union{LinearAlgebra.QRCompactWYQ, LinearAlgebra.QRPackedQ}, ::Integer)
   @ LinearAlgebra ~/.julia/juliaup/julia-1.9.4+0.x64.linux.gnu/share/julia/stdlib/v1.9/LinearAlgebra/src/qr.jl:583
  size(::Union{LinearAlgebra.Cholesky, LinearAlgebra.CholeskyPivoted}, ::Integer)
   @ LinearAlgebra ~/.julia/juliaup/julia-1.9.4+0.x64.linux.gnu/share/julia/stdlib/v1.9/LinearAlgebra/src/cholesky.jl:513
  ...

Stacktrace:
 [1] ESN(train_data::RecursiveArrayTools.DiffEqArray{…}; variation::Default, input_layer::WeightedLayer{Float64}, reservoir::RandSparseReservoir{Float64, Float64}, bias::NullLayer, reservoir_driver::RNN{typeof(NNlib.tanh_fast), Float64}, nla_type::NLAT2, states_type::StandardStates, washout::Int64, matrix_type::Type)
   @ ReservoirComputing ~/.julia/packages/ReservoirComputing/ptuHU/src/esn/echostatenetwork.jl:112
 [2] top-level scope
   @ REPL[17]:1
```


ChrisRackauckas commented 10 months ago

Try `]add RecursiveArrayTools@2`
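Equivalently (an alternative sketch, not from this thread), the downgrade can be made persistent through the project's compat bounds, so a later `]up` does not pull v3 back in:

```toml
# Project.toml — restrict RecursiveArrayTools to the v2.x series
[compat]
RecursiveArrayTools = "2"
```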

MartinuzziFrancesco commented 10 months ago

Chris's suggestion is the solution, and it works on my machine (Julia 1.9.4, ReservoirComputing 0.9.5, and OrdinaryDiffEq 6.63, like you). If for any reason you can't add that package, a workaround like `data = reduce(hcat, data.u)`, keeping everything else the same, also works. Let us know if either of these helps you.
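To illustrate why that workaround helps (a minimal sketch with stand-in data, not the thread's actual solution object): the solver's states live in `data.u` as a Vector of Vectors, and `reduce(hcat, ...)` stacks them column-wise into a plain `Matrix`, which supports the two-argument `size` call and 2-D indexing that `ESN` needs:

```julia
# Stand-in for data.u: one state vector per time step
u = [[1.0, 0.0, 0.0],
     [0.87, 0.51, 0.005],
     [0.85, 0.97, 0.017]]

data_mat = reduce(hcat, u)  # 3×3 Matrix{Float64}: state variables × time steps

size(data_mat, 2)  # → 3; this is the call that raised MethodError on DiffEqArray
data_mat[:, 2:3]   # column slicing, as used for input_data/target_data/test
```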

njk7062 commented 10 months ago

Thank you all for your prompt response! Everything is now running perfectly.

ChrisRackauckas commented 10 months ago

@MartinuzziFrancesco let's yank all versions that were launched with v3 compatibility, downgrade, and make a new version. It sounds like we'll need to do a bit more to get v3 running in practice.

Reopening to catch this.

MartinuzziFrancesco commented 10 months ago

Ok, I'll get on it. I'm changing the title as well to highlight the issue better.

This doesn't affect the internals, though; I think it is just an issue with the examples and documentation. I will double-check.

ChrisRackauckas commented 10 months ago

@AayushSabharwal can you look into a solution here? I don't think it's easy to pull back from since this library didn't explicitly use RAT.

AayushSabharwal commented 10 months ago

The fix here is simple enough, but there's another error later when running the example.

Isn't this line redundant? Its output is not assigned anywhere.

AayushSabharwal commented 10 months ago

https://github.com/SciML/RecursiveArrayTools.jl/pull/316

This, along with removing the line mentioned above, fixes this issue.

ChrisRackauckas commented 10 months ago

Master is fixed.