utkarsh530 opened 8 months ago
PSO methods generally need many more than 1000 iterations. Check how you do on that NN case first, but in general I would be surprised if 1000 iterations is good enough.
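To illustrate the point about iteration counts (this is not PSOGPU's implementation, just a dependency-free sketch of a textbook global-best PSO in Python, run on the standard multimodal Rastrigin test function):

```python
import math
import random

def pso(f, dim, n_particles, iters, lb, ub, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal global-best PSO: inertia + cognitive + social velocity update."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lb, ub) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Clamp positions to the box constraints [lb, ub].
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lb), ub)
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

def rastrigin(x):
    # Highly multimodal benchmark; global minimum 0 at the origin.
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

for iters in (100, 1000, 10000):
    _, best = pso(rastrigin, dim=5, n_particles=30, iters=iters, lb=-5.12, ub=5.12)
    print(iters, best)
```

With a fixed seed, the longer runs extend the same trajectory, so the best loss can only stay equal or improve as `iters` grows; on multimodal landscapes like this, the gap between 100 and 10000 iterations is typically large.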
Updated the example; it seems better and faster than ADAM now:
```julia
julia> @time gsol = PSOGPU.parameter_estim_ode!(prob_nn,
           gpu_particles,
           gbest,
           gpu_data,
           lb,
           ub; saveat = tsteps, dt = 0.1f0, prob_func = prob_func, maxiters = 100)
  1.047137 seconds (26.52 k allocations: 1.267 MiB)
PSOGPU.PSOGBest{SVector{12, Float32}, Float32}(Float32[-3.0210462, 15.476199, -11.9110565, -5.3153186, 10.184191, 13.0345955, 6.292011, -4.4231596, 2.4189205, 5.8844023, -2.1717668, -0.31813943], 1.4513348f0)

# loss: 1.4513348f0
```
From the Neural ODE example:
The best solution that I was able to get was:
Compared to ADAM:
There's a lot of room for improvement 😓

![image](https://github.com/SciML/PSOGPU.jl/assets/37050056/3e275bcc-b330-482f-a740-f70478ecc671)
A decent-sized neural network should work (as shown in the docs of the PySwarms Python PSO library); I think we should try to get this working for starters: https://pyswarms.readthedocs.io/en/latest/examples/usecases/train_neural_network.html
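The pyswarms tutorial's setup can be sketched without any dependencies: flatten the weights of a tiny network into one parameter vector and let a textbook global-best PSO minimize the training loss. This is a hypothetical illustration (a hand-rolled 2-2-1 network on XOR), not PSOGPU's or pyswarms' actual API:

```python
import math
import random

# A 2-2-1 network: tanh hidden layer, sigmoid output.
# params layout (9 entries): w1 (2x2) -> 4, b1 -> 2, w2 -> 2, b2 -> 1.
def forward(params, x):
    w1, b1, w2, b2 = params[0:4], params[4:6], params[6:8], params[8]
    h = [math.tanh(w1[2 * j] * x[0] + w1[2 * j + 1] * x[1] + b1[j]) for j in range(2)]
    z = w2[0] * h[0] + w2[1] * h[1] + b2
    return 1.0 / (1.0 + math.exp(-z))

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def loss(params):
    # Mean squared error over the XOR truth table.
    return sum((forward(params, x) - y) ** 2 for x, y in XOR) / len(XOR)

def pso(f, dim, n_particles=40, iters=2000, lb=-5.0, ub=5.0,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    pos = [[rng.uniform(lb, ub) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lb), ub)
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, best_loss = pso(loss, dim=9)
print("final loss:", best_loss)
```

Each particle is one candidate weight vector, so the whole swarm can be evaluated in parallel with no gradients — which is exactly what makes this shape of problem a good fit for a GPU implementation.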
cc @ChrisRackauckas