Domsall closed this 3 months ago
LGTM! I'll do some testing and merge asap
I'm having a bit of trouble getting nsde to work; I haven't tried the other ones yet. Does
--action=calibrate ${workspaceFolder}/test_data/ ${workspaceFolder}/test_data/result/ nsde --population-size=200 --max-iter=10 --force-selection --param-keys=speedFactor,minGap,accel,decel,startupDelay,tau,delta,tpreview,tPersDrive,tPersEstimate,treaction,ccoolness,jerkmax,epsilonacc,taccmax,Mflatness,Mbegin --mop=distance,acceleration --gof=theils_u
work for you?
I think it's because of pymoo/core/problem.py:default_shape, see line 453:
F=(n, problem.n_obj),
We need to return the RMSE error in the vectorized target for each MOP; this change is not trivial.
So for using distance,acceleration with pop size 200, we need an output shape of (200, 2).
I will have to look into how we could rewrite measure_of_performance_factory to support that, but I think the simplest approach is to remove the scipy and pygad optimization options and rewrite the optimization factories.
How do you feel about removing those and solely depending on pymoo(de)?
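To make the shape requirement concrete, here is a minimal sketch (invented function and placeholder error, NumPy only) of the kind of vectorized evaluation pymoo expects:

```python
import numpy as np

def evaluate_population(params, mops):
    """Return one error column per MOP for the whole population.

    Hypothetical sketch: params is a (pop_size, n_params) array of candidate
    parameter sets, mops a list like ["distance", "acceleration"]. pymoo's
    default_shape expects F = (n, problem.n_obj), so with pop size 200 and
    two MOPs the result must have shape (200, 2).
    """
    F = np.empty((params.shape[0], len(mops)))
    for j, mop in enumerate(mops):
        # Placeholder error: the real code would compute the chosen GoF
        # (e.g. RMSE) of simulated vs. observed data for this MOP.
        F[:, j] = np.linalg.norm(params, axis=1) / (j + 1)
    return F

F = evaluate_population(np.random.rand(200, 17), ["distance", "acceleration"])
assert F.shape == (200, 2)  # one row per candidate, one column per MOP
```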
It works perfectly with NRMSE. I only used that GoF because it performs best according to the literature.
Do you have an idea where to calculate the relative error of each parameter combination and put the results into the csv files? I would like to have the errors of distance, speed, and acceleration for each combination to compare different GoFs and results.
@Domsall can you check if 3f787fe works for you? I changed the MOP factory so that multiple measures of performance can either be summed or returned as a tuple, the latter being used for pymoode & pymoo. Those should now work with Theil's U and the others.
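Roughly, the factory change might look like this (a sketch with invented names and a toy RMSE, not the actual commit):

```python
def measure_of_performance_factory(mops, aggregate="sum"):
    """Build an error function over several MOPs (hypothetical sketch).

    aggregate="sum"   -> one scalar, for single-objective backends
    aggregate="tuple" -> one error per MOP, for pymoo/pymoode
    """
    def error(simulated, observed, gof):
        errors = [gof(simulated[m], observed[m]) for m in mops]
        return sum(errors) if aggregate == "sum" else tuple(errors)
    return error

def rmse(a, b):  # toy goodness-of-fit for the example
    return (sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)) ** 0.5

# Made-up trajectories: acceleration matches exactly, distance is off by 0.5
sim = {"distance": [1.0, 2.0], "acceleration": [0.1, 0.2]}
obs = {"distance": [1.5, 2.5], "acceleration": [0.1, 0.2]}

summed = measure_of_performance_factory(["distance", "acceleration"])(sim, obs, rmse)
per_mop = measure_of_performance_factory(["distance", "acceleration"], aggregate="tuple")(sim, obs, rmse)
# summed equals per_mop[0] + per_mop[1]
```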
Do you mean that you want not just the summed error (Error(distance)+Error(acceleration)), but also separate entries for each measure-of-performance error per iteration? So instead of
,iteration,weightedError,convergence,speedFactor,minGap,accel,decel,emergencyDecel,startupDelay,tau,delta,stepping,tpreview,tPersDrive,tPersEstimate,treaction,ccoolness,sigmaleader,sigmagap,sigmaerror,jerkmax,epsilonacc,actionStepLength,taccmax,Mflatness,Mbegin,leader,follower,recordingId,algorithm,pop-size,objectives,paramKeys,weights,gof
0,1,0.0788671198053691,-1,1.2855707451145828,1.318680612079782,1.5195624232108904,2.0885433641765334,15,0.2711937168822218,1.7876845439698485,1.5757631151187173,0.25,7.620127425618868,5.557696634958655,13.005749369333241,0.23046687465248375,0.3215290425864744,0.0001,0.0001,0.0001,4.446657923678764,2.286665232349671,0.0001,1.4450888727958722,1.877283706894736,1.080154097499501,122,148,1,nsde,200,"distance,acceleration","speedFactor,minGap,accel,decel,startupDelay,tau,delta,tpreview,tPersDrive,tPersEstimate,treaction,ccoolness,jerkmax,epsilonacc,taccmax,Mflatness,Mbegin",,theils_u
you want something like this:
,iteration,weightedError,ERROR(distance),ERROR(acceleration),convergence,speedFactor,minGap,accel,decel,emergencyDecel,startupDelay,tau,delta,stepping,tpreview,tPersDrive,tPersEstimate,treaction,ccoolness,sigmaleader,sigmagap,sigmaerror,jerkmax,epsilonacc,actionStepLength,taccmax,Mflatness,Mbegin,leader,follower,recordingId,algorithm,pop-size,objectives,paramKeys,weights,gof
If not, please make an example csv header and result for the desired entries.
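Trimmed to the relevant columns, such a row could be written like this (a sketch; the error values are made-up placeholders):

```python
import csv
import io

# Only the columns relevant to the question; the real file has the full
# parameter header shown above. Error values are invented placeholders.
fieldnames = ["iteration", "weightedError", "ERROR(distance)",
              "ERROR(acceleration)", "gof"]
row = {"iteration": 1, "weightedError": 0.0789,
       "ERROR(distance)": 0.0511, "ERROR(acceleration)": 0.0278,
       "gof": "theils_u"}

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=fieldnames)
writer.writeheader()
writer.writerow(row)
print(buf.getvalue())
```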
To your second bullet point: Yes, absolutely. And the error should be the relative error.
The other changes look great and should work like that. I will test and compare after you have added the relative error and merged everything.
> To your second bullet point: Yes, absolutely. And the error should be the relative error.
So for NRMSE with distance and acceleration as measures of performance, you mean relative to their total error,
rel_error_distance = NRMSE(distance) / (NRMSE(distance) + NRMSE(accel))
?
Giving it more thought, I think the best idea is to simply always output the distance/speed/acceleration error of the chosen GoF next to the weightedError. That way any relative error can be calculated after the optimization.
EDIT: These errors are already in "all_results", so there may be no need to change anything
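Since the per-MOP errors are already in "all_results", the relative error can be computed as a post-processing step, e.g. with a small helper like this (hypothetical sketch):

```python
def relative_errors(errors):
    """Normalize per-MOP errors so they sum to 1 (post-processing sketch).

    errors: mapping of MOP name -> error from one iteration, e.g. taken
    from the "all_results" output mentioned above.
    """
    total = sum(errors.values())
    return {mop: e / total for mop, e in errors.items()}

# Made-up example values: distance error 3.0, acceleration error 1.0
rel = relative_errors({"distance": 3.0, "acceleration": 1.0})
assert rel == {"distance": 0.75, "acceleration": 0.25}
```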
To use nsga2, we need to add pymoo or a newer pygad solution.
I found a bug using the pygad-solution that I could only solve by changing the pygad code: https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/261
I also added many parameters to the command line so that values can be changed without modifying the code.
Solves #10