lsoucasse opened this issue 4 days ago
There are two solutions to your problem of needing to do so many steps for calculating a single track. One is to have a timestep that changes simply based on age and not any other criterion. In older rotation models that I ran, I would have the timestep increase at 10, 100, and 1000 Myr. The other option would be to use RKF or Rosenbrock without a dynamic timestep. These are not mutually exclusive either.
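For illustration, a purely age-based schedule could look something like the sketch below; the thresholds and step sizes are only placeholders, not tested values:

def _dAgeFromAge(Age):
    """Return a fixed timestep (in Myr) that depends only on the current age (placeholder values)."""
    # Coarser steps as the star ages and the rotation evolves more slowly
    if Age < 10.0:
        return 0.1
    elif Age < 100.0:
        return 1.0
    elif Age < 1000.0:
        return 10.0
    else:
        return 50.0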
The way RKF and Rosenbrock calculate the timestep is by taking a desired error (the difference between the real value at the end of the step and the achieved value at the end of the step) as a parameter and modifying the timestep to try to achieve that accuracy: the timestep is made longer if a step is more accurate than needed, and the step is repeated with a shorter timestep if it was not accurate enough. The reason this cannot be done with forward Euler or RK4 is that there is no good way to approximate the error, since one does not know the real value at the end of the step. RKF and Rosenbrock get around this by taking the step with both a more accurate and a less accurate version of the method and using the difference between the two as an approximation to the error. The more accurate method is then used to do the actual update.
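Schematically, that embedded error control can be written as below; the tolerance, safety factor, and method order here are generic placeholders rather than the values used in the actual solver:

import numpy as np

def _AdaptTimestep(XLow, XHigh, dAge, tol=1.0e-5, safety=0.9, order=4):
    """Decide whether to accept a step given low- and high-order estimates of the result,
    and suggest the next timestep (generic embedded-method controller)."""
    # Error estimate from the difference between the two embedded solutions
    Error = max(np.max(np.abs(np.asarray(XHigh) - np.asarray(XLow))), 1.0e-30)
    # Standard controller: grow the step if the error is below tol, shrink and repeat otherwise
    dAgeNew = safety * dAge * (tol / Error)**(1.0 / (order + 1))
    Accept = Error <= tol
    return Accept, dAgeNew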
What you can do is just take the more accurate method and use it with a fixed timestep. It would still have the desirable property of handling larger timesteps, since it is the method that does the actual numerical integration anyway; all the other machinery is just about tuning the timestep.
I don't have any time to test it, but I think one can simply change the function _EvolveRotationStepRB (for Rosenbrock, and something similar for RKF) to the function below. All I did here was delete the adaptive-timestep lines to make it simpler, fix the docstring, and replace the deepcopy of X with a plain copy; since the timestep is now fixed, the returned dAgeNew is simply the input dAge.
def _EvolveRotationStepRB(Mstar=None, Age=None, OmegaEnv=None, OmegaCore=None,
                          dAgeMax=None, dAge=None, params=params.paramsDefault, StarEvo=None):
    """Takes basic stellar parameters, evolves rotation by one fixed timestep using the Rosenbrock method."""

    # Get coefficients for solver
    CoefficientsRB = params['CoefficientsRB']

    # Setup array to hold integration values
    X = np.array([OmegaEnv, OmegaCore])
    nVar = len(X)

    # Get Jacobian and assume it is a constant over the timestep
    Jac = _JacobianRB(Mstar, Age+dAge, X, nVar, params, StarEvo)

    # Calculate the k coefficients
    kCoeff = _kCoeffRB(Mstar, Age, dAge, X, Jac, nVar, CoefficientsRB, params, StarEvo)

    # Get updated values (a plain copy is enough since X is a 1D float array)
    Xnew = X.copy()
    for i in range(0, CoefficientsRB['s']):
        Xnew += CoefficientsRB['b'][i] * kCoeff[:, i]

    # Save new estimate
    OmegaEnv = Xnew[0]
    OmegaCore = Xnew[1]

    # With a fixed timestep, the suggested next timestep is just the current one
    dAgeNew = dAge

    return dAge, dAgeNew, OmegaEnv, OmegaCore
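A fixed-timestep driver around this function might then look roughly like the sketch below; the wrapper name and its arguments are just an illustration of how it could be called, not something that already exists in the code:

def _EvolveRotationFixedStep(Mstar, AgeMin, AgeMax, OmegaEnv, OmegaCore, dAgeFixed, params, StarEvo):
    """Evolve the rotation rates from AgeMin to AgeMax using a fixed timestep dAgeFixed."""
    Age = AgeMin
    while Age < AgeMax:
        # Do not step past the final age
        dAge = min(dAgeFixed, AgeMax - Age)
        dAge, dAgeNew, OmegaEnv, OmegaCore = _EvolveRotationStepRB(
            Mstar=Mstar, Age=Age, OmegaEnv=OmegaEnv, OmegaCore=OmegaCore,
            dAgeMax=None, dAge=dAge, params=params, StarEvo=StarEvo)
        Age += dAge
    return OmegaEnv, OmegaCore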
Thanks @ColinPhilipJohnstone for the input! I think I got the idea, but how would you fix the time grid such that it has many fewer points than the FE scheme? Would you use a constant increasing factor ('dAgeFactor = 1.5') with a cap at 'dAgeMax', or a fixed timestep that you increase by a factor of 10 (?) at 10, 100, and 1000 Myr?
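For the first option, I have in mind something like the sketch below (the default values are just placeholders based on the numbers above, not tuned values):

import numpy as np

def _BuildAgeGrid(AgeMin=1.0, AgeMax=5000.0, dAgeStart=0.1, dAgeFactor=1.5, dAgeMax=100.0):
    """Build an age grid (in Myr) whose spacing grows by dAgeFactor at each step, capped at dAgeMax."""
    AgeGrid = [AgeMin]
    dAge = dAgeStart
    while AgeGrid[-1] < AgeMax:
        # Never step past the final age
        AgeGrid.append(min(AgeGrid[-1] + dAge, AgeMax))
        dAge = min(dAge * dAgeFactor, dAgeMax)
    return np.array(AgeGrid)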
Either way is good I think. Just do some convergence testing to make sure your timestep is small enough that making it smaller does not change the result.
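A minimal convergence check could be something like the function below, where RunTrack is a placeholder for whatever routine evolves a full track with a given fixed timestep and returns the final rotation rates:

import numpy as np

def _ConvergenceTest(RunTrack, dAgeFixed, RelTol=1.0e-3):
    """Check that halving the timestep does not significantly change the final rotation rates."""
    OmegaCoarse = np.asarray(RunTrack(dAgeFixed))
    OmegaFine = np.asarray(RunTrack(0.5 * dAgeFixed))
    # Relative difference between the coarse and fine runs
    RelDiff = np.max(np.abs(OmegaFine - OmegaCoarse) / np.abs(OmegaFine))
    return RelDiff < RelTol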
OK, sounds doable. I will run some tests to check accuracy and robustness. Thank you for the tips! Will let you know about the results.
We opted for the ForwardEuler scheme for its robustness (fixed timestepping) but it is rather slow as it uses a fine temporal grid (10,000 points).
An alternative would be to derive a fixed-timestepping version of the Rosenbrock scheme to benefit from both speed and robustness.