Closed — stochasticguy closed this 10 months ago
Hello, sorry this email got buried.
It actually has a rather simple explanation: you're using the same seed for all solves 😅. For an ensemble problem, any keyword argument passed to `solve` is forwarded to all of the constituent solves. This is so that

```julia
solve(enprob, alg; trajectories = N, abstol = 1e-8)
```

is a convenient way to set the tolerance (among other solver options) for all solutions. That behavior is documented at https://docs.sciml.ai/DiffEqDocs/stable/features/ensemble/. So by passing `seed` as a keyword argument to `solve`, you're setting that seed for every one of the solves, and thus they all use the same random numbers. If you plot the ensemble solution this is clear.
The correct way to do this is to use the `prob_func` to specify how the individual problems should differ. For example:

```julia
function prob_func(prob, i, repeat)
    remake(prob, seed = i)
end
```

makes the seed equal to `i` on the i-th trajectory. You may want to change that up a bit of course, e.g. drawing random seeds, but it shows how to force the seed to be different per trajectory. Then you just do:
```julia
ens_pil = EnsembleProblem(sde_pil, prob_func = prob_func,
                          output_func = (sol, i) -> (relu(sol, (K = K, τ = 1.0)), false))
```
and you see the convergence restored.
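If you'd rather not tie the seed to the trajectory index, one option is to draw an independent random seed per problem. A minimal sketch (assuming the same `prob_func` hook as above; `remake` comes from SciMLBase):

```julia
using Random

# Draw a fresh random seed for each trajectory instead of reusing one
# global seed. The seeds come from Julia's global RNG, so every
# trajectory's noise process is seeded differently.
function prob_func(prob, i, repeat)
    remake(prob, seed = rand(UInt64))
end
```

Note that with this variant the ensemble as a whole is no longer reproducible unless you also seed the global RNG (e.g. `Random.seed!(1234)`) before calling `solve`.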
So in summary: StochasticDiffEq.jl lets you set the seed, and if you set the same seed for an entire Monte Carlo simulation you won't get the values you expect. You instead need to make sure different trajectories get different random numbers, by setting the seed per problem rather than globally for all problems.
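The underlying effect can be seen with Julia's RNG alone, independent of StochasticDiffEq.jl:

```julia
using Random

# Two RNGs seeded identically produce identical streams — exactly what
# happens when every trajectory in an ensemble receives the same `seed`.
a = rand(MersenneTwister(1234), 5)
b = rand(MersenneTwister(1234), 5)
println(a == b)   # true: the "independent" trajectories are copies

# Seeding each stream differently restores independent randomness.
c = rand(MersenneTwister(1), 5)
d = rand(MersenneTwister(2), 5)
println(c == d)   # false
```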
Hopefully that is clear.
Hi everyone,
First of all, I wanted to congratulate you on this amazing package.
I'm having issues with the solution of a Stochastic Differential Equation (SDE) when passing the seed kwarg to the solve method.
When I don't pass the seed kwarg, the results I get are reasonable. When I do pass it, the results are strange (i.e. I get results that are either zero or far from the expected value). I tried out different random number generators, and the results are the same in the sense that they are far from the expected ones.
I know that this problem is stochastic, so I don't expect the results to be identical when I use different seeds, RNGs, etc. But I'm surprised by how far off they are.
Below there's an example.