This makes ODEModel a subclass of CallableNumericalModel, which magically enables using constraints with ODEModels.
In addition, to preserve existing behaviour, the finite_difference method is moved to CallableModel: anything we can call, we can approximate the Jacobian for. This in turn means we now always generate a covariance matrix. Very shiny. The downside is that I had to touch everything that made decisions based on whether something had a jacobian/hessian: every hasattr(x, 'eval_jacobian'/'eval_hessian') check was changed to an appropriate isinstance(x, (Gradient|Hessian)(Model|Objective)) check, to avoid using the finite-difference method during minimization, for which it is far too slow.
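The hasattr-to-isinstance switch can be sketched in miniature. This is a hypothetical stand-in, not symfit's actual class hierarchy: CallableModel, GradientModel, and use_analytic_jacobian here are invented names that only illustrate why, once finite differencing lives on the base class, hasattr(x, 'eval_jacobian') is always True and no longer discriminates.

```python
# Hypothetical sketch of the pattern, NOT symfit's actual code.

class CallableModel:
    """Anything callable; the Jacobian can always be approximated."""

    def __init__(self, func):
        self.func = func

    def eval_jacobian(self, x, h=1e-6):
        # Central finite-difference approximation of df/dx: fine for a
        # covariance matrix at the end of a fit, but far too slow to call
        # inside a minimizer's inner loop.
        return (self.func(x + h) - self.func(x - h)) / (2 * h)


class GradientModel(CallableModel):
    """A model that also carries a cheap analytic Jacobian."""

    def __init__(self, func, jac):
        super().__init__(func)
        self.jac = jac

    def eval_jacobian(self, x):
        return self.jac(x)


def use_analytic_jacobian(model):
    # hasattr(model, 'eval_jacobian') is now True for every model, so a
    # minimizer must check the type to avoid the finite-difference path.
    return isinstance(model, GradientModel)
```

Both model kinds answer eval_jacobian, so only the isinstance check tells the minimizer which ones are cheap enough to differentiate on every iteration.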
As a final nit, remove setting of bounds on fixed parameters, since this messes with minimizer selection and doesn't add value.
Fixes #282