SciML / DiffEqParamEstim.jl

Easy scientific machine learning (SciML) parameter estimation with pre-built loss functions
https://docs.sciml.ai/DiffEqParamEstim/stable/

re-establish build_lsoptim_objective() #221

Open j-fu opened 1 year ago

j-fu commented 1 year ago

Hi,

IMHO, removing build_lsoptim_objective() breaks existing code unnecessarily. Moreover, it used to appear in the usage examples, and I could not find a clear upgrade recipe for the current version.
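
For reference, the kind of code that breaks looks roughly like this (reconstructed from the old documentation, so the exact names and keyword arguments are approximate):

```julia
using OrdinaryDiffEq, DiffEqParamEstim, LeastSquaresOptim

# Simple logistic-growth test problem with one unknown parameter
f(du, u, p, t) = (du[1] = p[1] * u[1] - u[1]^2)
prob = ODEProblem(f, [0.5], (0.0, 5.0), [1.5])
t = collect(range(0.0, 5.0; length = 50))
data = Array(solve(prob, Tsit5(); saveat = t))  # synthetic "measurements"

# The removed helper: returns an in-place residual function f!(out, p)
cost = build_lsoptim_objective(prob, t, data, Tsit5())
res = optimize!(LeastSquaresProblem(x = [1.3], f! = cost,
                                    output_length = length(t) * length(prob.u0)),
                LeastSquaresOptim.Dogleg(LeastSquaresOptim.LSMR()))
```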

Vaibhavdixit02 commented 1 year ago

Since the change was made in a major release, it shouldn't break code unless you updated the package deliberately, of course. In that case, if you need the functionality, it should be fine to use the older version of the package?
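
A minimal sketch of pinning to the older release, assuming the helper was removed at the 2.0 major version (worth double-checking against the release notes):

```julia
using Pkg
# "1" requests the latest 1.x release; the exact version that last
# exported build_lsoptim_objective() is an assumption here.
Pkg.add(name = "DiffEqParamEstim", version = "1")
```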

j-fu commented 1 year ago

Sure. But the argument is: why remove this functionality when it is implemented with one short function and does not even create a dependency or need an extension? IMHO this just decreases composability with no gains, or otherwise pins packages (and teaching examples, etc.) to old versions. See the sketch below.
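
For example, here is a rough reconstruction in user code, based on the old behavior of writing residuals into `out` per LeastSquaresOptim's `f!` convention (details approximate, not the package's exact source):

```julia
using OrdinaryDiffEq  # brings solve and remake into scope

# Hypothetical stand-in for the removed helper: returns a closure that
# fills `out` with residuals (simulation minus data) at the save times.
function my_lsoptim_objective(prob, t, data, alg; kwargs...)
    (out, p) -> begin
        _prob = remake(prob; p = p)
        sol = solve(_prob, alg; saveat = t, save_everystep = false,
                    dense = false, kwargs...)
        out .= vec(Array(sol)) .- vec(data)
    end
end
```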

ChrisRackauckas commented 1 year ago

No gains? There are some very clear gains in terms of maintainability, composability, clarity, and documentation in that update. DiffEqParamEstim was updated to retarget to the Optimization.jl interface rather than having its own weird hybrid of AD support and abstract optimization library targeting. By doing so, it cut out a whole lot of code and centralized around what is now a well-documented optimizer interface. Thus all of the small objective functions were kicked out.
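
For comparison, the currently documented path, sketched (the solver keyword arguments here are illustrative assumptions):

```julia
using OrdinaryDiffEq, DiffEqParamEstim, Optimization, OptimizationOptimJL

f(du, u, p, t) = (du[1] = p[1] * u[1] - u[1]^2)
prob = ODEProblem(f, [0.5], (0.0, 5.0), [1.5])
t = collect(range(0.0, 5.0; length = 50))
data = Array(solve(prob, Tsit5(); saveat = t))

# build_loss_objective now returns an OptimizationFunction, so any
# optimizer behind the Optimization.jl interface can consume it.
obj = build_loss_objective(prob, Tsit5(), L2Loss(t, data),
                           Optimization.AutoForwardDiff();
                           maxiters = 10_000, verbose = false)
optprob = OptimizationProblem(obj, [1.3])
sol = solve(optprob, BFGS())
```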

Why should a parameter estimation package have to bend to the API of every optimization library, especially when there is an optimization library that covers all of these interfaces? Instead of keeping one hacky workaround here that makes this case work but, for example, doesn't make DiffEqFlux compatible with it, LeastSquaresOptim should be added to Optimization.jl's interface, so that it becomes available through the same documented API as every other optimizer.

That's a pretty major improvement, and I don't think we want to compromise it with a hacky single function that goes around the centralized APIs. As the ecosystem has evolved, things have become more structured, and DiffEqParamEstim was just the outlier that was behind in adopting the more general structures.

> otherwise pins packages (and teaching examples, etc.) to old versions.

That's fine; manifests still work. But for everyone else, we very much hope they are teaching with the single documented Optimization.jl package rather than an undocumented collage of optimization packages with subtly different interfaces. I think clear documentation would serve students best.