SciML / Optimization.jl

Mathematical Optimization in Julia. Local, global, gradient-based and derivative-free. Linear, Quadratic, Convex, Mixed-Integer, and Nonlinear Optimization in one simple, fast, and differentiable interface.
https://docs.sciml.ai/Optimization/stable/
MIT License

US spelling preferred? #729

Closed PallHaraldsson closed 3 months ago

PallHaraldsson commented 3 months ago

@Vaibhavdixit02 I just learned of (which is great seemingly, except the name): https://github.com/SciML/Optimization.jl/tree/master/lib/OptimizationOptimisers

It's relatively new, 2 months old (not 2 years; the 2-year-old date is only for the LICENSE, which I think doesn't apply).

We do have British spelling in the ecosystem: https://github.com/FluxML/Optimisers.jl

but it seems the worst of both worlds to have a mix of US and British spelling as you do now. Could it be renamed to OptimizationOptimizers, or are we stuck with the s? Or even renamed to Sophia.jl, if possible, if that's the main or only optimizer there?

I recently learned of Sophia, and it seems like a great choice for the future (until something even better is made, which I can't rule out). At least, do you know if this is a good/fast Julia implementation, or is there some reason to rather use e.g. the Python one?

More generally, about renaming (sub)packages: is it worth it, and should we have a policy of registering US spellings only from now on, renaming existing packages, or even allowing both spellings?

Vaibhavdixit02 commented 3 months ago

> It's relatively new, 2 months old (not 2 years, only for the LICENCE, think it doesn't apply).

Nope, it is 2 years old, haha.

> but it seems the worst of both worlds to have a mix of US and British as you do, so could it be renamed to OptimizationOptimizers, or are we stuck with the s? Or even rename to Sophia.jl if possible if that's the main or only one there? More generally, about renaming (sub)packages. is it worth it, and should we have a policy of registering US only from now on, or renaming to, or even allow both spellings...?

The point is that it's a wrapper over the Optimisers.jl package you linked above; we follow the convention `Optimization*`, where `*` is the wrapped package's name.
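To make the naming concrete, here is a minimal sketch of how the wrapper package is used, following the documented Optimization.jl interface (the Rosenbrock objective and hyperparameters below are just illustrative choices, not from this thread):

```julia
using Optimization, OptimizationOptimisers

# Classic Rosenbrock test function; p holds the (a, b) parameters.
rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2

# Optimisers.jl rules are gradient-based, so an AD backend is required.
optf = OptimizationFunction(rosenbrock, Optimization.AutoZygote())
prob = OptimizationProblem(optf, zeros(2), [1.0, 100.0])

# Optimisers.Adam comes from the wrapped Optimisers.jl package;
# the z/s spelling split here is exactly the mix discussed above.
sol = solve(prob, Optimisers.Adam(0.01), maxiters = 1000)
```

So `OptimizationOptimisers` keeps the wrapped package's own (British) spelling in its name, which is why renaming it alone would break the convention.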

> I recently learned of Sophia and it seems great the future for all. Until even better made that I can't rule out... At least do you know if it's a good/fast Julia implementation, or some reason to use e.g. rather Python?

It isn't the best possible implementation, but it shouldn't be terrible either; performance depends a lot on the AD backend. The Hessian-vector product support here has some limitations but might improve in the near future.
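Since Sophia came up, here is a hedged sketch of trying it through the same interface. This assumes Sophia is exposed as `Optimization.Sophia` with an `η` keyword, which may differ between versions (check the docs for its current location and options); the objective and hyperparameters are again purely illustrative:

```julia
using Optimization

rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2

# Sophia uses Hessian-vector products, so the AD backend matters a lot;
# AutoZygote is one backend that supports them here.
optf = OptimizationFunction(rosenbrock, Optimization.AutoZygote())
prob = OptimizationProblem(optf, zeros(2), [1.0, 100.0])

# Hypothetical hyperparameters; Sophia's constructor and keywords may
# change between releases, so consult the Optimization.jl docs.
sol = solve(prob, Optimization.Sophia(η = 0.01), maxiters = 1000)
```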