JuliaNLSolvers / LineSearches.jl

Line search methods for optimization and root-finding

Documentation? #6

Open anriseth opened 8 years ago

anriseth commented 8 years ago

We should create documentation, or at the very least explain the input and output structure for the linesearch algorithms.

axsk commented 7 years ago

Maybe for starters just a basic example given an f, x0 and a search direction g?

anriseth commented 7 years ago

> Maybe for starters just a basic example given an f, x0 and a search direction g?

That's a good idea. The examples can be copied from the test files. PRs are welcome ;)
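As a sketch of what such an example could look like: assuming the call signature that appears later in this thread, `ls(ϕ, dϕ, ϕdϕ, α0, ϕ0, dϕ0)`, and using a hypothetical quadratic objective with a steepest-descent direction (none of which come from the package itself), a minimal backtracking run might be:

```julia
using LineSearches, LinearAlgebra

# Hypothetical objective and gradient (for illustration only)
f(x) = sum(abs2, x)                    # f(x) = ‖x‖²
g(x) = 2 .* x

x0 = [1.0, 2.0]
s  = -g(x0)                            # steepest-descent search direction

# Univariate restriction ϕ(α) = f(x0 + α*s) and its directional derivative
ϕ(α)   = f(x0 .+ α .* s)
dϕ(α)  = dot(g(x0 .+ α .* s), s)
ϕdϕ(α) = (ϕ(α), dϕ(α))

ls = BackTracking()
α0 = 1.0
α, ϕα = ls(ϕ, dϕ, ϕdϕ, α0, ϕ(0.0), dϕ(0.0))
```

Here α is the accepted step length and ϕα = f(x0 + α*s); the exact return values may differ between algorithms, which is part of what the documentation should pin down.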

pkofod commented 7 years ago

I should maybe add, @axsk, that while the functionality in here is certainly useful more generally, the package is a helper package for Optim and NLsolve, so some of the API might seem a bit alien if you're not familiar with the codebases in those packages.

simonbatzner commented 6 years ago

Any updates on this? It would be great to have documentation on how to use the package outside of Optim and NLsolve.

anriseth commented 6 years ago

We are in the middle of transitioning the code to a more "non-Optim-friendly" API. See #90

It should be quite straightforward to introduce some usage examples once we are finished with that.

EDIT: Both @pkofod and I are quite busy with other things this month, but I hope to have some time in June to sort out some missing JuliaNLSolvers things.

pkofod commented 6 years ago

I just want to chime in: it's actually the right approach to bump :) We're just super busy, and not many people have actually contributed to this package or have a clear idea where it's going, so you'll either have to help us out, or wait a bit longer. As Asbjørn said, we're both facing somewhat hard constraints and deadlines outside of OSS development, but nothing is abandoned.

longemen3000 commented 5 years ago

I want to contribute, if it's possible. What can I do?

anriseth commented 5 years ago

Hi @longemen3000, thank you for the interest in helping out :)

NB: I have little knowledge of the current needs of the JuliaNLSolvers group, @pkofod can probably direct you more efficiently in the best direction

Are you particularly interested in contributing to the documentation, or more generally?

If documentation:

If any type of contribution:

longemen3000 commented 5 years ago

I forked the package and I'm standardizing the code (changing convert(T,0) to zero(T), convert(T,1) to one(T), etc.). The next step is standardizing the interpolation functions used in the package (quadratic, etc.), and trying to eliminate repeated code.
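For reference, the two spellings agree for the standard numeric types; zero and one just read better and express the intent generically:

```julia
# zero(T)/one(T) are equivalent to convert(T, 0)/convert(T, 1)
# for the standard numeric types, and preserve the type T.
for T in (Float64, Float32, Int)
    @assert zero(T) == convert(T, 0) && zero(T) isa T
    @assert one(T)  == convert(T, 1) && one(T)  isa T
end
```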

pkofod commented 5 years ago

> I forked the package and I'm standardizing the code (changing convert(T,0) to zero(T), convert(T,1) to one(T), etc.). The next step is standardizing the interpolation functions used in the package (quadratic, etc.), and trying to eliminate repeated code.

Looking forward to seeing what you come up with, but keep in mind that someone has to read the diffs/changes, so it's often much easier to get something merged if you stick to one specific change at a time. I'd rather have one PR for the converts, one for the quadratic roots, etc., than one giant PR that has many great changes but is difficult to isolate and review.

pkofod commented 5 years ago

> I forked the package and I'm standardizing the code (changing convert(T,0) to zero(T), convert(T,1) to one(T), etc.). The next step is standardizing the interpolation functions used in the package (quadratic, etc.), and trying to eliminate repeated code.

Hi @longemen3000 If you're working on this, please reach out to me on slack, discourse or if you can find my e-mail. I want to make sure you're not doing anything that will be voided by near-future changes to all of JuliaNLSolvers.

baggepinnen commented 4 years ago

I'm trying to figure out how to set a maximum step length when optimizing with Optim, but it's quite hard. The keyword arguments to the different line-search methods do not seem to be documented, and poking around in the code I sometimes find αmax and sometimes alphamax etc. Example:

```julia
@with_kw struct HagerZhang{T, Tm}
...
   alphamax::T = Inf

@with_kw struct InitialHagerZhang{T}
...
    αmax::T        = Inf
```

Should I set one of those?

anriseth commented 4 years ago

Hi @baggepinnen, is this to use with Optim or with a different package?

The BackTracking line search will only decrease the step size from the initial guess you pass it. If your optimization problem is properly scaled, you can use backtracking with a static initial step length = 1 (or some other step length that you determine to ensure you don't exceed your maximum step length). If you have to use HagerZhang, then set alphamax.

The initial step length functionality is a pre-processing step; you can find out how it's used in the Optim source code. If you need to use the InitialHagerZhang procedure to decide the initial step length, then you should also set αmax.
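Putting the two settings together for use through Optim, a sketch might look like the following (keyword names taken from the struct definitions quoted above; the objective and the 0.5 cap are arbitrary illustrations, not anything prescribed by the package):

```julia
using Optim, LineSearches

f(x) = sum(abs2, x)   # hypothetical objective for illustration
x0 = [1.0, 2.0]

# Cap the step length both in the initial-step guess (αmax) and
# in the Hager-Zhang line search itself (alphamax).
algo = LBFGS(alphaguess = InitialHagerZhang(αmax = 0.5),
             linesearch = HagerZhang(alphamax = 0.5))

res = optimize(f, x0, algo)
```

Whether one, the other, or both caps bind depends on the problem; setting both is the safe choice if a hard maximum step length matters.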

Disclaimer: I haven't used this package / Julia for nearly two years.

pkofod commented 4 years ago

> Should I set one of those?

There's no uniform interface across all options here I'm afraid, but yes I think alphamax is what you want. But to repeat Asbjørn's question, what are you doing more specifically?

> Disclaimer: I haven't used this package / Julia for nearly two years.

But you're still faster than me at help desk tasks. I like it :)

RossBoylan commented 1 year ago

My contribution to the documentation is to explain what I don't get.

It looks as if the central operation is `res = (ls())(ϕ, dϕ, ϕdϕ, α0, ϕ0, dϕ0)`. I would expect there to be API documentation and, ideally, introductory material, explaining what each of the right-hand side arguments is, including the signatures and the semantics of those functions, and what res is. In a pinch it might refer to the Optim documentation, which I suppose is where the argument naming conventions and semantic expectations come from.

Among the semantic questions is whether the search is seeking to minimize or maximize the objective function (or is it looking for a zero?).

Some discussion of automatic differentiation would also be helpful; again, maybe a reference to Optim in a pinch.