uwhpsc-2016 / homework1

Homework #1

Exercise 2: Handling the case where f'(x_0) = 0 but is not a local minimum #38


rachka commented 8 years ago

Do we need to improve our algorithm so it finds the true local minimum, or is it sufficient to raise a ValueError saying that the x supplied to gradient descent is not actually a local minimum but an inflection point where the derivative is 0? In other words, what does "modify the function to account for this situation" in the problem statement for the automated tests specifically mean?

mvelegar commented 8 years ago

@rachka modifying your gradient descent function to account for this situation does mean "improving" your algorithm so it reaches the true local minimum. If the derivative evaluated at the input argument x is 0 (as you noted above) or near 0, can you think of a simple way to get out of the neighborhood of this guess x so that the algorithm may find a true minimum?
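One simple way to act on this hint (a sketch only, not the official solution; the function name, tolerances, and the `nudge` parameter are all made up for illustration) is to perturb the starting point whenever the derivative there is already (near) zero, so the iteration does not begin stuck on a critical point:

```python
def gradient_descent(fprime, x, sigma=0.1, tol=1e-8, maxiter=10000, nudge=1e-3):
    """Minimize a 1-D function given its derivative fprime.

    If the derivative at the initial guess is (near) zero -- an
    inflection point or local maximum as well as a minimum -- perturb
    the guess slightly so the iteration can escape the neighborhood.
    """
    if abs(fprime(x)) < tol:
        x += nudge  # hypothetical fix: step off the critical point
    for _ in range(maxiter):
        step = sigma * fprime(x)
        x -= step
        if abs(step) < tol:  # converged: derivative is (near) zero
            break
    return x
```

For example, with f(x) = x**4 - 2*x**2 (local maximum at 0, minima at x = ±1), starting exactly at x = 0 the nudge pushes the iterate off the maximum and it descends to the minimum at x = 1.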

cswiercz commented 8 years ago

To add to @mvelegar 's response: the algorithm as written can get stuck on inflection points and even local maxima, as you observed, if the point is given exactly. However, with even the tiniest bit of floating-point error you can potentially move away from that non-minimum critical point.
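This stuck-versus-escape behavior is easy to demonstrate (a toy example, with made-up names and tolerances; the offset 1e-8 stands in for "a bit of floating point error" and is chosen larger than true rounding noise so the run terminates quickly):

```python
def plain_descent(fprime, x, sigma=0.1, tol=1e-10, maxiter=10000):
    """Gradient descent with no safeguard against non-minimum critical points."""
    for _ in range(maxiter):
        step = sigma * fprime(x)
        if abs(step) < tol:
            break  # derivative (near) zero: a critical point, not necessarily a minimum
        x -= step
    return x

fprime = lambda x: 4*x**3 - 4*x  # f(x) = x**4 - 2*x**2: local max at 0, minima at +/-1

stuck = plain_descent(fprime, 0.0)     # f'(0) == 0 exactly, so the iterate never moves
escaped = plain_descent(fprime, 1e-8)  # a tiny offset drifts toward the minimum at x = 1
```

Started exactly at the local maximum the iterate stays put, while the perturbed start converges to the adjacent minimum.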

bmva commented 8 years ago

Does it matter which way we send the algorithm, or will your test suite handle both? If we start at a local maximum, I believe we could send it either left or right along x.

cswiercz commented 8 years ago

We will test on a simple example where the initial guess is a local maximum and check whether the result is one of the adjacent local minima.
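A hypothetical version of that check (this is not the course's actual test suite; the function, tolerances, and `nudge` are illustrative) could use f(x) = cos(x), which has a local maximum at x = 0 and adjacent minima at x = ±pi, and accept either minimum:

```python
import math

def gradient_descent(fprime, x, sigma=0.1, tol=1e-8, maxiter=10000, nudge=1e-3):
    """1-D gradient descent that nudges a zero-derivative starting point."""
    if abs(fprime(x)) < tol:
        x += nudge  # step off the local maximum (direction is arbitrary)
    for _ in range(maxiter):
        step = sigma * fprime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Start exactly at the local maximum of cos(x); f'(x) = -sin(x).
xmin = gradient_descent(lambda x: -math.sin(x), 0.0)
```

Whether the result lands on +pi or -pi depends only on the sign of the nudge, which is why a test like this should accept either adjacent minimum.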