Closed: hegman12 closed this issue 6 years ago
Hi @hegman12, thanks for your interest! You can think of it as gradient ascent, and you may even find that many authors play a little loose with these two terms. (We may have gotten used to "descent" because TensorFlow's `Optimizer` class's method for creating an optimization operation is called `minimize`.)
Either way, you should not end up confused by our imprecision. @colah, how do you feel about me just changing this particular instance to "gradient ascent"?
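To make the equivalence concrete, here is a minimal NumPy sketch (the one-dimensional objective and step size are made up for illustration, not taken from the article): ascent on an objective `f` and descent on `-f` take identical steps.

```python
import numpy as np

# Toy objective to maximize: f(x) = -(x - 3)^2, so f'(x) = -2(x - 3).
def grad_f(x):
    return -2.0 * (x - 3.0)

lr = 0.1
x_ascent = x_descent = 0.0
for _ in range(100):
    x_ascent = x_ascent + lr * grad_f(x_ascent)        # gradient ascent on f
    x_descent = x_descent - lr * (-grad_f(x_descent))  # gradient descent on -f

assert np.isclose(x_ascent, x_descent)  # the two trajectories coincide
print(x_ascent)  # ~3.0, the maximizer of f
```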
It seems like people in the deep learning community tend to frame optimization problems as a loss that you minimize, even when they're actually increasing things. So, descent seems a bit more conventional at this point. Additionally, discussion of optimization algorithms tends to use the descent terminology, so searching for that term is probably more helpful. Implementations, such as TensorFlow, also tend to focus on minimization.
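For instance, here is a rough sketch of that convention written against the TF 2.x API (this thread predates TF 2, so treat the API details as illustrative rather than what the article used): you maximize an objective by handing its negative to the optimizer as a "loss".

```python
import tensorflow as tf

x = tf.Variable(0.0)  # stand-in for the input being optimized
opt = tf.keras.optimizers.SGD(learning_rate=0.1)

for _ in range(100):
    with tf.GradientTape() as tape:
        objective = -(x - 3.0) ** 2  # the quantity we actually want to increase
        loss = -objective            # framed as a loss, so we can "descend"
    grads = tape.gradient(loss, [x])
    opt.apply_gradients(zip(grads, [x]))

print(x.numpy())  # ~3.0, the maximizer of the objective
```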
That said, I don't have a strong view and would be happy to defer to others if someone feels strongly.
(I'm going to close this for now, but happy to reopen if anyone feels strongly.)
In the introductory section, the article describes the process as gradient descent.
My understanding is that such optimization techniques use the gradient ascent algorithm to optimize the input image. Gradient descent can be used by updating the input in the direction opposite to the usual descent update, but that is nothing but gradient ascent. Could you please clarify?
Thank you for making your research openly available. It has been of immense help to my learning.