distillpub / post--differentiable-parameterizations

A powerful, under-explored tool for neural network visualizations and art.
https://distill.pub/2018/differentiable-parameterizations

Gradient Descent vs Gradient Ascent #85

Closed: hegman12 closed this issue 6 years ago

hegman12 commented 6 years ago

In the introductory section, the article says:

This kind of optimization is possible because the networks are differentiable with respect to their inputs: we can slightly tweak the image to better fit the desired properties, and then iteratively apply such tweaks in gradient descent.

My understanding is that such optimization techniques use the gradient ascent algorithm to optimize the input image. Gradient descent can be used if the input is updated in the direction opposite to the usual GD update, but that is nothing other than gradient ascent. Could you please clarify? (A small sketch of what I mean follows below.)
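Here is a minimal NumPy sketch of my point (my own example, not code from the article; the objective f(x) = -sum(x**2) is just a placeholder): one gradient-ascent step on f is the same update as one gradient-descent step on -f.

```python
import numpy as np

def objective_grad(x):
    # gradient of the placeholder objective f(x) = -sum(x**2)
    return -2.0 * x

x = np.random.randn(8)
lr = 0.1

# gradient ascent on f: move along +grad f
x_ascent = x + lr * objective_grad(x)

# gradient descent on -f: move along -grad(-f) = +grad f
x_descent = x - lr * (-objective_grad(x))

assert np.allclose(x_ascent, x_descent)
```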

Thank you for making your research available openly. It is of immense help for my learning.

ludwigschubert commented 6 years ago

Hi @hegman12, thanks for your interest! You can think of it as gradient ascent, and you may even find that many authors play a little loose with these two terms. (We may have become used to "descent" because the TensorFlow Optimizer class's method for creating an optimization operation is called minimize.)
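For concreteness, here is a minimal TensorFlow 2 sketch of that pattern (not the article's code; the objective is a stand-in for a neuron or channel activation): we write the loss as the negative of the objective and take descent steps, which is gradient ascent on the objective itself.

```python
import tensorflow as tf

def objective(image):
    # stand-in for a differentiable activation objective
    return tf.reduce_mean(tf.sin(image))

image = tf.Variable(tf.random.uniform([1, 64, 64, 3]))
learning_rate = 0.05

for step in range(100):
    with tf.GradientTape() as tape:
        loss = -objective(image)            # minimizing the negative ...
    grad = tape.gradient(loss, image)
    image.assign_sub(learning_rate * grad)  # ... is gradient ascent on the objective
```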

Either way, you shouldn't end up confused by our imprecision. @colah, how do you feel about me just changing this particular instance to "gradient ascent"?

colah commented 6 years ago

It seems like people in the deep learning community tend to frame optimization problems as a loss that you minimize, even when they're actually increasing things. So, descent seems a bit more conventional at this point. Additionally, discussion of optimization algorithms tends to use the descent terminology, so searching for that term is probably more helpful. Implementations, such as TensorFlow, also tend to focus on minimization.

That said, I don't have a strong view and would be happy to defer to others if someone feels strongly.

colah commented 6 years ago

(I'm going to close this for now, but happy to reopen if anyone feels strongly.)