jangsik-park / Study


[TensorFlow]_[Optimizer] #8

Open jangsik-park opened 3 years ago

jangsik-park commented 3 years ago

1) What is an Optimizer? (n411)

2) Comparing several Optimizers

[Adam, SGD, SGD (with momentum), AdaGrad, RMSProp]


a. GD = gradient descent

b. Stochastic gradient descent (SGD)

c. Momentum

d. AdaGrad

e. RMSProp

f. Adam
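The update rules of the optimizers listed above can be compared side by side. Below is a minimal NumPy sketch (not the TensorFlow/Keras implementations) that applies each rule to a simple 1-D quadratic loss f(x) = x²; all hyperparameter values here are illustrative defaults, not tuned settings.

```python
import numpy as np

def grad(x):
    # Gradient of the toy loss f(x) = x^2.
    return 2.0 * x

def run(update, steps=200, x0=5.0):
    # Apply one optimizer's update rule for a fixed number of steps.
    x, state = x0, {}
    for t in range(1, steps + 1):
        x = update(x, grad(x), state, t)
    return x

def sgd(x, g, s, t, lr=0.1):
    # Plain (stochastic) gradient descent: step against the gradient.
    return x - lr * g

def momentum(x, g, s, t, lr=0.1, beta=0.9):
    # Momentum: accumulate a velocity that smooths successive gradients.
    s['v'] = beta * s.get('v', 0.0) + g
    return x - lr * s['v']

def adagrad(x, g, s, t, lr=0.5, eps=1e-8):
    # AdaGrad: scale the step by the accumulated squared gradients.
    s['G'] = s.get('G', 0.0) + g * g
    return x - lr * g / (np.sqrt(s['G']) + eps)

def rmsprop(x, g, s, t, lr=0.1, rho=0.9, eps=1e-8):
    # RMSProp: like AdaGrad, but with an exponential moving average
    # so old gradients decay instead of accumulating forever.
    s['E'] = rho * s.get('E', 0.0) + (1 - rho) * g * g
    return x - lr * g / (np.sqrt(s['E']) + eps)

def adam(x, g, s, t, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    # Adam: momentum-style first moment + RMSProp-style second moment,
    # both bias-corrected for the early steps.
    s['m'] = b1 * s.get('m', 0.0) + (1 - b1) * g
    s['v'] = b2 * s.get('v', 0.0) + (1 - b2) * g * g
    mhat = s['m'] / (1 - b1 ** t)
    vhat = s['v'] / (1 - b2 ** t)
    return x - lr * mhat / (np.sqrt(vhat) + eps)

for name, upd in [('SGD', sgd), ('Momentum', momentum),
                  ('AdaGrad', adagrad), ('RMSProp', rmsprop), ('Adam', adam)]:
    print(f"{name:8s} final x = {run(upd):+.6f}")
```

On this convex toy problem every rule drives x toward the minimum at 0; the differences between them (velocity smoothing, per-parameter adaptive step sizes, bias correction) only become decisive on noisy or badly-conditioned losses.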

Source

Keras optimizer comparison