Open jangsik-park opened 3 years ago
Keras optimizer comparison
1) What is an optimizer? (n411)
2) Comparing several optimizers (see the sketch after the list below)
[adam, sgd, sgd (with momentum), adagrad, rmsprop]
a. GD (gradient descent)
b. Stochastic gradient descent (SGD)
c. Momentum
d. AdaGrad
e. RMSProp
f. Adam
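Since the post is about comparing these optimizers in Keras, here is a minimal sketch of such a comparison. The model (a small MLP), the MNIST dataset, the learning rates, and the epoch/batch settings are my own illustrative assumptions, not from the original post; plain full-batch GD has no dedicated Keras class, so it is usually approximated with `SGD` and a large batch size.

```python
# Minimal sketch: train the same small model with each optimizer listed above
# and compare final training accuracy. Model/dataset/hyperparameters are
# illustrative assumptions, not from the original post.
import tensorflow as tf
from tensorflow import keras

(x_train, y_train), _ = keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0

def build_model():
    # Identical architecture for every run so only the optimizer differs.
    return keras.Sequential([
        keras.Input(shape=(784,)),
        keras.layers.Dense(128, activation="relu"),
        keras.layers.Dense(10, activation="softmax"),
    ])

optimizers = {
    "sgd": keras.optimizers.SGD(learning_rate=0.01),
    "sgd+momentum": keras.optimizers.SGD(learning_rate=0.01, momentum=0.9),
    "adagrad": keras.optimizers.Adagrad(learning_rate=0.01),
    "rmsprop": keras.optimizers.RMSprop(learning_rate=0.001),
    "adam": keras.optimizers.Adam(learning_rate=0.001),
}

for name, opt in optimizers.items():
    model = build_model()
    model.compile(optimizer=opt,
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    history = model.fit(x_train, y_train, epochs=3, batch_size=128, verbose=0)
    acc = history.history["accuracy"][-1]
    print(f"{name:>12}: final training accuracy = {acc:.4f}")
```

A per-epoch plot of the loss curves (e.g., with matplotlib) would make the convergence differences between the adaptive methods (AdaGrad, RMSProp, Adam) and plain SGD easier to see.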
Source
케라스 optimizer 비교 (Keras optimizer comparison)