Open oqrusk opened 6 years ago
The reason it is inconvenient for CostFunction to return the tuple (J, grad) is a difference in the specifications of fminunc and optimize.minimize.
From the Octave fminunc documentation: "GradObj: When set to "on", the function to be minimized must return a second argument which is the gradient, or first derivative, of the function at the point x. If set to "off" [default], the gradient is computed via finite differences."
optimize.minimize http://www.kamishima.net/mlmpyja/lr/optimization.html#id2
When using the BFGS algorithm, optimize.minimize needs the gradient passed separately via the jac argument:

jac = gradient vector

jac takes a function that returns the gradient (Jacobian) vector of the objective function fun, i.e., a vector whose elements are the first derivatives of the objective with respect to each input parameter.

This is what grad corresponds to.
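A minimal sketch of this split in scipy (the logistic-regression cost/grad pair and the toy data here are illustrative, not from the original exercise): instead of one function returning (J, grad), the cost and the gradient are passed as two separate callables.

```python
import numpy as np
from scipy import optimize

def cost(theta, X, y):
    # J(theta): logistic regression cross-entropy cost
    h = 1.0 / (1.0 + np.exp(-X @ theta))
    return -np.mean(y * np.log(h) + (1 - y) * np.log(1 - h))

def grad(theta, X, y):
    # gradient of J with respect to theta, one element per parameter
    h = 1.0 / (1.0 + np.exp(-X @ theta))
    return X.T @ (h - y) / len(y)

# toy (non-separable) data: first column is the bias term
X = np.array([[1.0, 0.5], [1.0, 1.5], [1.0, 2.5], [1.0, 3.5]])
y = np.array([0.0, 1.0, 0.0, 1.0])

# the gradient goes in via jac=, unlike fminunc's tuple return
res = optimize.minimize(cost, np.zeros(2), args=(X, y),
                        method='BFGS', jac=grad)
```

Note that scipy also accepts `jac=True`, in which case `fun` may return the `(J, grad)` tuple directly, fminunc-style; the separate-callable form above is the one described in the linked page.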