Everything seems to work for me with the Adam optimizer; however, when I try L-BFGS the output images are not stylized. The optimizer appears to stop after only one iteration, emitting various warnings (seemingly at random).
My machine has 64 GB of RAM and a GTX 1080 Ti.
Here are a few of the outputs I get with L-BFGS; any help would be greatly appreciated.
Machine precision = 2.220D-16
N = 874800 M = 10
This problem is unconstrained.
At X0 0 variables are exactly at the bounds
At iterate 0 f= 2.88465D+12 |proj g|= 1.09390D+07
* * *
Tit = total number of iterations
Tnf = total number of function evaluations
Tnint = total number of segments explored during Cauchy searches
Skip = number of BFGS updates skipped
Nact = number of active bounds at final generalized Cauchy point
Projg = norm of the final projected gradient
F = final function value
* * *
N Tit Tnf Tnint Skip Nact Projg F
***** 2 18 1 0 0 7.054D+06 2.124D+12
F = 2123653447680.00
CONVERGENCE: REL_REDUCTION_OF_F_<=_FACTR*EPSMCH
Warning: more than 10 function and gradient
evaluations in the last line search. Termination
may possibly be caused by a bad search direction.
Cauchy time 0.000E+00 seconds.
Subspace minimization time 0.000E+00 seconds.
Line search time 0.000E+00 seconds.
Total User time 0.000E+00 seconds.
==========================================
Machine precision = 2.220D-16
N = 874800 M = 10
This problem is unconstrained.
At X0 0 variables are exactly at the bounds
At iterate 0 f= 2.88465D+12 |proj g|= 1.09390D+07
Nonpositive definiteness in Cholesky factorization in formt;
refresh the lbfgs memory and restart the iteration.
ys= 0.000E+00 -gs= 0.000E+00 BFGS update SKIPPED
* * *
Tit = total number of iterations
Tnf = total number of function evaluations
Tnint = total number of segments explored during Cauchy searches
Skip = number of BFGS updates skipped
Nact = number of active bounds at final generalized Cauchy point
Projg = norm of the final projected gradient
F = final function value
* * *
N Tit Tnf Tnint Skip Nact Projg F
***** 6 15 3 1 0 6.814D+06 2.102D+12
F = 2102054223872.00
CONVERGENCE: REL_REDUCTION_OF_F_<=_FACTR*EPSMCH
Cauchy time 0.000E+00 seconds.
Subspace minimization time 0.000E+00 seconds.
Line search time 0.000E+00 seconds.
Total User time 0.000E+00 seconds.
Frame 1 elapsed time: 10.81877875328064
==========================================
Machine precision = 2.220D-16
N = 874800 M = 10
This problem is unconstrained.
At X0 0 variables are exactly at the bounds
At iterate 0 f= 2.88465D+12 |proj g|= 1.09390D+07
ascent direction in projection gd = 13099410.6106499
Bad direction in the line search;
refresh the lbfgs memory and restart the iteration.
* * *
Tit = total number of iterations
Tnf = total number of function evaluations
Tnint = total number of segments explored during Cauchy searches
Skip = number of BFGS updates skipped
Nact = number of active bounds at final generalized Cauchy point
Projg = norm of the final projected gradient
F = final function value
* * *
N Tit Tnf Tnint Skip Nact Projg F
***** 2 9 2 0 0 6.981D+06 2.122D+12
F = 2122045587456.00
CONVERGENCE: REL_REDUCTION_OF_F_<=_FACTR*EPSMCH
Cauchy time 0.000E+00 seconds.
Subspace minimization time 0.000E+00 seconds.
Line search time 0.000E+00 seconds.
Total User time 0.000E+00 seconds.
Frame 1 elapsed time: 10.421173810958862
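For what it's worth, the format of these logs matches SciPy's L-BFGS-B wrapper, and the stopping message "REL_REDUCTION_OF_F_<=_FACTR*EPSMCH" means the relative decrease in the loss dropped below factr * machine epsilon. With a loss on the order of 1e12 (as in the runs above), the default factr = 1e7 makes that threshold very easy to satisfy, so the optimizer can declare convergence after only a couple of iterations. The sketch below is just my guess at a diagnostic, assuming the port calls SciPy's fmin_l_bfgs_b; the quadratic loss is a stand-in for the real style-transfer loss, not code from this project.

```python
# Hypothetical sketch (assumption: the port uses scipy.optimize.fmin_l_bfgs_b).
# A large-scale loss (~1e12, as in the logs) trips the default factr=1e7
# stopping rule almost immediately; tightening factr and pgtol lets
# L-BFGS-B keep iterating. The stand-in loss below mimics that scale.
import numpy as np
from scipy.optimize import fmin_l_bfgs_b

def loss_and_grad(x):
    # Stand-in loss with a ~1e12 scale, like the F values in the logs.
    f = 1e12 * np.sum((x - 3.0) ** 2)
    g = 1e12 * 2.0 * (x - 3.0)
    return f, g

x0 = np.zeros(5)
x_opt, f_opt, info = fmin_l_bfgs_b(
    loss_and_grad, x0,
    factr=10.0,   # far stricter than the default 1e7
    pgtol=1e-8,   # projected-gradient tolerance
    maxiter=500,
)
print(info["warnflag"], f_opt)
```

If this is the cause, the fix would be exposing factr (or ftol when going through scipy.optimize.minimize) as a parameter rather than relying on the default.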
Many thanks for porting this to TensorFlow.