josejimenezluna / pyGPGO

Bayesian optimization for Python
http://pygpgo.readthedocs.io
MIT License

leading minor of the array is not positive definite #21

Closed · xiaohongniua closed this issue 5 years ago

xiaohongniua commented 5 years ago

I got the following error:

```
numpy.linalg.linalg.LinAlgError: 9-th leading minor of the array is not positive definite
```

I used the same code yesterday and it worked well, but the error appears after I changed the parameter boundaries.

Old params:

```python
param = {
    'layer_1_size': ('int', [992, 1056]),
    'layer_2_size': ('int', [992, 1056]),
    'p_dropout': ('cont', [0.30, 0.70]),
    'learning_rate': ('cont', [0.200, 0.400]),
    'weight_decay': ('cont', [0.001, 0.010]),
}
```

New params:

```python
param = {
    'layer_1_size': ('int', [1023, 1025]),
    'layer_2_size': ('int', [1023, 1025]),
    'p_dropout': ('cont', [0.45, 0.55]),
    'learning_rate': ('cont', [0.250, 0.350]),
    'weight_decay': ('cont', [0.001, 0.003]),
}
```

josejimenezluna commented 5 years ago

Hello @xiaohongniua,

One thing you can do is to optimize over the logs of both learning_rate and weight_decay rather than the raw values. It should make things more numerically stable.
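
For reference, a minimal sketch of what that could look like with pyGPGO, assuming the usual GaussianProcess / Expected Improvement setup; the objective `train_net`, its dummy return value, and the exact exponent ranges below are placeholders, not code from this thread:

```python
from pyGPGO.covfunc import squaredExponential
from pyGPGO.surrogates.GaussianProcess import GaussianProcess
from pyGPGO.acquisition import Acquisition
from pyGPGO.GPGO import GPGO

def train_net(layer_1_size, layer_2_size, p_dropout, lr_exp, wd_exp):
    # The optimizer searches over the exponents; recover the actual values here.
    learning_rate = 10.0 ** -lr_exp
    weight_decay = 10.0 ** -wd_exp
    # ... build the network, train it, and compute a validation score ...
    return -(learning_rate - 0.3) ** 2 - (weight_decay - 0.002) ** 2  # dummy score

param = {
    'layer_1_size': ('int', [992, 1056]),
    'layer_2_size': ('int', [992, 1056]),
    'p_dropout': ('cont', [0.45, 0.55]),
    'lr_exp': ('cont', [0.45, 0.61]),  # 10**-x covers roughly [0.25, 0.35]
    'wd_exp': ('cont', [2.5, 3.0]),    # 10**-x covers roughly [0.001, 0.003]
}

gp = GaussianProcess(squaredExponential())
acq = Acquisition(mode='ExpectedImprovement')
gpgo = GPGO(gp, acq, train_net, param)
gpgo.run(max_iter=10)
```

Searching over exponents spreads the very narrow weight_decay range over a wider scale, which tends to leave the GP covariance matrix better conditioned.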

xiaohongniua commented 5 years ago

Thanks for your reply. Like this?

```python
optimizer = optim.SGD(model.parameters(), lr=10 ** -x, weight_decay=10 ** -y)

param = {
    'layer_1_size': ('int', [1023, 1025]),
    'layer_2_size': ('int', [1023, 1025]),
    'p_dropout': ('cont', [0.45, 0.55]),
    'learning_rate': ('cont', [2, 3]),
    'weight_decay': ('cont', [3, 4]),
}
```
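
As an aside, a sketch of how the values sampled by pyGPGO could be wired into the optimizer inside the objective function; the network below and its input size of 1024 are only placeholders, and the training loop is omitted:

```python
import torch.nn as nn
import torch.optim as optim

def objective(layer_1_size, layer_2_size, p_dropout, learning_rate, weight_decay):
    # pyGPGO passes the sampled values as keyword arguments named after the
    # keys of the param dict; here learning_rate / weight_decay are exponents.
    model = nn.Sequential(                       # placeholder network
        nn.Linear(1024, int(layer_1_size)),
        nn.Dropout(p_dropout),
        nn.ReLU(),
        nn.Linear(int(layer_1_size), int(layer_2_size)),
    )
    optimizer = optim.SGD(
        model.parameters(),
        lr=10 ** -learning_rate,           # exponent in [2, 3] -> lr in [0.001, 0.01]
        weight_decay=10 ** -weight_decay,  # exponent in [3, 4] -> wd in [0.0001, 0.001]
    )
    # ... training and validation would go here ...
    return 0.0  # validation score for pyGPGO to maximize
```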

josejimenezluna commented 5 years ago

Yes, let me know if you encounter any other issues.

xiaohongniua commented 5 years ago

Many thanks!