bayesian-optimization / BayesianOptimization

A Python implementation of global optimization with Gaussian processes.
https://bayesian-optimization.github.io/BayesianOptimization/index.html
MIT License

Why is the same point generated by optimizer.suggest() with different values of kappa? #404

Closed Kertin closed 1 year ago

Kertin commented 1 year ago

Hi guys, I followed section 1, "Suggest-Evaluate-Register Paradigm", of the "Advanced tour of the Bayesian Optimization package" notebook, https://github.com/fmfn/BayesianOptimization/blob/master/examples/advanced-tour.ipynb. As I expected, I got different suggested points in case 1 (see below). In case 2 (see below), where I loaded my own data file, the suggested points I got were always the same. Why is the behavior different in these two cases? Thanks for your help! Here is my code:

from bayes_opt import BayesianOptimization
import numpy as np
import sys

data=np.loadtxt(sys.argv[1], dtype=float,delimiter=' ')
c1_arr=data[:,0]
c2_arr=data[:,1]
c3_arr=data[:,2]
c4_arr=data[:,3]
c5_arr=data[:,4]
m_arr=data[:,5]

x=[-2.0,-1.0,0]
y=[-3.0,-1.0,0]
t=[-19.0,-4.0,0.0]

optimizer = BayesianOptimization(
    f=None,
    pbounds={'x': (-2, 2), 'y': (-3, 3)}, # uncomment in case 1, comment in case 2
    #pbounds={'c1':(0,1),'c2':(0,1),'c3':(0,1),'c4':(0,1)}, # uncomment in case 2, comment in case 1
    verbose=2,
    random_state=1,
)

from bayes_opt import UtilityFunction

utility = UtilityFunction(kind="ucb", kappa=float(sys.argv[2]), xi=0.0)
next_point = optimizer.suggest(utility)
print(next_point)
for i in range(3):
    next_point = {'x': x[i], 'y': y[i]} # uncomment in case 1, comment in case 2
    target = t[i] # uncomment in case 1, comment in case 2
    #next_point = {'c1': c1_arr[i], 'c2': c2_arr[i], 'c3': c3_arr[i], 'c4': c4_arr[i]} # uncomment in case 2, comment in case 1
    #target = m_arr[i] # uncomment in case 2, comment in case 1
    optimizer.register(params=next_point, target=target)
    print(target, next_point)
next_point = optimizer.suggest(utility)
print(next_point)

Here are the two cases:

case 1:

command 1:
python3 tmp.py ucb.data 1

output 1:
{'x': -0.331911981189704, 'y': 1.3219469606529488}
-19.0 {'x': -2.0, 'y': -3.0}
-4.0 {'x': -1.0, 'y': -1.0}
0.0 {'x': 0, 'y': 0}
{'x': -0.9132099073572701, 'y': 0.8401605121148267}

command 2:
python3 tmp.py ucb.data 5

output 2:
{'x': -0.331911981189704, 'y': 1.3219469606529488}
-19.0 {'x': -2.0, 'y': -3.0}
-4.0 {'x': -1.0, 'y': -1.0}
0.0 {'x': 0, 'y': 0}
{'x': -1.8706715477532216, 'y': 2.369551303030108}

command 3:
python3 tmp.py ucb.data 10

output 3:
{'x': -0.331911981189704, 'y': 1.3219469606529488}
-19.0 {'x': -2.0, 'y': -3.0}
-4.0 {'x': -1.0, 'y': -1.0}
0.0 {'x': 0, 'y': 0}
{'x': -2.0, 'y': 3.0}

case 2:

command 1:
python3 tmp.py ucb.data 1

output 1:
{'c1': 0.417022004702574, 'c2': 0.7203244934421581, 'c3': 0.00011437481734488664, 'c4': 0.30233257263183977}
19.240423468414097 {'c1': 0.0479718, 'c2': 0.0703287, 'c3': 0.220258, 'c4': 0.626224}
20.49860803932102 {'c1': 0.064742, 'c2': 0.0902062, 'c3': 0.37397, 'c4': 0.255238}
20.864252498791814 {'c1': 0.0618858, 'c2': 0.160758, 'c3': 0.247392, 'c4': 0.425238}
{'c1': 0.00771985598412106, 'c2': 0.1618609342126962, 'c3': 0.7093690341699984, 'c4': 0.7647368589916009}

command 2:
python3 tmp.py ucb.data 5

output 2:
{'c1': 0.417022004702574, 'c2': 0.7203244934421581, 'c3': 0.00011437481734488664, 'c4': 0.30233257263183977}
19.240423468414097 {'c1': 0.0479718, 'c2': 0.0703287, 'c3': 0.220258, 'c4': 0.626224}
20.49860803932102 {'c1': 0.064742, 'c2': 0.0902062, 'c3': 0.37397, 'c4': 0.255238}
20.864252498791814 {'c1': 0.0618858, 'c2': 0.160758, 'c3': 0.247392, 'c4': 0.425238}
{'c1': 0.00771985598412106, 'c2': 0.1618609342126962, 'c3': 0.7093690341699984, 'c4': 0.7647368589916009}

command 3:
python3 tmp.py ucb.data 10

output 3:
{'c1': 0.417022004702574, 'c2': 0.7203244934421581, 'c3': 0.00011437481734488664, 'c4': 0.30233257263183977}
19.240423468414097 {'c1': 0.0479718, 'c2': 0.0703287, 'c3': 0.220258, 'c4': 0.626224}
20.49860803932102 {'c1': 0.064742, 'c2': 0.0902062, 'c3': 0.37397, 'c4': 0.255238}
20.864252498791814 {'c1': 0.0618858, 'c2': 0.160758, 'c3': 0.247392, 'c4': 0.425238}
{'c1': 0.00771985598412106, 'c2': 0.1618609342126962, 'c3': 0.7093690341699984, 'c4': 0.7647368589916009}

Here is the content of my data file:

0.0479718 0.0703287 0.220258 0.626224 0.0352174 19.240423468414097
0.064742 0.0902062 0.37397 0.255238 0.215845 20.49860803932102
0.0618858 0.160758 0.247392 0.425238 0.104726 20.864252498791814
0.0360865 0.249096 0.121834 0.41584 0.177143 19.974078332267265
0.0313427 0.603402 0.0460963 0.203438 0.11572 19.400004284735683
0.150024 0.0721107 0.0765948 0.0654681 0.635802 20.70922643742188
0.148392 0.0922901 0.391548 0.261696 0.106074 20.180272416364566
0.101979 0.227968 0.0889244 0.471759 0.10937 21.245069536720766
0.198051 0.46969 0.0151316 0.0807954 0.236332 21.038253339089763
0.140563 0.466464 0.314725 0.052829 0.0254198 21.035286502114463

Environment:
OS: Arch Linux
python Version 3.10.10
numpy Version 1.24.2
scipy Version 1.10.1
bayesian-optimization Version 1.4.2

bwheelz36 commented 1 year ago

Hi @Kertin, kappa controls how exploratory the algorithm is, as described in this notebook, so I would certainly expect different behavior with different kappa values.
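
With the UCB utility the acquisition is roughly mu(x) + kappa * sigma(x), so a larger kappa weights the GP's predictive uncertainty more heavily and the suggestion drifts toward less explored regions. A minimal sketch of that effect, assuming the same suggest/register API you already use in your script:

from bayes_opt import BayesianOptimization, UtilityFunction

optimizer = BayesianOptimization(
    f=None,
    pbounds={'x': (-2, 2), 'y': (-3, 3)},
    verbose=0,
    random_state=1,
)

# Register the same three observations as in your case 1.
for params, target in [({'x': -2.0, 'y': -3.0}, -19.0),
                       ({'x': -1.0, 'y': -1.0}, -4.0),
                       ({'x': 0.0, 'y': 0.0}, 0.0)]:
    optimizer.register(params=params, target=target)

# Compare the suggested point for increasingly exploratory kappa values.
for kappa in (1.0, 5.0, 10.0):
    utility = UtilityFunction(kind="ucb", kappa=kappa, xi=0.0)
    print(kappa, optimizer.suggest(utility))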

I'm struggling to understand the problem. You say that in case 2, you always got the same point to probe but that doesn't seem to be the case? Maybe you can plot the data instead of pasting it so I don't have to try and diagnose the problem from pure text :-)
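
Something along these lines would do, as a sketch; it assumes your file is whitespace-separated with the five parameter columns first and the target in the last column, as pasted above:

import sys
import numpy as np
import matplotlib.pyplot as plt

data = np.loadtxt(sys.argv[1], dtype=float)
params, target = data[:, :-1], data[:, -1]

# One scatter panel per parameter column against the target value.
fig, axes = plt.subplots(1, params.shape[1], figsize=(15, 3), sharey=True)
for i, ax in enumerate(axes):
    ax.scatter(params[:, i], target)
    ax.set_xlabel(f'c{i + 1}')
axes[0].set_ylabel('target')
plt.tight_layout()
plt.show()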

till-m commented 1 year ago

I will close this for now. Feel free to reopen with more information and a runnable script.

CConory commented 5 months ago

I also hit the same issue! No matter how much data I register with the optimizer, the suggestion is always the same, just like the output of commands 1, 2 and 3 above.

till-m commented 5 months ago

@CConory if you have a runnable script, please open a separate issue and we can have a look at your problem.