martinjankowiak / saasbo

SAASBO: a package for high-dimensional Bayesian optimization

How to solve a problem with constraints? #4

Open jacktang opened 4 months ago

jacktang commented 4 months ago

Hello, I modified the branin100 problem to add a constraint, as below:

import math
import numpy as np

lb = np.hstack((-5 * np.ones(50), 0 * np.ones(50)))   # lower bounds for input domain
ub = np.hstack((10 * np.ones(50), 30 * np.ones(50)))  # upper bounds for input domain

def branin100_with_constraints(x):
    assert (x <= ub).all() and (x >= lb).all()
    cv = x[(x > 3)]            # entries that enter the constraint
    cv_mean = np.mean(cv)
    print("cv-mean=", cv_mean)
    if cv_mean > 12.5:         # hard rejection when the constraint is violated
        return 1e5

    x1 = np.mean(x[0:19])
    x2 = np.min(x[19:64])
    x3 = np.max(x[64:80])
    t1 = x2 - 5.1 / (4 * math.pi ** 2) * x1 ** 2 + 5 / math.pi * x1 - 6
    t2 = 10 * (1 - 1 / (8 * math.pi)) * np.cos(x1) + np.sin(x3)
    return t1 ** 2 + t2 + 10

The constraint is np.mean(x[x > 3]) <= 12.5 (the function returns 1e5 when it is violated), and I found that saasbo can't find a feasible solution. I also ran the problem with NSGA-II, which returns an optimized result (4.68769846). So can you give some advice on solving problems with constraints? Thanks!

martinjankowiak commented 4 months ago

The easiest thing to do is to add a soft constraint via a differentiable function, something like:

margin = cv_mean - 12.5
penalty = - torch.where(margin > 0, 100 * torch.exp(-1 / margin), 0.0)
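
For concreteness, here is a plain-numpy sketch of the same idea (soft_penalty and its arguments are illustrative names, not part of saasbo). Since cv_mean is a numpy scalar in the code above, a simple branch sidesteps torch entirely, and the penalty is kept non-negative so that adding it to the minimized objective, as in the follow-up below, mimics the hard return 1e5:

import numpy as np

def soft_penalty(cv_mean, threshold=12.5, scale=100.0):
    # smooth ramp: exactly 0 while the constraint holds, then rising
    # from ~0 just past the boundary toward `scale` for large violations
    margin = cv_mean - threshold
    return scale * np.exp(-1.0 / margin) if margin > 0 else 0.0
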
jacktang commented 4 months ago

@martinjankowiak I don't quite understand your remark. Do you mean adding the penalty as a soft constraint to the objective function, like below?

def branin100_with_constraints(x):
    ....
    penalty = torch.where(margin > 0, 100 * torch.exp(-1 / margin), 0.0)
    return t1 ** 2 + t2 + 10 + torch.mean(penalty)
martinjankowiak commented 4 months ago

Yes, although I'm not sure about that mean you added. This is a direct analog of your

    if cv_mean > 12.5:
        return 1e5

but it is differentiable, whereas your intervention is not.
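
Putting the two comments together, a minimal end-to-end sketch of the soft-constrained objective could look like this (the function name is illustrative; the indexing and the penalty scale of 100 are taken from the snippets above):

import math
import numpy as np

lb = np.hstack((-5 * np.ones(50), 0 * np.ones(50)))   # lower bounds for input domain
ub = np.hstack((10 * np.ones(50), 30 * np.ones(50)))  # upper bounds for input domain

def branin100_soft(x):
    assert (x <= ub).all() and (x >= lb).all()
    cv_mean = np.mean(x[x > 3])
    margin = cv_mean - 12.5
    # smooth penalty replacing the hard `return 1e5`: 0 when feasible,
    # ramping toward 100 as the violation grows
    penalty = 100 * np.exp(-1.0 / margin) if margin > 0 else 0.0

    x1 = np.mean(x[0:19])
    x2 = np.min(x[19:64])
    x3 = np.max(x[64:80])
    t1 = x2 - 5.1 / (4 * math.pi ** 2) * x1 ** 2 + 5 / math.pi * x1 - 6
    t2 = 10 * (1 - 1 / (8 * math.pi)) * np.cos(x1) + np.sin(x3)
    return t1 ** 2 + t2 + 10 + penalty
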