Closed wxyzjp123 closed 12 months ago
Yes, you can! But keep in mind that most optimization problems behind ML models are formulated so they can be solved efficiently (and are, in fact, single-objective). However, you can of course treat them as black-box problems as well. The example below illustrates this for Ridge Regression.
```python
import numpy as np
from sklearn import linear_model

from pymoo.algorithms.soo.nonconvex.de import DE
from pymoo.core.problem import ElementwiseProblem
from pymoo.operators.sampling.lhs import LHS
from pymoo.optimize import minimize

# a small toy data set
A = np.array([[0, 0], [0, 0], [1, 1]])
b = np.array([0, .1, 1])
alpha = 0.5

# reference solution from scikit-learn's closed-form solver
reg = linear_model.Ridge(alpha=alpha)
reg.fit(A, b)
print(reg.intercept_, reg.coef_)


class RidgeRegression(ElementwiseProblem):

    def __init__(self):
        super().__init__(n_var=3, n_obj=1, xl=-10, xu=+10)

    def _evaluate(self, x, out, *args, **kwargs):
        coef_ = x[:2]
        intercept_ = x[2]
        y_hat = A @ coef_ + intercept_

        # store the decoded solution for convenient access later
        out["coef"] = coef_
        out["intercept"] = intercept_

        # ridge loss: squared error plus L2 penalty on the coefficients
        out["F"] = ((b - y_hat) ** 2).sum() + alpha * (coef_ ** 2).sum()


problem = RidgeRegression()

algorithm = DE(pop_size=100,
               sampling=LHS(),
               variant="DE/rand/1/bin",
               CR=0.3,
               dither="vector",
               jitter=False)

res = minimize(problem, algorithm, seed=1, verbose=False)

sol = res.opt[0]
print(sol.get("intercept"), sol.get("coef"))
```
A whole other topic is optimizing the hyperparameters of ML models. Please search for hyperparameter optimization to get more information about this.
PS: Please open a discussion and not an issue, as this is nothing that needs to be added as a feature or fixed.
Hi author, I'm a newbie to pymoo and would like to ask if there are any simple code examples that use a multi-objective optimization algorithm (such as NSGA-II) to update the coefficient vectors of a machine learning model (such as logistic regression or SVM). Thank you very much!