numericalalgorithmsgroup / pybobyqa

Python-based Derivative-Free Optimization with Bound Constraints
https://numericalalgorithmsgroup.github.io/pybobyqa/
GNU General Public License v3.0

Returning an optimization trajectory #27

Closed by jungtaekkim 1 year ago

jungtaekkim commented 1 year ago

Hi,

Thank you for developing a valuable package.

I would like to ask if there is a way to return an optimization trajectory.

I found that it is possible to log the optimization progress, as described here (https://numericalalgorithmsgroup.github.io/pybobyqa/build/html/userguide.html#adding-bounds-and-more-output), but I would like to have the optimization progress included in the returned result object.

Thank you in advance,

Best regards, Jungtaek.

lindonroberts commented 1 year ago

Thanks Jungtaek! The closest built-in functionality is to use:

user_params = {'logging.save_diagnostic_info': True, 'logging.save_xk': True}
soln = pybobyqa.solve(..., user_params=user_params)

print(soln.diagnostic_info)  # Pandas dataframe with columns xk and fk

In the output dataframe, the column xk holds the best input x found so far (recorded at the end of each iteration) and the column fk holds the corresponding objective value.
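For example, once the solver has finished you could pull that per-iteration trajectory out of the dataframe along these lines (a minimal sketch; it only assumes the xk and fk columns described above, plus numpy/pandas):

import numpy as np

# After soln = pybobyqa.solve(..., user_params=user_params) as above
df = soln.diagnostic_info

# Best point at the end of each iteration, stacked into an array,
# together with the matching objective values
xk_trajectory = np.array([np.asarray(xk) for xk in df['xk']])
fk_trajectory = df['fk'].to_numpy()

print(xk_trajectory.shape)  # (number of iterations, problem dimension)
print(fk_trajectory)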

If you want a record of every evaluation (not just the best point so far), then I suggest a workaround like the following, where you wrap your objective function in a class that saves every point and objective value it is called with:

class Objfun(object):
    def __init__(self, f):
        self.function = f
        self.xs = []  # every point at which the solver evaluates the objective
        self.fs = []  # the corresponding objective values

    def __call__(self, x):  # note: __call__ (not __eval__), so instances are callable
        fval = self.function(x)
        self.xs.append(x)
        self.fs.append(fval)
        return fval

# Assuming you have your function f(x) defined already
objfun = Objfun(f)
soln = pybobyqa.solve(objfun, ...)
print(objfun.xs)
print(objfun.fs)
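As a rough end-to-end sketch of how the wrapper fits together (the quadratic objective, starting point and bounds below are just illustrative, not anything specific to your problem):

import numpy as np
import pybobyqa

def f(x):
    # Simple smooth test objective, purely for illustration
    return np.sum((x - 1.0) ** 2)

objfun = Objfun(f)
x0 = np.array([3.0, -2.0])
lower = np.array([-5.0, -5.0])
upper = np.array([5.0, 5.0])

soln = pybobyqa.solve(objfun, x0, bounds=(lower, upper))

# Full evaluation history collected by the wrapper
xs = np.array(objfun.xs)
fs = np.array(objfun.fs)
print(xs.shape, fs.shape)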
jungtaekkim commented 1 year ago

Thank you!

That is a brilliant solution.

I am going to close this issue.