
OpenBox Documentation | OpenBox Chinese Documentation | Chinese README

OpenBox: Generalized and Efficient Blackbox Optimization System

OpenBox is an efficient and generalized blackbox optimization (BBO) system that supports the following features: 1) BBO with multiple objectives and constraints, 2) BBO with transfer learning, 3) BBO with distributed parallelization, 4) BBO with multi-fidelity acceleration, and 5) BBO with early stopping. OpenBox is designed and developed by the AutoML team from the DAIR Lab at Peking University. Its goal is to make blackbox optimization easier to apply in both industry and academia, and to facilitate data science.

Software Artifacts

Standalone Python package.

Users can install the released package and use it with Python.

Distributed BBO service.

We adopt the "BBO as a service" paradigm and implement OpenBox as a managed, general-purpose service for black-box optimization. Users can conveniently access the service via a REST API without worrying about environment setup, software maintenance, programming, or execution optimization. We also provide a Web UI through which users can easily track and manage their tasks.
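
To illustrate the service workflow, the sketch below shows a typical suggest–observe loop over HTTP. The endpoint paths, payload fields, and service address used here are hypothetical placeholders for illustration only; the actual REST interface is described in the service documentation.

import requests

SERVICE_URL = 'http://127.0.0.1:8000'   # hypothetical service address

def evaluate(config):
    # user-side objective evaluation (placeholder problem)
    return (config['x1'] - 2.0) ** 2

# register a task (hypothetical endpoint and payload)
task = requests.post(f'{SERVICE_URL}/task/create', json={
    'task_name': 'demo',
    'config_space': {'x1': {'type': 'real', 'bounds': [-5.0, 10.0]}},
}).json()

for _ in range(50):
    # ask the service for the next configuration to try
    suggestion = requests.post(f'{SERVICE_URL}/task/suggest',
                               json={'task_id': task['task_id']}).json()
    objective = evaluate(suggestion['config'])
    # report the observation back to the service
    requests.post(f'{SERVICE_URL}/task/observe',
                  json={'task_id': task['task_id'],
                        'config': suggestion['config'],
                        'objectives': [objective]})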

Design Goal

OpenBox is designed according to the following principles:

Links

News

OpenBox Capabilities at a Glance

Built-in Optimization Components
  • Surrogate Model
    • Gaussian Process
    • TPE
    • Probabilistic Random Forest
    • LightGBM
  • Acquisition Function
    • EI
    • PI
    • UCB
    • MES
    • EHVI
    • TS
  • Acquisition Optimizer
    • Random Search
    • Local Search
    • Interleaved RS and LS
    • Differential Evolution
    • L-BFGS-B

Optimization Algorithms
  • Bayesian Optimization
    • GP-based BO
    • SMAC
    • TPE
    • LineBO
    • SafeOpt
  • Multi-fidelity Optimization
    • Hyperband
    • BOHB
    • MFES-HB
  • Evolutionary Algorithms
    • Surrogate-assisted EA
    • Regularized EA
    • Adaptive EA
    • Differential EA
    • NSGA-II
  • Others
    • Anneal
    • PSO
    • Random Search

Optimization Services
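
Many of the components listed above can be selected directly when constructing an Optimizer. The sketch below assumes the surrogate_type, acq_type, and acq_optimizer_type keyword arguments of recent OpenBox releases; the supported values may differ between versions, so please check the documentation of your installed version.

from openbox import Optimizer, space as sp

space = sp.Space()
space.add_variables([sp.Real("x", -5.0, 10.0)])

def sphere(config):
    return {'objectives': [config['x'] ** 2]}

if __name__ == '__main__':
    opt = Optimizer(
        sphere, space, max_runs=30, task_id='component_demo',
        surrogate_type='gp',       # Gaussian Process surrogate ('prf' and 'lightgbm' are alternatives)
        acq_type='ei',             # Expected Improvement acquisition function
        acq_optimizer_type='local_random',  # interleaved random and local search over the acquisition
    )
    history = opt.run()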

Installation

System Requirements

Installation Requirements:

Supported Systems:

We strongly suggest creating a Python environment via Anaconda:

conda create -n openbox python=3.8
conda activate openbox

Then we recommend updating your pip, setuptools, and wheel as follows:

pip install --upgrade pip setuptools wheel

Installation from PyPI

To install OpenBox from PyPI:

pip install openbox

For advanced features, install SWIG first and then run pip install "openbox[extra]".
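
If SWIG is not already available on your system, one way to obtain it (assuming the Anaconda environment created above) is:

conda install swig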

Manual Installation from Source

To install the newest OpenBox from the source code, please run the following commands:

git clone https://github.com/PKU-DAIR/open-box.git && cd open-box
pip install .

Also, for advanced features, install SWIG first and then run pip install ".[extra]".

For more details about installation instructions, please refer to the Installation Guide.

Quick Start

A quick start example is given by:

import numpy as np
from openbox import Optimizer, space as sp

# Define Search Space
space = sp.Space()
x1 = sp.Real("x1", -5, 10, default_value=0)
x2 = sp.Real("x2", 0, 15, default_value=0)
space.add_variables([x1, x2])

# Define Objective Function
def branin(config):
    x1, x2 = config['x1'], config['x2']
    y = (x2-5.1/(4*np.pi**2)*x1**2+5/np.pi*x1-6)**2+10*(1-1/(8*np.pi))*np.cos(x1)+10
    return {'objectives': [y]}

# Run
if __name__ == '__main__':
    opt = Optimizer(branin, space, max_runs=50, task_id='quick_start')
    history = opt.run()
    print(history)
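
Continuing the example above, the returned history can be inspected further. For instance, recent OpenBox versions support plotting the convergence curve (matplotlib is required); 0.397887 is the known optimum of the Branin function:

import matplotlib.pyplot as plt

history.plot_convergence(true_minimum=0.397887)  # known optimum of the Branin function
plt.show()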

An example with multiple objectives and constraints is as follows:

import matplotlib.pyplot as plt
from openbox import Optimizer, space as sp

# Define Search Space
space = sp.Space()
x1 = sp.Real("x1", 0.1, 10.0)
x2 = sp.Real("x2", 0.0, 5.0)
space.add_variables([x1, x2])

# Define Objective Function
def CONSTR(config):
    x1, x2 = config['x1'], config['x2']
    y1, y2 = x1, (1.0 + x2) / x1
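    # OpenBox treats non-positive constraint values (<= 0) as feasible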
    c1, c2 = 6.0 - 9.0 * x1 - x2, 1.0 - 9.0 * x1 + x2
    return dict(objectives=[y1, y2], constraints=[c1, c2])

# Run
if __name__ == "__main__":
    opt = Optimizer(CONSTR, space, num_objectives=2, num_constraints=2,
                    max_runs=50, ref_point=[10.0, 10.0], task_id='moc')
    history = opt.run()
    history.plot_pareto_front()  # plot for 2 or 3 objectives
    plt.show()

We also provide HTML visualization. Enable it by setting the additional option visualization='basic' or visualization='advanced', and optionally auto_open_html=True, in the Optimizer:

opt = Optimizer(...,
    visualization='advanced',  # or 'basic'. For 'advanced', run 'pip install "openbox[extra]"' first
    auto_open_html=True,       # open the visualization page in your browser automatically
)
history = opt.run()

For more visualization details, please refer to HTML Visualization.

More Examples:

Enterprise Users

Tencent · Alibaba · Kuaishou

Contributing

OpenBox has a frequent release cycle. Please let us know if you encounter a bug by filing an issue.

We appreciate all contributions. If you are planning to contribute any bug-fixes, please create a pull request.

If you plan to contribute new features, new modules, etc., please first open a new issue or reuse an existing one, and discuss the feature with us.

To learn more about making a contribution to OpenBox, please refer to our How-to-Contribute page.

We appreciate all contributions and thank all the contributors!

Feedback

Related Projects

Aiming at openness and the advancement of the AutoML ecosystem, we have also released a few other open-source projects.

Related Publications

OpenBox: A Python Toolkit for Generalized Black-box Optimization.

Huaijun Jiang, Yu Shen, Yang Li, Beicheng Xu, Sixian Du, Wentao Zhang, Ce Zhang, Bin Cui; JMLR 2024, CCF-A. [paper] [arxiv]

OpenBox: A Generalized Black-box Optimization Service.

Yang Li, Yu Shen, Wentao Zhang, Yuanwei Chen, Huaijun Jiang, Mingchao Liu, Jiawei Jiang, Jinyang Gao, Wentao Wu, Zhi Yang, Ce Zhang, Bin Cui; KDD 2021, CCF-A. [paper] [arxiv]

MFES-HB: Efficient Hyperband with Multi-Fidelity Quality Measurements.

Yang Li, Yu Shen, Jiawei Jiang, Jinyang Gao, Ce Zhang, Bin Cui; AAAI 2021, CCF-A. [paper] [arxiv]

Transfer Learning based Search Space Design for Hyperparameter Tuning.

Yang Li, Yu Shen, Huaijun Jiang, Tianyi Bai, Wentao Zhang, Ce Zhang, Bin Cui; KDD 2022, CCF-A. [paper] [arxiv]

TransBO: Hyperparameter Optimization via Two-Phase Transfer Learning.

Yang Li, Yu Shen, Huaijun Jiang, Wentao Zhang, Zhi Yang, Ce Zhang, Bin Cui; KDD 2022, CCF-A. [paper] [arxiv]

PaSca: a Graph Neural Architecture Search System under the Scalable Paradigm.

Wentao Zhang, Yu Shen, Zheyu Lin, Yang Li, Xiaosen Li, Wen Ouyang, Yangyu Tao, Zhi Yang, and Bin Cui; WWW 2022, CCF-A, 🏆 Best Student Paper Award. [paper] [arxiv]

Hyper-Tune: Towards Efficient Hyper-parameter Tuning at Scale.

Yang Li, Yu Shen, Huaijun Jiang, Wentao Zhang, Jixiang Li, Ji Liu, Ce Zhang, Bin Cui; VLDB 2022, CCF-A. [paper] [arxiv]

Citing

If you use OpenBox, please consider citing the following articles:

@inproceedings{li2021openbox,
  title={Openbox: A generalized black-box optimization service},
  author={Li, Yang and Shen, Yu and Zhang, Wentao and Chen, Yuanwei and Jiang, Huaijun and Liu, Mingchao and Jiang, Jiawei and Gao, Jinyang and Wu, Wentao and Yang, Zhi and others},
  booktitle={Proceedings of the 27th ACM SIGKDD conference on knowledge discovery \& data mining},
  pages={3209--3219},
  year={2021}
}

@article{JMLR:v25:23-0537,
  author  = {Huaijun Jiang and Yu Shen and Yang Li and Beicheng Xu and Sixian Du and Wentao Zhang and Ce Zhang and Bin Cui},
  title   = {OpenBox: A Python Toolkit for Generalized Black-box Optimization},
  journal = {Journal of Machine Learning Research},
  year    = {2024},
  volume  = {25},
  number  = {120},
  pages   = {1--11},
  url     = {http://jmlr.org/papers/v25/23-0537.html}
}

License

The entire codebase is released under the MIT license.