resibots / limbo

A lightweight framework for Gaussian processes and Bayesian optimization of black-box functions (C++11)
http://www.resibots.eu/limbo

review of guides/bo.rst #130

Closed dogoepp closed 8 years ago

dogoepp commented 8 years ago

This is for easier discussion of pull request #115.

guides/bo.rst

around line 4

References to bibliographical elements seem to be buggy. The links for [1][5] point to the limbo_concepts.rst page, although the bibliography at the bottom of the bo.rst page correctly points to the top of the page. This might be linked to the warnings shown during the build process.

See 3578c27 for a proposed solution. This proposal requires that a prefix be added when doing a citation. For instance,

:cite:`brochu2010tutorial,Mockus2013`

becomes

:cite:`a-brochu2010tutorial,a-Mockus2013`

The prefix b- would be used in limbo_concepts.rst.
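For illustration, assuming the docs build with sphinxcontrib-bibtex (which provides the :cite: role), the prefixed setup could look roughly like the sketch below; the .bib file name is only a placeholder and the exact options should be checked against the actual configuration.

```rst
.. In bo.rst, every citation carries the ``a-`` prefix, e.g.:

:cite:`a-brochu2010tutorial,a-Mockus2013`

.. and the local bibliography at the bottom of the page declares the same prefix
.. (``refs.bib`` is a placeholder name):

.. bibliography:: refs.bib
   :keyprefix: a-
   :labelprefix: a-

.. limbo_concepts.rst would do the same with the ``b-`` prefix.
```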

Does the fix suit you?

around line 31

Why is there no reference to $\chi_{0:t}$ in $P(f(\mathbf{x})|\mathbf{P}_{1:t},\mathbf{x}) = \mathcal{N}(\mu_{t}(\mathbf{x}), \sigma_{t}^2(\mathbf{x}))$?
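For comparison, conditioning explicitly on the sample locations as well as the observed values would read something like (notation guessed from the surrounding text):

$$P(f(\mathbf{x}) \,|\, \chi_{0:t}, \mathbf{P}_{1:t}, \mathbf{x}) = \mathcal{N}(\mu_{t}(\mathbf{x}), \sigma_{t}^2(\mathbf{x}))$$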

around line 45

Our implementation of Bayesian optimization uses this Gaussian process model to search for the maximum of the objective function :math:`f(\mathbf{x})`, :math:`f(\mathbf{x})` being unknown.

Proposed alternatives:

"... of the unknown objective function :math:`f(\mathbf{x})`."
"... of the objective function :math:`f(\mathbf{x})`, which is unknown."

around line 45

Here we use the "upper confidence bound" acquisition function

Is it part of the normal (and documented) use case to define another acquisition function? If so, "Here" might be misleading.
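For context, the UCB rule referred to here scores a candidate by its posterior mean plus a scaled posterior standard deviation; the exploration weight is written $\alpha$ below, the docs may use another symbol:

$$\mathrm{UCB}(\mathbf{x}) = \mu_{t}(\mathbf{x}) + \alpha \, \sigma_{t}(\mathbf{x})$$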

around line 45

The model is then progressively refined after each observation.

Could we remove the "then"?

around line 58

What does the sentence below mean? I don't see how it relates to the context above it.

We use a log because it makes the optimization simpler and does not change the result.
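If it refers to the likelihood of the hyper-parameters, the point is presumably the usual one: the logarithm is strictly increasing, so it does not move the maximizer, and it turns a product over observations into a sum that is numerically easier to optimize:

$$\arg\max_{\theta} \, p(\mathbf{P}_{1:t} \,|\, \theta) = \arg\max_{\theta} \, \log p(\mathbf{P}_{1:t} \,|\, \theta)$$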

around line 74

What about the

.. todo:: list the optimization algorithms

Sections on hyper-parameters and kernel function

The two above-mentioned sections do not fit together very well. The first one, on hyper-parameters, talks about the kernel function, which is only properly explained later. It also shows computations with the squared exponential covariance function, whereas the section on the kernel function uses the Matérn kernel.
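For reference, the two kernels in question, in their generic textbook forms with $r = \lVert \mathbf{x} - \mathbf{x}' \rVert$, signal variance $\sigma^2$ and length scale $l$ (the exact parametrization and the Matérn smoothness used in the docs may differ; $\nu = 5/2$ is shown here):

$$k_{\mathrm{SE}}(\mathbf{x}, \mathbf{x}') = \sigma^2 \exp\!\left(-\frac{r^2}{2 l^2}\right)$$

$$k_{\nu=5/2}(\mathbf{x}, \mathbf{x}') = \sigma^2 \left(1 + \frac{\sqrt{5}\,r}{l} + \frac{5 r^2}{3 l^2}\right) \exp\!\left(-\frac{\sqrt{5}\,r}{l}\right)$$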

costashatz commented 8 years ago

Does the fix suit you?

I think it is good enough.. Just document it in gitlab.

around line 31

Why is there no reference to $\chi_{0:t}$ in

$P(f(\mathbf{x})|\mathbf{P}_{1:t},\mathbf{x}) = \mathcal{N}(\mu_{t}(\mathbf{x}), \sigma_{t}^2(\mathbf{x}))$?

around line 45

Our implementation of Bayesian optimization uses this Gaussian process model to search for the maximum of the objective function :math:`f(\mathbf{x})`, :math:`f(\mathbf{x})` being unknown.

Proposed alternatives:

"... of the unknown objective function :math:f(\mathbf{x})." "... of the objective function :math:f(\mathbf{x}), which is unknown." around line 45

Here we use the "upper confidence bound" acquisition function

Is it part of the normal (and documented) use case to define another acquisition function? If so, "Here" might be misleading.

around line 45

The model is then progressively refined after each observation.

Could we remove the "then"?

around line 58

What does the sentence below mean? I don't see how it relates to the context above it.

We use a log because it makes the optimization simpler and does not change the result.

We already discussed these. I am fixing them..

around line 74

What about the

.. todo:: list the optimization algorithms

Sections on hyper-parameters and kernel function

The two above-mentioned sections do not fit together very well. The first one, on hyper-parameters, talks about the kernel function, which is only properly explained later. It also shows computations with the squared exponential covariance function, whereas the section on the kernel function uses the Matérn kernel.

@jbmouret have a look at them..

dogoepp commented 8 years ago

I think it is good enough.. Just document it in gitlab.

Where do you mean?

costashatz commented 8 years ago

Where do you mean?

Here I guess? Or somewhere with guidelines on how to put references in our docs..

costashatz commented 8 years ago

I made the changes in b0e9883. Waiting for @jbmouret to merge..

dogoepp commented 8 years ago

There is no technical page on writing in Sphinx. I could fit it either in the install process or the update process. Creating a whole new page for a single item seems like a lot.

costashatz commented 8 years ago

There is no technical page on writing in Sphinx. I could fit it either in the install process or the update process. Creating a whole new page for a single item seems like a lot.

Update process seems like a better choice to me..

dogoepp commented 8 years ago

Will do

dogoepp commented 8 years ago

fixed