STOR-i / GaussianProcesses.jl

A Julia package for Gaussian Processes
https://stor-i.github.io/GaussianProcesses.jl/latest/

Update to 0.7 and 1.0 #87

Closed devmotion closed 6 years ago

devmotion commented 6 years ago

I fixed all deprecation warnings showing up on Julia 0.7 and updated the package to Julia 0.7 and 1.0 (however, the benchmark scripts and README are not updated yet). Since some of the math formulas in the docstrings caused issues on the newer Julia versions, I reformatted most of them according to the official documentation.
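For reference, the officially documented way to embed math in docstrings is LaTeX between double backticks, with backslashes escaped inside an ordinary triple-quoted string. A minimal sketch (the function sqexp here is invented for illustration, not part of the package):

```julia
# Hypothetical example (sqexp is not in GaussianProcesses.jl): inline math
# in a docstring goes between double backticks, and LaTeX backslashes must
# be escaped inside a regular (non-raw) triple-quoted string.
"""
    sqexp(r, ℓ)

Return the squared-exponential term ``\\exp(-r^2 / (2\\ell^2))`` for
distance `r` and length scale `ℓ`.
"""
sqexp(r, ℓ) = exp(-r^2 / (2ℓ^2))
```

Alternatively, prefixing the docstring with `raw` avoids the escaping at the cost of disabling interpolation.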

I checked that tests pass on Julia 0.7 and Julia 1.0 locally with the latest Optim.jl master (as soon as https://github.com/JuliaLang/METADATA.jl/pull/16707 is merged, the CI tests should succeed as well). Since the tests had to be updated for 0.7 and 1.0 anyway, I cleaned them up and reorganized them into different @testsets. Moreover, I added CI tests on Windows.
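Grouping tests into nested @testsets like this makes failures get reported per group rather than as one flat list. A toy sketch of the pattern (the set names are invented and do not reflect the package's actual test layout):

```julia
using Test

# Toy illustration of organizing tests into nested @testsets; the group
# names here are invented, not the package's actual test files.
@testset "GaussianProcesses" begin
    @testset "kernels" begin
        @test exp(0.0) == 1.0
    end
    @testset "utilities" begin
        @test isapprox(2π, 6.2832; atol = 1e-4)
    end
end
```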

I hope my changes help in the transition of this package to the newest Julia version.

devmotion commented 6 years ago

Following the release of Optim, tests succeed on Travis and AppVeyor.

fairbrot commented 6 years ago

Wow, thanks for putting the effort into doing this! We'll need some time to review this properly, but from a first glance the changes look good. What do you think @chris-nemeth and @maximerischard?

devmotion commented 6 years ago

I noticed that I had incorrectly assumed that VecF64 and MatF64 are aliases for Vector{Float64} and Matrix{Float64}, so I'll revert some changes where I introduced fields with these abstract types. By the way, in general one should avoid fields with abstract types, so I'll try to fix fields with other abstract types as well. It's just not clear to me what the best way to deal with priors is. Speaking of parameters, a separate type for parameters might be nice, since one could e.g. add a field fixed::Bool that specifies whether a parameter is fixed and get rid of FixedKern, which it seems contains a type instability in free.
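The "avoid abstract field types" advice comes from Julia's performance guidelines: with an abstract-typed field the compiler cannot specialize code that reads it. A hypothetical sketch (these types are invented, not the package's):

```julia
# Hypothetical illustration (not the package's actual types) of why
# abstract-typed fields hurt performance. `Loose` declares its field as
# the abstract type `Real`, so any function reading it needs dynamic
# dispatch; `Tight` is parametric, so each instantiation (e.g.
# `Tight{Float64}`) stores a concrete, fully specialized field.
struct Loose
    x::Real            # abstract field type: boxed, not specialized
end

struct Tight{T<:Real}
    x::T               # concrete per instantiation
end

double(a) = 2 * a.x

# Both return the same value, but `double(Tight(1.5))` compiles to
# specialized machine code while `double(Loose(1.5))` cannot.
```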

chris-nemeth commented 6 years ago

@devmotion thanks for making these changes, it's much appreciated. You're right about setting up parameter types, and this is something we are planning to do. We also felt that this would be the best way of handling the priors issue. We should set up a new issue for parameter types, as I think this is now at the top of our todo list.

devmotion commented 6 years ago

Unfortunately, I had no time to work on the PR in the last few days (but I still think the VecF64 and MatF64 issue should be quite easy to fix).

However, I'm not sure anymore whether a separate parameter type is actually a good idea. Means, kernels, and likelihoods with standard floating-point parameters (or, maybe even better for autodifferentiation, parameters of a general real type), analogously to e.g. distributions in Distributions.jl, seem a lot more flexible:

- No user is forced into specific parameter types and limited by their implementation; anyone can freely create new mean functions, kernel functions, or likelihood functions with different parameter values.
- It would also allow getting rid of the params! implementations, which currently perform both parameter transformations and updates. Restrictions of parameter values to specific intervals, such as the positive real line, could be handled in inner constructors alone (which would also allow computing transformed values that simplify later computations; again, everything similar to the implementation of Distributions.jl).
- For parameter estimation and MCMC, unconstrained parameter values could be mapped to the desired intervals with the help of https://github.com/tpapp/ContinuousTransformations.jl.
- Moreover, a modular approach to parameter estimation similar to https://github.com/JuliaDiffEq/DiffEqBayes.jl could be helpful for integrating different inference algorithms, might allow simpler handling of priors only at inference time, and could let users specify how their GP is updated with new parameters (in addition to a default implementation, similar to what is done in https://github.com/JuliaDiffEq/DiffEqParamEstim.jl). This would allow all kinds of inference schemes and probably remove the need for special handling of fixed parameters in the package itself.

But surely, even if some of these random ideas might be useful, they should not be included in this PR :smile:
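As a hedged sketch of this Distributions.jl-style design (SqExpKernel, its fields, and kernel_cov are all invented here, not the package's actual API): constraints live once in the inner constructor instead of in a params!-style mutator, and the parameter type stays generic so dual numbers work for autodifferentiation.

```julia
# Hypothetical sketch only: a kernel with generic real parameters whose
# positivity constraints are enforced in the inner constructor, in the
# style of Distributions.jl. None of these names exist in the package.
struct SqExpKernel{T<:Real}
    ℓ::T       # length scale, must be positive
    σ²::T      # signal variance, must be positive
    function SqExpKernel{T}(ℓ::T, σ²::T) where {T<:Real}
        ℓ > 0 || throw(ArgumentError("length scale ℓ must be positive"))
        σ² > 0 || throw(ArgumentError("variance σ² must be positive"))
        new{T}(ℓ, σ²)
    end
end
SqExpKernel(ℓ::T, σ²::T) where {T<:Real} = SqExpKernel{T}(ℓ, σ²)

# Covariance between two scalar inputs; generic in T, so it works with
# Float64 as well as dual numbers for autodifferentiation.
kernel_cov(k::SqExpKernel, x::Real, y::Real) =
    k.σ² * exp(-abs2(x - y) / (2 * k.ℓ^2))
```

"Updating" parameters then simply means constructing a new immutable kernel, so invalid states are unrepresentable by construction.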

chris-nemeth commented 6 years ago

Thanks for making these changes. Before we accept this PR, we just wanted to check whether you were planning to add anything else?

Also, thanks for the suggestions regarding the further development of the package. These are interesting ideas and hopefully something we could discuss further.

devmotion commented 6 years ago

> Before we accept this PR, we just wanted to check whether you were planning to add anything else?

No, I think it's better to open new PRs for additional changes that are not directly related to the upgrade to Julia 0.7 and 1.0. I just polished the README and updated one benchmark script; two other scripts still have to be updated, and I guess the tutorials should be updated to the newest Julia versions as well, but in my opinion this can be done in a separate PR. So if you're satisfied with the current version of this PR, then these were the last changes :smiley:

The configuration files and badges for the CI services (Travis, AppVeyor, Codecov, and Coveralls) are all included in this PR, but since some (or even all?) of these services haven't been used so far, someone has to enable them for this repository (I'm not allowed to do that, hence so far I have enabled them only for my fork and checked that the tests are successful, see https://ci.appveyor.com/project/devmotion/gaussianprocesses-jl and https://travis-ci.org/devmotion/GaussianProcesses.jl). Moreover, I'm pretty sure that the status badge for AppVeyor has to be updated with the correct security token (see https://www.appveyor.com/docs/status-badges/) once AppVeyor is activated; all other badges should work since they use predictable URLs without any special tokens.

fairbrot commented 6 years ago

Thanks again for this. We've noticed that another branch, faster_sum, which @maximerischard was working on a few months ago, was not merged into master. This branch contains potentially important efficiency improvements to the package, but there are big conflicts with this pull request which could take a while to resolve. In the interest of making the package available for Julia v0.7/1.0 quickly, we are accepting this pull request and will study and integrate the changes on the faster_sum branch in the coming weeks.

devmotion commented 6 years ago

Ah yeah, I didn't pay attention to all those different branches, sorry :grin:

maximerischard commented 5 years ago

Hi all,

First of all, thank you so much for this pull request @devmotion; it's really great to be compatible with Julia v0.7 already. Sorry, I was traveling these last few weeks and not replying to this thread. I've just spent a fun few hours merging the updates with the faster_sum branch. My benchmarks show some slight speed-ups (~5%) in computing the marginal likelihood and its derivative, which is really great.