SciML / Surrogates.jl

Surrogate modeling and optimization for scientific machine learning (SciML)
https://docs.sciml.ai/Surrogates/stable/

Gradient-enhanced surrogates #156

Open ChrisRackauckas opened 4 years ago

ChrisRackauckas commented 4 years ago
ludoro commented 4 years ago

Nice, thanks. These could be a good assignment for the MLH students.

vikram-s-narayan commented 2 years ago

I would like to work on Gradient Enhanced Kriging.

vikram-s-narayan commented 2 years ago

@ChrisRackauckas and @ranjanan - I've coded up a very rough version of GEKPLS as a bunch of functions here.

All of my code is a translation of the SMT Python code. The code is not fully tested, and there are still some kinks and bugs that I'm working out, but I thought I'd share this early and get feedback :)

In the example that I have provided in the gist, the underlying function simply returns x1^2 + x2^2 + x3^2 (given an array with components x1, x2, and x3).
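For concreteness, here is a hypothetical sketch of that kind of test function (the gist itself is not reproduced here). Since gradient-enhanced surrogates also consume derivative observations, the sketch pairs the sphere function with its analytic gradient, 2x:

```python
import numpy as np

# Hypothetical smoke-test function for a gradient-enhanced surrogate:
# f(x) = x1^2 + x2^2 + x3^2, with analytic gradient grad f(x) = 2x.
def sphere(x):
    x = np.asarray(x, dtype=float)
    return float(np.sum(x**2))

def sphere_grad(x):
    x = np.asarray(x, dtype=float)
    return 2.0 * x
```

A function with a cheap exact gradient like this makes it easy to check that feeding correct derivative data actually improves the surrogate's fit.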

If this looks okay, I can begin to add it as a surrogate and begin refining and optimizing the code.

Apart from the above RFC, I also have a question:

I'm using ScikitLearn's PLS (`@sk_import cross_decomposition: PLSRegression`). This will require us to add the following line to the main Surrogates.jl file: `__precompile__(false)`

This is needed because of this issue in ScikitLearn.jl

I hope this is okay?

ChrisRackauckas commented 2 years ago

It's a start. We won't want the final version to use ScikitLearn as that would cause some packaging issues (PyCall is hard to build into sysimages for example). But to get a working version, that's a good way to start, then add some tests, and replace pieces one-by-one.

vikram-s-narayan commented 2 years ago

OK. I'll start cleaning this up and search for an alternative for ScikitLearn PLS. Thanks!

vikram-s-narayan commented 2 years ago

@ChrisRackauckas - I've created a draft pull request for GEKPLS with some basic tests added in. This still uses the ScikitLearn PLS regressor, which I plan to replace. As for a Julia PLS regressor, there is one called PartialLeastSquaresRegressor.jl, but it has a few issues (e.g., it does not have an attribute called `x_rotations`, which is what we use from ScikitLearn's PLS).

Hence, I'm now planning to write our own PLS function based on the ScikitLearn PLS, taking only the parts that are needed for GEKPLS. Is this approach of writing our own PLS function okay?
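To illustrate the scope of "only the parts needed for GEKPLS", here is a hypothetical minimal NIPALS-style PLS1 sketch (not the Surrogates.jl implementation) that computes just the rotations matrix R = W (PᵀW)⁻¹, i.e. the quantity exposed as `x_rotations_` by scikit-learn:

```python
import numpy as np

def pls1_rotations(X, y, n_components):
    """Minimal NIPALS PLS1 keeping only what GEKPLS needs: x_rotations.

    Hypothetical sketch. Returns R = W @ inv(P.T @ W), which maps the
    centered inputs onto the latent PLS components.
    """
    X = np.asarray(X, dtype=float).copy()
    y = np.asarray(y, dtype=float).copy()
    # Center, as scikit-learn's PLSRegression does by default.
    X -= X.mean(axis=0)
    y -= y.mean()
    n_features = X.shape[1]
    W = np.zeros((n_features, n_components))  # weight vectors
    P = np.zeros((n_features, n_components))  # X loadings
    for h in range(n_components):
        w = X.T @ y
        w /= np.linalg.norm(w)         # unit weight vector
        t = X @ w                      # scores
        tt = t @ t
        p = (X.T @ t) / tt             # loading for this component
        c = (y @ t) / tt               # regression coefficient on scores
        X -= np.outer(t, p)            # deflate X
        y -= c * t                     # deflate y
        W[:, h] = w
        P[:, h] = p
    return W @ np.linalg.inv(P.T @ W)
```

Something of this size would be straightforward to port to Julia, removing the PyCall/`__precompile__(false)` dependency entirely.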

ChrisRackauckas commented 2 years ago

That sounds great.