kjappelbaum / pyepal

Multiobjective active learning with tunable accuracy/efficiency tradeoff and clear stopping criterion.
Apache License 2.0

Implement neural tangent kernel #112

Closed kjappelbaum closed 4 years ago

kjappelbaum commented 4 years ago

Feature description

Add a pyepal class that supports neural tangent kernels.

Implementation idea

Use the neural_tangents package and take as input a user-defined kernel such as

```python
from neural_tangents import stax

# Three-layer fully connected network with Erf nonlinearities
init_fn, apply_fn, kernel_fn = stax.serial(
    stax.Dense(2048, W_std=1.5, b_std=0.05),
    stax.Erf(),
    stax.Dense(2048, W_std=1.5, b_std=0.05),
    stax.Erf(),
    stax.Dense(1, W_std=1.5, b_std=0.05),
)
```

Then the `_predict` function would just call

```python
from neural_tangents import predict

y_mean, y_var = predict.gp_inference(
    kernel_fn, x_train, y_train, x_test, "ntk", diag_reg=1e-4, compute_cov=True
)
```

We will need to inspect the network to make sure that the last layer has the correct number of outputs.
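In neural_tangents' stax, `init_fn(rng, input_shape)` returns `(output_shape, params)`, so the output width can be checked right after initialization. A minimal, library-free sketch of such a check (the helper name `validate_output_shape` is hypothetical, not part of pyepal):

```python
def validate_output_shape(output_shape, n_objectives):
    """Raise if the network's last dimension does not match the number of objectives.

    `output_shape` is the tuple returned by a stax-style `init_fn`,
    e.g. (-1, 1) for a single-output network.
    """
    out_dim = output_shape[-1]
    if out_dim != n_objectives:
        raise ValueError(
            f"Network has {out_dim} outputs but {n_objectives} objectives were requested."
        )


# Example: a single-output network is valid for one objective
validate_output_shape((-1, 1), 1)
```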

It is still unclear to me how to handle hyperparameter optimization; I guess we can skip it here.

As with LightGBM, neural_tangents would be an optional dependency.
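A common pattern for optional dependencies (a sketch, not the actual pyepal code; the helper `require_neural_tangents` is hypothetical) is a guarded import with a clear error at use time:

```python
try:
    import neural_tangents  # noqa: F401

    HAS_NEURAL_TANGENTS = True
except ImportError:
    HAS_NEURAL_TANGENTS = False


def require_neural_tangents():
    """Raise a helpful error if the optional dependency is missing."""
    if not HAS_NEURAL_TANGENTS:
        raise ImportError(
            "This feature requires the optional dependency neural_tangents. "
            "Install it with `pip install neural-tangents`."
        )
```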

issue-label-bot[bot] commented 4 years ago

Issue-Label Bot is automatically applying the label feature_request to this issue, with a confidence of 0.99. Please mark this comment with :thumbsup: or :thumbsdown: to give our bot feedback!


kjappelbaum commented 4 years ago

I'd probably give it a shot tonight/tomorrow. Any additional thoughts I should consider, @byooooo?

kjappelbaum commented 4 years ago

Basically, we have two use cases.

kjappelbaum commented 4 years ago

Looking into it a bit, the multihead case currently seems tricky, so we will also accept a list of models.
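One way the list-of-models fallback could look (a sketch only; `predict_fns` holding one single-output predictor per objective is an assumption, not the final API): loop over the per-objective predictors and stack their means and variances column-wise.

```python
def predict_multiobjective(predict_fns, x_test):
    """Predict each objective with its own single-output model.

    `predict_fns` is a list of callables, each returning (mean, variance)
    as sequences with one entry per test point.  Returns (means, variances),
    each a list of per-point tuples across objectives.
    """
    per_model = [fn(x_test) for fn in predict_fns]  # [(mean, var), ...]
    means = list(zip(*(mu for mu, _ in per_model)))       # (n_points, n_objectives)
    variances = list(zip(*(var for _, var in per_model)))
    return means, variances


# Toy usage with two dummy "models"
f0 = lambda x: ([0.0] * len(x), [1.0] * len(x))
f1 = lambda x: ([1.0] * len(x), [2.0] * len(x))
means, variances = predict_multiobjective([f0, f1], [10, 20])
# means == [(0.0, 1.0), (0.0, 1.0)]
```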