Closed: murphyk closed this issue 2 years ago.
Working on this, thanks!
Hello, I'd also like to work on this issue if possible. One question about the task: does it cover the entire finite_ntk repo? That is, should we replicate all of the experiments, baselines, and plots in the repo in JAX, or only a subset (e.g. just the plots in the notebooks folder and the implementation of linearized DNNs as GPs in the xfer/finite_ntk/finite_ntk folder)?
Actually, I discovered that the JAX neural-tangents library already has a linearize method, so the goal would just be to replicate one of the transfer learning experiments.
(Replying by email to Ting Chen's comment of Apr 8, 2022, quoted above.)
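For context on the linearize method mentioned above: the core of what neural-tangents' `linearize` computes is a first-order Taylor expansion of the network output in its parameters around an initial parameter set. A minimal sketch of that idea in plain JAX (the toy model, `f`, and helper names below are illustrative, not taken from the finite_ntk repo):

```python
import jax
import jax.numpy as jnp

# Hypothetical tiny model used only for illustration: one dense layer + tanh.
def f(params, x):
    w, b = params
    return jnp.tanh(x @ w + b)

def linearize(f, params0):
    """First-order Taylor expansion of f in its parameters around params0.

    This is the idea behind neural_tangents.linearize:
    f_lin(params, x) = f(params0, x) + J(params0, x) @ (params - params0),
    computed here with a jacobian-vector product instead of a full Jacobian.
    """
    def f_lin(params, x):
        dparams = jax.tree_util.tree_map(lambda p, p0: p - p0, params, params0)
        y0, jvp_out = jax.jvp(lambda p: f(p, x), (params0,), (dparams,))
        return y0 + jvp_out
    return f_lin

key = jax.random.PRNGKey(0)
w = jax.random.normal(key, (3, 2))
b = jnp.zeros(2)
params0 = (w, b)
x = jnp.ones((4, 3))

f_lin = linearize(f, params0)
# At the expansion point the linearized model matches the original exactly.
assert jnp.allclose(f_lin(params0, x), f(params0, x))
```

In a transfer-learning setting, `params0` would be the pretrained weights, and only the linearized model would be fit to the downstream task.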
I am closing this since it is too low priority.
Translate https://github.com/amzn/xfer/tree/master/finite_ntk to JAX. Use https://github.com/google/neural-tangents for the NTK kernel, and https://github.com/dfm/tinygp for the GP code.
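As a starting point for the translation, here is a hedged sketch of the empirical (finite-width) NTK in plain JAX. In the actual port one would use `neural_tangents.empirical_ntk_fn` and wrap the resulting kernel for tinygp; the toy model and function names below are assumptions for illustration, not the repo's API:

```python
import jax
import jax.numpy as jnp

# Illustrative scalar-output toy model (not from the finite_ntk repo).
def f(params, x):
    w1, w2 = params
    return jnp.tanh(x @ w1) @ w2  # shape (n,)

def empirical_ntk(f, params, x1, x2):
    """Empirical NTK: K[i, j] = <df(params, x1[i])/dparams, df(params, x2[j])/dparams>.

    Computed naively via full parameter Jacobians; neural-tangents provides
    much more efficient implementations of the same quantity.
    """
    def jac_flat(x):
        jac = jax.jacobian(lambda p: f(p, x))(params)
        n = x.shape[0]
        leaves = jax.tree_util.tree_leaves(jac)
        return jnp.concatenate([j.reshape(n, -1) for j in leaves], axis=1)
    j1, j2 = jac_flat(x1), jac_flat(x2)
    return j1 @ j2.T

key = jax.random.PRNGKey(0)
k1, k2, k3 = jax.random.split(key, 3)
params = (jax.random.normal(k1, (3, 5)), jax.random.normal(k2, (5,)))
x = jax.random.normal(k3, (4, 3))

K = empirical_ntk(f, params, x, x)
# The NTK Gram matrix is symmetric.
assert jnp.allclose(K, K.T, atol=1e-5)
```

This Gram matrix is what would be plugged into the GP regression code (tinygp) to reproduce the linearized-DNN-as-GP experiments.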