leopard-ai / betty

Betty: an automatic differentiation library for generalized meta-learning and multilevel optimization
https://leopard-ai.github.io/betty/
Apache License 2.0

[Question] Code for the continued pretraining experiment #23

Closed kykim0 closed 1 week ago

kykim0 commented 1 week ago

Hello!

I was hoping to reproduce the continued pretraining experiment in the SAMA paper and was curious where I could find the code. The learning_by_ignoring example looks quite relevant, but the datasets used there seem different from the ones in the experiment. Could you point me to the experiment code?

Thanks!

sangkeun00 commented 1 week ago

Hello,

Thanks for your interest in our work. I have one clarification question:

learning_by_ignoring is actually tri-level optimization (not bi-level optimization), and it's the example used in our earlier paper on Betty itself, not in the SAMA paper.
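
For readers landing here, a minimal sketch of what a tri-level hierarchy looks like in Betty, following the two-level pattern in the project README: each level is an `ImplicitProblem`, and the hierarchy is declared through `u2l`/`l2u` dependency dicts passed to `Engine`. The `Pretrain`/`Finetune`/`Reweight` names, the toy networks and data, and the particular coupling between levels are illustrative assumptions, not the actual learning_by_ignoring code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset

from betty.configs import Config, EngineConfig
from betty.engine import Engine
from betty.problems import ImplicitProblem


def toy_loader(n=64, d=8, classes=2):
    """Random stand-in data; a real run would use the actual datasets."""
    x, y = torch.randn(n, d), torch.randint(0, classes, (n,))
    return DataLoader(TensorDataset(x, y), batch_size=16, shuffle=True)


class Pretrain(ImplicitProblem):
    # Lowest level: trains on source data reweighted by the top level.
    def training_step(self, batch):
        x, y = batch
        w = torch.sigmoid(self.reweight(x)).squeeze(-1)  # per-example weights
        losses = F.cross_entropy(self.module(x), y, reduction="none")
        return (w * losses).mean()


class Finetune(ImplicitProblem):
    # Middle level: adapts a head on top of the lowest level's output
    # (a simple coupling chosen for illustration only).
    def training_step(self, batch):
        x, y = batch
        return F.cross_entropy(self.module(self.pretrain(x)), y)


class Reweight(ImplicitProblem):
    # Highest level: learns the data weights against a held-out loss.
    def training_step(self, batch):
        x, y = batch
        return F.cross_entropy(self.finetune(self.pretrain(x)), y)


pre_net, fin_net, rw_net = nn.Linear(8, 2), nn.Linear(2, 2), nn.Linear(8, 1)

pretrain = Pretrain(
    name="pretrain", module=pre_net,
    optimizer=torch.optim.SGD(pre_net.parameters(), lr=0.1),
    train_data_loader=toy_loader(), config=Config(),
)
finetune = Finetune(
    name="finetune", module=fin_net,
    optimizer=torch.optim.SGD(fin_net.parameters(), lr=0.1),
    train_data_loader=toy_loader(), config=Config(),
)
reweight = Reweight(
    name="reweight", module=rw_net,
    optimizer=torch.optim.Adam(rw_net.parameters(), lr=1e-3),
    train_data_loader=toy_loader(), config=Config(),
)

# u2l: upper-level parameters appearing in a lower-level loss;
# l2u: lower-level solutions feeding an upper-level loss.
dependencies = {
    "u2l": {reweight: [pretrain]},
    "l2u": {pretrain: [finetune, reweight], finetune: [reweight]},
}

engine = Engine(
    config=EngineConfig(train_iters=300),
    problems=[pretrain, finetune, reweight],
    dependencies=dependencies,
)
engine.run()
```

The key point is that adding a third level only means defining one more `ImplicitProblem` and extending the dependency dicts; the engine handles the nested differentiation.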

If you are interested in the second experiment of our SAMA paper, you can find the (old) code in the supplemental material for our NeurIPS submission.

If you are interested in something else, please let me know.

Best, Sang

kykim0 commented 1 week ago

Thank you for your quick response and the code pointer!