RobvanGastel / meta-fsl-nas

Repository for the "Regularized Meta-Learning for Neural Architecture Search" paper

HELLO! #1

Closed: Sameul-kaji closed this issue 2 years ago

Sameul-kaji commented 2 years ago

I am very glad to meet you. Recently, I also read the MetaNAS paper, and I want to improve it. It is very interesting to find someone who has similar ideas to mine. I was wondering why your code runs slower and uses more GPU memory than the original code. From my understanding, the searchcnn model should be the same in stage 1.

RobvanGastel commented 2 years ago

Hi, thank you for your interest in my code. There is currently a lot of unfinished work in my code base: in the main branch I'm experimenting with RL-based architecture search, which is still exploratory. In the task ablation branch I attempted more adjustments to the DARTS task-learner.

Sameul-kaji commented 2 years ago

Your idea is great! My work also focuses on attempting more adjustments to the task-optimizer and perhaps experimenting with different neural architecture search methods. Unfortunately, I don't know much about reinforcement learning. I also see in the paper that the authors try to modify the meta-learner using MAML++; maybe that is another way. Thank you very much for communicating with me; your code inspired me a lot.