google-research / l2p

Learning to Prompt (L2P) for Continual Learning @ CVPR22 and DualPrompt: Complementary Prompting for Rehearsal-free Continual Learning @ ECCV22
https://arxiv.org/pdf/2112.08654.pdf
Apache License 2.0

Regarding transferring previous learned prompt params to the new prompt #36

Open prachigarg23 opened 1 year ago

prachigarg23 commented 1 year ago

Hi @KingSpencer @zizhaozhang, I have a question about how the prompt pool is initialized at each new task, inside train_continual.py > train_and_evaluate_per_task(). Why do you transfer the previously learned prompt parameters and keys to the new positions in the prompt pool and prompt key (lines 550-584)?
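For reference, this is roughly the kind of per-task initialization I mean; a minimal NumPy sketch, assuming a pool partitioned into contiguous per-task slots (the function name, `prompts_per_task`, and the slot layout are my own hypothetical choices, not the actual code at lines 550-584):

```python
import numpy as np

def init_prompts_for_new_task(prev_pool, prev_keys, prompts_per_task, task_id):
    """Hypothetical per-task initialization: copy the slots learned on the
    previous task into the slots reserved for the new task, so training
    starts from the previous task's solution rather than from random values.

    prev_pool: (pool_size, prompt_len, d) prompts after the previous task.
    prev_keys: (pool_size, d) matching prompt keys.
    """
    pool, keys = prev_pool.copy(), prev_keys.copy()
    prev_slice = slice((task_id - 1) * prompts_per_task, task_id * prompts_per_task)
    new_slice = slice(task_id * prompts_per_task, (task_id + 1) * prompts_per_task)
    # "Shift" the previous task's learned parameters into the new task's slots.
    pool[new_slice] = prev_pool[prev_slice]
    keys[new_slice] = prev_keys[prev_slice]
    return pool, keys
```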

Based on my understanding of the paper, the prompt pool and prompt keys are shared across tasks, and the method learns to select the relevant prompts via query-function/key matching. So if a subsequent incremental task initializes its prompts from the previous step, why are the prompts being shifted? Are new prompts being added at every step, or am I missing something?
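The selection mechanism I'm referring to can be sketched as follows (a simplified NumPy version of the paper's top-k key matching; the function name and shapes are illustrative, not the repo's actual API):

```python
import numpy as np

def select_prompts(query, prompt_keys, prompt_pool, top_k=5):
    """Select the top-k prompts whose keys best match the query feature.

    query: (d,) feature from a frozen query function (e.g. the class-token
    embedding of the pretrained backbone).
    prompt_keys: (pool_size, d) learnable keys shared across tasks.
    prompt_pool: (pool_size, prompt_len, d) learnable prompts.
    """
    # Cosine similarity between the query and every prompt key.
    q = query / np.linalg.norm(query)
    k = prompt_keys / np.linalg.norm(prompt_keys, axis=1, keepdims=True)
    sim = k @ q
    # Indices of the top_k best-matching keys (argsort is ascending).
    idx = np.sort(np.argsort(sim)[-top_k:])
    return idx, prompt_pool[idx]

rng = np.random.default_rng(0)
d, pool_size, prompt_len = 8, 10, 4
keys = rng.normal(size=(pool_size, d))
pool = rng.normal(size=(pool_size, prompt_len, d))
query = rng.normal(size=(d,))
idx, selected = select_prompts(query, keys, pool, top_k=3)
print(selected.shape)  # (3, 4, 8)
```

Since this pool is shared and selection is query-driven, I don't see where a per-task transfer of parameters fits in, hence the question above.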

Any help will be appreciated.

kimsekeun commented 11 months ago

Did you find the reason?