brjathu / iTAML

Official implementation of "iTAML: An Incremental Task-Agnostic Meta-learning Approach" (CVPR 2020).

A few questions about the implementation #1

Closed Ereebay closed 4 years ago

Ereebay commented 4 years ago

Hello Jathushan Rajasegaran, thanks for your nice work. I have some questions about the implementation of the model in the meta-train and meta-test parts.

In the paper, the model consists of a meta-training process and an inference process, where the inference process consists of task prediction and class prediction. In the code, does the meta_test() function correspond to the inference process?

And the function meta_test() consists of three parts: the meta training, the meta test with task knowledge, and the meta test without task knowledge.

I guess the meta training in meta_test() is the adaptation step mentioned in the inference process, and the meta test without task knowledge is the task prediction. But why does the task prediction happen after the adaptation in the code? It seems that you directly use the task_id to do the adaptation and class prediction.

brjathu commented 4 years ago

Hi @Ereebay, thanks for your interest in our work. meta_test() is the adaptation process in the paper.

"I guess the meta training in the meta_test() is the adaptation process mentioned in the inference process. And the meta test without task knowledge is the task prediction." this is correct.

"But why does the task prediction happens after the adaptation in the code? It seems that you directly use the task_id to do the adaptation and class prediction." - On implementation wise, I do adaptation for all the tasks and store them, and the actual testing happens in the plot_cifar.ipynb file. Mainly because of computational efficiency so that I don't have to use GPUs and gradient updates at test time.

Hope this answers your questions. Feel free to ask if you have any more.

Ereebay commented 4 years ago

Hi @brjathu, thanks for your reply. I have one more question, about the parameters.

The model parameters consist of the feature parameters (θ) and the classifier parameters (φ). To keep the parameters of multiple classifiers, the code can copy the base model's parameters. But the feature parameters should be updated across all the tasks, and in the code they seem to be reset to the base as well:

            # loop over every task seen so far (sessions 0 .. sess)
            for task_idx in range(1+self.args.sess):
                # indices of samples whose labels fall in this task's class range
                idx = np.where((np_targets >= task_idx*self.args.class_per_task) & (np_targets < (task_idx+1)*self.args.class_per_task))[0]
                ai = self.args.class_per_task*task_idx       # first class of the task
                bi = self.args.class_per_task*(task_idx+1)   # one past the last class

                ii = 0
                if(len(idx) > 0):
                    sessions.append([task_idx, ii])
                    ii += 1
                    # reset the model's parameters (θ and φ) to the base model's
                    # before this task's adaptation
                    for i, (p, q) in enumerate(zip(model.parameters(), model_base.parameters())):
                        p = copy.deepcopy(q)
brjathu commented 4 years ago

Hi, at the start of each inner-loop update, we move from the model_base parameters to task-specific parameters.

When copying, both θ and φ are copied, but only θ and the part of φ corresponding to that task are updated with gradients in the inner loop.
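
A minimal sketch of such an inner update, assuming the classifier head is a single nn.Linear with one row per class (a simplifying assumption; the repo's actual model may be organized differently). Because the loss only touches this task's logit slice, backpropagation gives gradients to all of θ but, within φ, only to the rows for classes ai..bi:

    import torch.nn.functional as F

    def inner_update(model, optimizer, x, y, ai, bi):
        optimizer.zero_grad()
        logits = model(x)[:, ai:bi]             # only this task's logit slice
        loss = F.cross_entropy(logits, y - ai)  # labels shifted into [0, bi - ai)
        loss.backward()                         # θ gets gradients everywhere; in φ,
                                                # only rows ai:bi touched the loss
        optimizer.step()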

Ereebay commented 4 years ago

"Hi, at the start of each inner-loop update, we move from the model_base parameters to task-specific parameters."

"When copying, both θ and φ are copied, but only θ and the part of φ corresponding to that task are updated with gradients in the inner loop."

So θ isn't updated across all the tasks? Because the code shows that the model copies the base model's parameters for every task.

brjathu commented 4 years ago

No, it will. After finishing a single loop starting from line 122, it replaces model_base with model; see line 121:

        model_base = copy.deepcopy(model)
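
In other words, the per-batch outer loop looks roughly like the sketch below, where tasks_in_batch, reset_to, and inner_update_on_task are hypothetical helpers standing in for the repo's code. Within a batch, the model is reset to model_base per task and adapted; the task-specific parameters are then averaged, and model_base is refreshed from the result, so θ does accumulate across batches:

    import copy
    import torch

    for x, y in loader:                              # one meta-update per batch
        task_params = []
        for t in tasks_in_batch(y):                  # inner loop over tasks in the batch
            reset_to(model, model_base)              # copy base parameters into model
            inner_update_on_task(model, x, y, t)     # updates θ and this task's φ slice
            task_params.append([p.detach().clone() for p in model.parameters()])
        with torch.no_grad():                        # average the task-adapted parameters
            for p, *qs in zip(model.parameters(), *task_params):
                p.copy_(torch.stack(qs).mean(0))
        model_base = copy.deepcopy(model)            # the line-121 step quoted above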
Ereebay commented 4 years ago

"θ is updated in the inner loop for all tasks"

My confusion is about this sentence. I thought θ was updated continuously from the first task to the last, but I'm afraid I misunderstood. Does this mean θ is updated from the base model for every task, not from the previous task?

brjathu commented 4 years ago

"Does this mean theta updated form the base model for every task right? Not the previous task." yes this is correct. sorry for the confusion. Although theta is updated form the base model for every task right, for each batch would be more meaningfull.

Ereebay commented 4 years ago

Thanks again for your patient explanation! That clears up my confusion.