Closed: Gasp34 closed this issue 2 years ago
Hi,
Thank you, we're glad you found the paper interesting.
With the current training procedure, you can either train multiple heads, one for each dataset, or concatenate them into a single label space as you said (a rough sketch of both options is below).
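To make the two options concrete, here is a minimal PyTorch sketch, not taken from this repo, assuming 6 standard datasets that each yield (input, label) pairs with labels in {0, ..., 4}. All names here (`OffsetLabels`, `build_joint_loader`, `MultiHeadClassifier`) are hypothetical helpers for illustration only.

```python
import torch
import torch.nn as nn
from torch.utils.data import ConcatDataset, DataLoader, Dataset


# Option 1: concatenate the datasets into a single 30-way problem by
# shifting each dataset's labels into its own block of 5 classes.
class OffsetLabels(Dataset):
    def __init__(self, base, offset):
        self.base, self.offset = base, offset

    def __len__(self):
        return len(self.base)

    def __getitem__(self, idx):
        x, y = self.base[idx]
        return x, y + self.offset


def build_joint_loader(datasets, classes_per_dataset=5, batch_size=128):
    shifted = [OffsetLabels(d, i * classes_per_dataset)
               for i, d in enumerate(datasets)]
    joint = ConcatDataset(shifted)  # labels now span {0, ..., 29}
    return DataLoader(joint, batch_size=batch_size, shuffle=True)


# Option 2: one shared backbone with a separate 5-way head per dataset;
# each sample's loss is computed with the head of the dataset it came from.
class MultiHeadClassifier(nn.Module):
    def __init__(self, backbone, feat_dim, num_datasets=6, classes_per_dataset=5):
        super().__init__()
        self.backbone = backbone
        self.heads = nn.ModuleList(
            [nn.Linear(feat_dim, classes_per_dataset) for _ in range(num_datasets)]
        )

    def forward(self, x, dataset_id):
        features = self.backbone(x)
        return self.heads[dataset_id](features)
```

With option 1 you train a plain 30-way classifier on batches mixed from all datasets; with option 2 you keep each dataset's 5-way output space but share the feature extractor.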
We're working on a new version of the repo which will include this feature.
I hope this answers your question, and sorry for the late response.
Best,
Hi,
Great paper! I have a question regarding the training procedure with multiple datasets. Say I have 6 datasets with 5 classes each. With the old training procedure of MAML or ProtoNet, where the model is trained on episodes and tasks, this seems simple enough, since each dataset would be a task. With your training procedure (which seems to be used in all recent papers), how would you do it? Concatenate all the classes to get 30 classes and train with batches made of samples from all datasets?
Thanks a lot if you have the time to answer :)