Closed libo-huang closed 2 years ago
Thank you for bringing the GFR method to our attention. After carefully reading the GFR paper, we noticed that "Ours Gaussian" in GFR also uses class prototypes: for each old class, it saves the class mean and covariance matrix for replay. However, memorizing a covariance matrix for each class suffers from memory limitations when the dimensionality of the feature space is large.
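To make the memory argument concrete, here is a small back-of-the-envelope sketch (the dimensionality and class count are illustrative assumptions, not numbers from either paper): storing a full covariance per class grows quadratically in the feature dimension, while storing only class means grows linearly.

```python
# Illustrative comparison of per-class statistics storage (float counts).
# d and n_classes are assumed values, not taken from the PASS or GFR papers.
d = 512          # feature dimensionality, e.g. a ResNet-18 embedding
n_classes = 100  # number of old classes to remember

mean_floats = n_classes * d       # class means only: O(C * d)
cov_floats = n_classes * d * d    # plus a full covariance per class: O(C * d^2)

print(mean_floats)  # 51200
print(cov_floats)   # 26214400, i.e. 512x more storage
```

So at d = 512 the covariance matrices alone cost 512 times more memory than the means, which is the limitation mentioned above.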
In PASS, by contrast, only the class means are memorized, and all of them are augmented with the same radius $r$, which is different from GFR. Secondly, since we do not update the saved class means, the prototypes would become less representative as the feature extractor is updated during incremental learning. To deal with this problem, we use SSL to learn generic and transferable representations, which reduces representation shift and maintains the effectiveness of the old prototypes. Therefore, SSL is complementary to prototype augmentation in our method, and their combination has proven effective.
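A minimal sketch of the prototype augmentation described above (the function name and shapes are illustrative, not the authors' released code): each saved class mean is perturbed by Gaussian noise with a single shared scale $r$, rather than a per-class covariance as in GFR.

```python
import numpy as np

def augment_prototypes(class_means, r, rng):
    """Perturb saved class means with isotropic Gaussian noise.

    class_means: (C, d) array of memorized prototypes, one row per old class.
    r: shared augmentation radius (the same scalar for every class).
    """
    noise = rng.standard_normal(class_means.shape)
    return class_means + r * noise

# Usage sketch with assumed sizes: 10 old classes, 512-dim features.
rng = np.random.default_rng(0)
prototypes = rng.standard_normal((10, 512))
augmented = augment_prototypes(prototypes, r=0.1, rng=rng)
print(augmented.shape)  # (10, 512)
```

Because the radius is shared across classes, only the $C \times d$ means need to be stored, in contrast to sampling from a per-class Gaussian with its own covariance.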
The use of prototypes in PASS takes inspiration from CPN [Yang et al., CVPR 2018, TPAMI 2020] and SDC [Yu et al., CVPR 2020]. Since we found it difficult to train the embedding network in SDC at the time, we focused on how to leverage prototypes in a softmax-based network. Thanks to your kind reminder, we notice that "Ours Gaussian" in GFR and our method share a similar motivation in leveraging prototypes for CIL, though there are also differences. Thank you again for pointing this out.
Is PASS similar to the comparison method ("Ours Gaussian") in the GFR (generative feature replay) work, even though SSL is not used in GFR? Many thanks.