LTH14 / mage

A PyTorch implementation of MAGE: MAsked Generative Encoder to Unify Representation Learning and Image Synthesis
MIT License

Is the code for contrastive learning included in the current codebase? #5

Open gaopengpjlab opened 1 year ago

gaopengpjlab commented 1 year ago

As the title says, I cannot find the contrastive learning loss in your codebase.

LTH14 commented 1 year ago

I just updated the README to include pre-trained MAGE-C checkpoints converted from JAX. We are still wrapping up the PyTorch code for the contrastive loss. We will endeavor to release that part soon.
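While the official PyTorch port of the contrastive loss is not yet released, the general idea behind a SimCLR-style contrastive objective (as used by MAGE-C, per the paper) can be sketched as an InfoNCE loss over two augmented views. This is an illustrative sketch, not the repository's implementation; the function name, temperature value, and batch layout are assumptions:

```python
import torch
import torch.nn.functional as F

def info_nce_loss(z1, z2, temperature=0.2):
    """SimCLR-style InfoNCE loss between two batches of embeddings.

    z1, z2: (N, D) projections of two augmented views of the same images.
    Note: illustrative sketch only; not the MAGE-C implementation.
    """
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    n = z1.size(0)
    z = torch.cat([z1, z2], dim=0)        # (2N, D) stacked views
    sim = z @ z.t() / temperature         # (2N, 2N) scaled cosine similarities
    sim.fill_diagonal_(float('-inf'))     # exclude each sample's self-similarity
    # The positive for row i is its other view: i + n for the first half,
    # i - n for the second half.
    targets = torch.cat([torch.arange(n) + n, torch.arange(n)])
    return F.cross_entropy(sim, targets)
```

In MAGE-C this kind of loss is applied on top of the encoder output alongside the masked-token reconstruction loss; the exact projection head and weighting are described in the paper.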

gaopengpjlab commented 1 year ago

Can you provide a performance comparison between models trained with MIM alone and with MIM+CL?

LTH14 commented 1 year ago

Please refer to this paper for that comparison. We also compare MAGE and MAGE-C in Tables 2 and 13 of our paper.

densechen commented 2 months ago

@LTH14 Is MAGE-C ready?