ByungKwanLee / MoAI

[ECCV 2024] Official PyTorch implementation code for realizing the technical part of Mixture of All Intelligence (MoAI) to improve performance on numerous zero-shot vision-language tasks.
MIT License

About the 'Word Embed', how did you get it? #15

Open cassiaaaaaa opened 5 months ago

cassiaaaaaa commented 5 months ago

Thanks for sharing your great work! I have a question about the paper. It says that "'Word Embed' represents the word embedding dictionary of MLM", which seems to be a fixed module. Where do the weights of Word Embed come from? Are they taken from another MLM, and if so, how do I find them?

ByungKwanLee commented 5 months ago

All language models have a word embedding layer. We use the one from the backbone MLM.
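To illustrate the idea, here is a minimal sketch (not MoAI's actual code) of pulling the word-embedding table out of a language model and reusing it as a frozen lookup module; the toy `ToyLM` class is hypothetical, but the `get_input_embeddings()` accessor mirrors the one real Hugging Face models expose:

```python
import torch
import torch.nn as nn

class ToyLM(nn.Module):
    """Stand-in for a backbone MLM; assumed vocab/hidden sizes."""
    def __init__(self, vocab_size=100, hidden=16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)  # the word embedding dictionary
        self.lm_head = nn.Linear(hidden, vocab_size)

    def get_input_embeddings(self):
        # Same accessor name as transformers' PreTrainedModel
        return self.embed

lm = ToyLM()
word_embed = lm.get_input_embeddings()      # extract the "Word Embed" table
word_embed.weight.requires_grad_(False)     # freeze it: a fixed module

token_ids = torch.tensor([[1, 5, 9]])
vecs = word_embed(token_ids)                # lookup, no training needed
print(vecs.shape)  # torch.Size([1, 3, 16])
```

With a real pretrained model, `AutoModel.from_pretrained(...).get_input_embeddings()` returns the same kind of `nn.Embedding`, so no separate download of "Word Embed" weights is needed; they ship inside the backbone MLM checkpoint.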