facebookresearch / mmbt

Supervised Multimodal Bitransformers for Classifying Images and Text

Pretrained models? #5

Closed harkiratbehl closed 3 years ago

harkiratbehl commented 3 years ago

Hi,

Could you please provide pretrained weights for the different models/baselines used in the paper?

douwekiela commented 3 years ago

Hi, MMBT is not a pretraining architecture - the underlying pretrained models are just, e.g., BERT and ResNet. If you're asking for the models finetuned on the individual tasks, unfortunately we don't have those lying around anymore. It should be relatively straightforward to train them yourself with the scripts in this repository (or check out the much better MMF codebase: https://github.com/facebookresearch/mmf), since the datasets are not super huge.
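
For readers landing here: below is a minimal conceptual sketch of the idea described above - pretrained unimodal encoders (BERT for text, a ResNet for images) assembled into a single multimodal classifier, with image features projected into BERT's token-embedding space. This is not the repo's exact code; the class names, the pooling scheme, and the projection layer are illustrative assumptions (see `mmbt/models/` in this repo or MMF for the real implementation).

```python
# Conceptual MMBT-style sketch (illustrative, not the repo's code):
# ResNet feature maps are pooled into a few embeddings per image,
# projected to BERT's hidden size, and consumed as extra "tokens".
import torch
import torch.nn as nn
import torchvision.models as models
from transformers import BertModel


class ImageEncoder(nn.Module):
    """Pools ResNet feature maps into num_image_embeds embeddings per image."""

    def __init__(self, num_image_embeds=3):
        super().__init__()
        resnet = models.resnet152(weights=models.ResNet152_Weights.DEFAULT)
        # Drop the final avgpool and fc layers; keep the conv backbone.
        self.backbone = nn.Sequential(*list(resnet.children())[:-2])
        self.pool = nn.AdaptiveAvgPool2d((num_image_embeds, 1))

    def forward(self, x):                        # x: (B, 3, H, W)
        feats = self.backbone(x)                 # (B, 2048, h, w)
        feats = self.pool(feats)                 # (B, 2048, num_image_embeds, 1)
        return feats.flatten(2).transpose(1, 2)  # (B, num_image_embeds, 2048)


class MMBTClassifier(nn.Module):
    def __init__(self, num_labels, num_image_embeds=3):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        self.image_encoder = ImageEncoder(num_image_embeds)
        hidden = self.bert.config.hidden_size
        self.img_proj = nn.Linear(2048, hidden)  # map ResNet dim to BERT dim
        self.classifier = nn.Linear(hidden, num_labels)

    def forward(self, input_ids, attention_mask, image):
        # Look up word embeddings directly so we can prepend image "tokens".
        txt_embeds = self.bert.embeddings.word_embeddings(input_ids)
        img_embeds = self.img_proj(self.image_encoder(image))
        inputs = torch.cat([img_embeds, txt_embeds], dim=1)
        img_mask = torch.ones(
            img_embeds.shape[:2],
            device=input_ids.device,
            dtype=attention_mask.dtype,
        )
        mask = torch.cat([img_mask, attention_mask], dim=1)
        out = self.bert(inputs_embeds=inputs, attention_mask=mask)
        # Classify from the first position's hidden state.
        return self.classifier(out.last_hidden_state[:, 0])
```

The paper's model additionally handles segment/position embeddings for the image tokens and supports staged unfreezing of the text and image encoders; those details are omitted here for brevity.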