linjieli222 / HERO

Research code for EMNLP 2020 paper "HERO: Hierarchical Encoder for Video+Language Omni-representation Pre-training"
https://arxiv.org/abs/2005.00200
MIT License

One issue about pretrain-tv-init.bin file #16

Closed Unified-Robots closed 3 years ago

Unified-Robots commented 3 years ago

I'm training the HERO model on the provided TV dataset. The model needs to pre-load "pretrain-tv-init.bin" to initialize the network, but I don't know how to obtain this file myself, so I modified the code to skip it. After pre-training finishes, however, the performance is really poor. Is "pretrain-tv-init.bin" essential? If so, how can we produce this file from scratch?
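For context, the initialization the question refers to amounts to copying weights from a checkpoint into the model before pre-training starts, with any parameters missing from the checkpoint (e.g. newly added heads) falling back to fresh initialization. The sketch below illustrates that pattern with a plain dict-of-lists parameter store; the function and key names are illustrative and are not HERO's actual loading code.

```python
import random


def init_from_checkpoint(model_params, checkpoint):
    """Copy matching tensors from a checkpoint into the model's parameters;
    randomly initialize anything the checkpoint does not cover.

    Illustrative sketch only: parameters are modeled as lists of floats,
    not real tensors, and the names are hypothetical.
    """
    loaded, missing = [], []
    for name, values in model_params.items():
        if name in checkpoint:
            # Found in the init file (e.g. pretrain-tv-init.bin): reuse it.
            model_params[name] = list(checkpoint[name])
            loaded.append(name)
        else:
            # Not in the checkpoint: small random init as a fallback.
            model_params[name] = [random.gauss(0.0, 0.02) for _ in values]
            missing.append(name)
    return loaded, missing


# Hypothetical model with one encoder weight and one task head.
model = {"encoder.weight": [0.0] * 4, "head.weight": [0.0] * 2}
ckpt = {"encoder.weight": [0.1, 0.2, 0.3, 0.4]}

loaded, missing = init_from_checkpoint(model, ckpt)
print(loaded)   # ['encoder.weight']
print(missing)  # ['head.weight']
```

Skipping this step (as the question describes) leaves every parameter at its fallback init, which plausibly explains the poor results after pre-training.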

linjieli222 commented 3 years ago

https://github.com/linjieli222/HERO/blob/faaf15d6ccc3aa4accd24643d77d75699e9d7fae/scripts/download_tv_pretrain.sh#L37

Unified-Robots commented 3 years ago

@linjieli222

https://github.com/linjieli222/HERO/blob/faaf15d6ccc3aa4accd24643d77d75699e9d7fae/scripts/download_tv_pretrain.sh#L37

I know you provide this file, but what should we do to obtain it entirely on our own?

linjieli222 commented 3 years ago

We obtained the RoBERTa pre-trained weights on TVR dataset from the original authors.

linjieli222 commented 3 years ago

Closed due to inactivity.