Closed: congchan closed this issue 4 years ago
🐛 Bug

The config hierarchy: the config the user passes on the command line, `config=/mmf/projects/visual_bert/configs/hateful_memes/from_coco.yaml`, is:

    includes:
    - ./defaults.yaml

    checkpoint:
      resume_pretrained: true
      resume_zoo: visual_bert.pretrained.coco

The included `./defaults.yaml` contains:

    dataset_config:
      hateful_memes:
        return_features_info: true
        processors:
          text_processor:
            type: bert_tokenizer
            params:
              tokenizer_config:
                type: bert-base-uncased
                params:
                  do_lower_case: true
              mask_probability: 0
              max_seq_length: 128
I expected mmf to override the config and call `BertTokenizer`. However, the program still called the `vocab` type processor `VocabProcessor`, which is defined in `mmf/configs/datasets/hateful_memes/defaults.yaml`.
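For reference, here is a minimal sketch of the override behaviour I expected, written with OmegaConf (which MMF's configuration system builds on). It is illustrative only, not MMF's actual loading code:

```python
# Minimal sketch of the expected override semantics using OmegaConf.
# Illustrative only: MMF's real config loading also resolves `includes`,
# zoo defaults, command-line overrides, etc.
from omegaconf import OmegaConf

# Stand-in for mmf/configs/datasets/hateful_memes/defaults.yaml (base defaults).
dataset_defaults = OmegaConf.create(
    {"dataset_config": {"hateful_memes": {"processors": {"text_processor": {"type": "vocab"}}}}}
)

# Stand-in for the project config passed via `config=...` on the command line.
user_config = OmegaConf.create(
    {"dataset_config": {"hateful_memes": {"processors": {"text_processor": {"type": "bert_tokenizer"}}}}}
)

# Later configs win on merge, so the user config should override the default.
merged = OmegaConf.merge(dataset_defaults, user_config)
print(merged.dataset_config.hateful_memes.processors.text_processor.type)
# expected output: bert_tokenizer
```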
Command

    mmf_run config=projects/hateful_memes/configs/visual_bert/from_coco.yaml \
        model=visual_bert \
        dataset=hateful_memes
To Reproduce

Steps to reproduce the behavior:
Expected behavior

I expected mmf would override the config and call `BertTokenizer`.

Environment

Please copy and paste the output from the environment collection script from PyTorch (or fill out the checklist below manually).

You can run the script with:

    # For security purposes, please check the contents of collect_env.py before running it.
    python -m torch.utils.collect_env

- How you installed PyTorch (conda, pip, source):
I just ran the same command and I don't see this issue. Can you provide any logs that show `BertTokenizer` is not getting called?
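One hypothetical way to gather that evidence (not from the thread; it assumes `mmf.utils.env.setup_imports` and `registry.get_processor_class` are available, as in recent MMF versions) is to print which processor class MMF resolves for each `type` string:

```python
# Hypothetical check, not from the thread. Assumes mmf is installed and that
# setup_imports() and registry.get_processor_class() exist as in recent MMF.
from mmf.common.registry import registry
from mmf.utils.env import setup_imports

setup_imports()  # import MMF modules so all processors get registered

# The processor the config asks for vs. the one the issue says is being used.
print(registry.get_processor_class("bert_tokenizer"))
print(registry.get_processor_class("vocab"))
```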
I fixed it, I had forgotten to specify the `tokenizer_config`. Thanks anyway.