Closed mmiakashs closed 3 years ago
Thanks for reporting the issue. We are looking into it.
hi,
I think this is happening because of a typo in the experiment config in the tutorial: the model key under `model_config` should be `concat_bert`.
Can you try with this experiment config:
```yaml
includes:
- configs/datasets/hateful_memes/bert.yaml

model_config:
  concat_bert:
    classifier:
      type: mlp
      params:
        num_layers: 2
    losses:
    - type: cross_entropy

scheduler:
  type: warmup_linear
  params:
    num_warmup_steps: 2000
    num_training_steps: ${training.max_updates}

optimizer:
  type: adam_w
  params:
    lr: 5e-5
    eps: 1e-8

evaluation:
  metrics:
  - accuracy
  - binary_f1
  - roc_auc

training:
  batch_size: 64
  lr_scheduler: true
  max_updates: 22000
  early_stop:
    criteria: hateful_memes/roc_auc
    minimize: false
```
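For reference, a training run with this config would be launched via MMF's `mmf_run` CLI roughly as below. The config path (`configs/experiments/concat_bert.yaml`) is a placeholder for wherever you save the experiment config above; adjust it to your setup.

```shell
# Train ConcatBERT on Hateful Memes with the experiment config above.
# NOTE: the config= path is hypothetical — point it at your saved file.
mmf_run config=configs/experiments/concat_bert.yaml \
    model=concat_bert \
    dataset=hateful_memes
```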
We will open a PR to fix this in the docs.
@apsdehal Thanks, it is working perfectly now. Sorry, I should have noticed that.
I was trying to train the ConcatBERT model from the following tutorial: https://mmf.sh/docs/tutorials/concat_bert However, I am getting the following errors. Could anyone please let me know if I am missing anything?