Closed suryapa1 closed 4 years ago
Why is this functionality disabled? `UserWarning: use_multiprocessing automatically disabled as xlmroberta fails when using multiprocessing for feature conversion.`
The warning states why it's disabled: using multiprocessing causes an error with xlmroberta for some reason.
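If the warning is noisy, you can set the option explicitly so the library doesn't have to override it. This is only a config fragment, a minimal sketch assuming the usual Simple Transformers pattern of passing an args dict to the model constructor; check your installed version for the exact option names.

```python
# Config fragment only. The key names follow the Simple Transformers
# model-args convention; `use_multiprocessing` is the option named in
# the warning above.
model_args = {
    # Already forced off for xlmroberta, but setting it explicitly
    # documents the intent and avoids relying on the automatic override.
    "use_multiprocessing": False,
    # Evaluation-side feature conversion has a separate switch.
    "use_multiprocessing_for_evaluation": False,
}

# The dict would then be passed as `args=model_args` when constructing
# a ClassificationModel (not shown here, since it needs the library).
```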
The training time depends on many factors: the size of the model, the type of the model, the GPU you are using, and so on. So I can't say why it's taking 20+ hours, or even whether that's unexpected.
The model I am using is xlm-roberta-base with 17000 documents for now and 17 labels. The GPU is a P2 instance with 16 GB of GPU memory. Any suggestions on the model type, since the model has to support English and German?
Does xlm-roberta support TPUs? If so, how do I enable TPU training?
Do you have very long documents? A 17000-sample dataset shouldn't take that long to train.
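A quick sanity check on the numbers can help here. This is just back-of-the-envelope arithmetic (the batch size of 8 is an assumption, not from the thread): with 17000 samples, one epoch is only a couple of thousand optimizer steps, so 20+ hours suggests something else (very long sequences, a tiny batch size, or CPU-bound preprocessing) is dominating.

```python
import math

def steps_per_epoch(num_samples: int, batch_size: int) -> int:
    """Optimizer steps per epoch, assuming no gradient accumulation."""
    return math.ceil(num_samples / batch_size)

# 17000 samples at an assumed batch size of 8:
print(steps_per_epoch(17000, 8))  # -> 2125 steps per epoch
```

Even at a pessimistic one second per step, that is well under an hour per epoch on a modern GPU.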
Simple Transformers doesn't have native TPU support yet.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Describe the bug
Even for 100 documents, training takes much longer than expected, 20+ hours with xlm-roberta-large as the model_type. Is there any workaround for this?
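The usual levers for cutting training time are a shorter maximum sequence length, a larger batch size, and mixed precision. The fragment below is a sketch of such settings using the Simple Transformers model-args naming convention; the specific values are illustrative assumptions, and availability of each option depends on the installed version.

```python
# Config fragment only; pass as `args=...` to ClassificationModel.
speedup_args = {
    "max_seq_length": 128,             # truncate long documents earlier
    "train_batch_size": 32,            # larger batches if GPU memory allows
    "fp16": True,                      # mixed precision on supported GPUs
    "gradient_accumulation_steps": 1,  # raise this instead if memory is tight
}
```

Note that xlm-roberta-large is several times slower per step than xlm-roberta-base, so switching model_name may be the single biggest saving if base-level accuracy is acceptable.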