Closed: Kouuh closed this issue 3 years ago.
Thanks for reporting this! I'm not sure I understand the context; could you share some code that reproduces the problem (preferably runnable on Colab)? Also, have you tried setting the environment variable in bash:
export TOKENIZERS_PARALLELISM=true
Or inside Python:
import os
import dl_translate as dlt
os.environ['TOKENIZERS_PARALLELISM'] = 'true'  # or 'false'; the value must be a string
...
mt = dlt.TranslationModel()
...
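For reference, here is a fuller runnable sketch of that approach. The translate call and the dlt.lang constants follow the dl_translate README, but treat the exact text and language pair as illustrative assumptions; the important part is that the environment variable is set to a string before the tokenizer is first used:

import os
os.environ['TOKENIZERS_PARALLELISM'] = 'true'  # or 'false' to disable tokenizer parallelism; set before the tokenizer is used

import dl_translate as dlt

mt = dlt.TranslationModel()  # loads the default translation model (downloads weights on first run)
print(mt.translate("Hello, how are you?", source=dlt.lang.ENGLISH, target=dlt.lang.FRENCH))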
Thank you!
When I execute os.environ['TOKENIZERS_PARALLELISM'] = 'true', it runs.
When I use dl_translate, the following warning appears. How do I set TOKENIZERS_PARALLELISM?
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
	- Avoid using `tokenizers` before the fork if possible
	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)