I found that the `bleu_transformer` module is executed simply when `texar-pytorch` is imported, which is likely due to the following line: https://github.com/asyml/texar-pytorch/blob/507932c899ca3a8663479b31efc3a41bc7180693/texar/torch/evals/bleu_transformer.py#L159
The instance creation is done at the module level, and I didn't find any usage of this instance elsewhere. I guess the reason for this statement is to avoid creating a `uregex` every time (which iterates over the whole unicode space). But having this variable here slows down the texar import.

A simple timing test (`start=timeit.default_timer(); import texar.torch; print(timeit.default_timer()-start)`):

With `uregex`: 6.359622538 s. Thus this line somehow doubles the import time.
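The one-liner above can be wrapped into a small repeatable harness. A minimal sketch (the helper name `time_fresh_import` is mine, and a stdlib module is used as the target since `texar` may not be installed):

```python
import importlib
import sys
import timeit

def time_fresh_import(module_name: str) -> float:
    """Time an import with the module's cache entry evicted, so its own
    top-level statements actually re-run (submodules may still be cached)."""
    sys.modules.pop(module_name, None)
    start = timeit.default_timer()
    importlib.import_module(module_name)
    return timeit.default_timer() - start

# Example: measure a fresh top-level import of a stdlib module.
elapsed = time_fresh_import("json")
print(f"import json took {elapsed:.6f}s")
```

Evicting the `sys.modules` entry matters: without it, a second `import` is a cache hit and the module-level `uregex` creation would not be measured.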
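A common remedy for this kind of import-time cost is to build the expensive object lazily on first use and cache it, rather than at module level. A minimal sketch, not texar's actual code (the names `_get_uregex` and `tokenize`, and the simplified pattern, are hypothetical stand-ins):

```python
import re
from functools import lru_cache

@lru_cache(maxsize=1)
def _get_uregex():
    """Compile the regex on first call only, then reuse the cached object.

    Stand-in for texar's unicode-wide pattern; the real construction
    iterates over the whole unicode space, which is the expensive part.
    """
    # Hypothetical simplified pattern for illustration.
    return re.compile(r"(\W)")

def tokenize(text: str) -> list:
    # The regex is built on the first tokenize() call, not at import time,
    # so importing this module stays cheap.
    return [t for t in _get_uregex().split(text) if t.strip()]
```

With this structure, callers that never invoke `tokenize` pay nothing, and repeated calls still compile the pattern exactly once.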