Open PonteIneptique opened 4 years ago
The codebase currently contains the following workaround:

```python
# Because models were not trained with -t- on the pronoun but on the verb, we normalize the input
iterator.tokenizer.replacer = lambda x: x  # Until model is fixed
```

The `replacer` override in the tests should be removed once the model is fixed; the input should be clean.
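For context, a minimal sketch of what the workaround does: the tokenizer normally applies a `replacer` to each token before tagging, and setting it to the identity lambda disables that normalization. The `Tokenizer` class and the `default_replacer` below are illustrative assumptions, not the library's actual API.

```python
# Hypothetical default replacer: moves the epenthetic -t- from the
# pronoun onto the verb (illustrative only, not the real normalization).
def default_replacer(token: str) -> str:
    return token.replace("-t-", "t ")

class Tokenizer:
    def __init__(self, replacer=default_replacer):
        self.replacer = replacer

    def tokenize(self, text: str):
        # Apply the replacer to every whitespace-separated token.
        return [self.replacer(tok) for tok in text.split()]

tok = Tokenizer()
print(tok.tokenize("a-t-il"))   # normalized: ['at il']

tok.replacer = lambda x: x      # the workaround: identity, no normalization
print(tok.tokenize("a-t-il"))   # raw input passes through: ['a-t-il']
```

Once the model is retrained on clean input, the identity override (and this kind of replacer entirely) becomes unnecessary.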