Open kleag opened 5 months ago
Temporary solution in b462061: creating a new instance of the tokenizer for each file. This is not too costly, since tokenization models are very small, but it cannot count as a real fix. And in fact, the tagger now crashes on the second file.
**Describe the bug**
When deeplima is asked to analyze several files, the first one is analyzed correctly, but then the program stalls.
**To Reproduce**
Steps to reproduce the behavior:
`deeplima --tok-model ~/.local/share/lima/resources/RnnTokenizer/ud/tokenizer-eng-UD_English-EWT.pt test-eng*.txt`
The stall happens in `segmentation_impl.cpp`, here: https://github.com/aymara/lima/blob/7a33899a4c731bacd7aa628d4750bca1c5160a8c/deeplima/libs/tasks/segmentation/inference/segmentation_impl.cpp#L156

**Expected behavior**
All files should be analyzed successfully.
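The failure pattern behind the workaround in b462061 can be sketched as follows. This is a minimal illustration, not deeplima's actual code: `MockTokenizer`, its `pending` buffer, and `analyze_file` are hypothetical names, assuming the root cause is internal tokenizer state that survives from one file to the next and is never reset.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical stand-in for a stateful tokenizer: leftover state from a
// previous file leaks into the next analyze() call unless the object is
// recreated (this is an assumed failure mode, not deeplima's real class).
struct MockTokenizer {
    std::vector<std::string> pending;  // stale carry-over from the last file

    std::vector<std::string> analyze(const std::string& text) {
        // Bug pattern: start from whatever the previous run left behind.
        std::vector<std::string> tokens = pending;
        std::string cur;
        for (char c : text) {
            if (c == ' ') {
                if (!cur.empty()) tokens.push_back(cur);
                cur.clear();
            } else {
                cur += c;
            }
        }
        if (!cur.empty()) pending = {cur};  // buffered but never flushed
        return tokens;
    }
};

// The b462061 workaround, sketched: build a fresh tokenizer per file so no
// state can leak between files (cheap here because the model is small).
std::vector<std::string> analyze_file(const std::string& text) {
    MockTokenizer fresh;                      // new instance for each file
    return fresh.analyze(text + " ");         // trailing space flushes the tail
}
```

With a single reused instance, the second file's output is polluted by the first file's unflushed token; recreating the instance per file avoids that, which matches the observation that only the first file is analyzed correctly. The proper fix would instead reset (or correctly flush) the tokenizer's internal state between files.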