ufal / udpipe

UDPipe: Trainable pipeline for tokenizing, tagging, lemmatizing and parsing Universal Treebanks and other CoNLL-U files
Mozilla Public License 2.0

Error "Should encode value 65536 in one byte" #118

Open svetlana21 opened 4 years ago

svetlana21 commented 4 years ago

Hello! I stumbled upon this error while training the tagger on part of the Taiga corpus of Russian (~1 GB of text): "An error occurred during model training: Should encode value 65536 in one byte!"

The quick question is: does udpipe have some vocabulary size limitations?

The full story is: I know about issue https://github.com/ufal/udpipe/issues/53 and tried everything suggested there (I don't have tokens longer than 255 bytes; I don't have dubious lemmas, since the maximum number of forms for one lemma in my corpus is 158, due to the rich morphology of the language; and I set guesser_enrich_dictionary to 1). I also removed all sentences longer than 255 tokens. But I still get this error. The only thing that helped was reducing the corpus size to ~750 MB; at ~800 MB the error still occurs. I guessed the problem was in some specific sentences (the difference between the two corpora), so I tried training the tagger on just the diff of the two corpora and did not get the error. So, does UDPipe have some vocabulary size limitations? Or maybe there is some less obvious cause of the problem?
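The per-sentence and per-token checks described above can be automated. Below is a minimal sketch (not part of UDPipe, and the function name and path are made up for illustration) that scans a CoNLL-U file for the two limits mentioned: tokens whose FORM is longer than 255 bytes and sentences longer than 255 tokens.

```python
# Hypothetical diagnostic script, not part of UDPipe: report tokens
# longer than 255 bytes and sentences longer than 255 tokens in a
# CoNLL-U file. The path passed in is up to the caller.

def scan_limits(path):
    long_tokens, long_sents, sent_len = [], [], 0
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.rstrip("\n")
            if not line:                            # blank line ends a sentence
                if sent_len > 255:
                    long_sents.append(sent_len)
                sent_len = 0
                continue
            if line.startswith("#"):                # comment / metadata line
                continue
            cols = line.split("\t")
            if "-" in cols[0] or "." in cols[0]:    # skip multiword ranges, empty nodes
                continue
            sent_len += 1
            if len(cols[1].encode("utf-8")) > 255:  # FORM length measured in bytes
                long_tokens.append(cols[1])
    if sent_len > 255:                              # last sentence, no trailing blank line
        long_sents.append(sent_len)
    return long_tokens, long_sents
```

Anything these lists report would have to be removed or split before training; an empty result means the problem lies elsewhere.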

foxik commented 4 years ago

UDPipe unfortunately has several such limitations, stemming from the fact that we use a morphological library to represent the morphological vocabulary; and when such an error occurs, you do not get an informative message. However, you have already tried most of the known causes, so I am not sure what the problem could be. If the texts are public, you can send me a download link and I will have a look.

The new version being prepared will not have any such limitations, if it is any consolation ;-)

jwijffels commented 4 years ago

@svetlana21 It might be the same error I had in https://github.com/ufal/udpipe/issues/130. After some debugging, it turned out I had a lemma with too many possible word forms.
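That cause is also easy to check for before training. A minimal sketch (the function name and the threshold of 255 distinct forms are assumptions for illustration, not a documented UDPipe limit) that counts distinct word forms per lemma in a CoNLL-U file:

```python
# Hypothetical check, not part of UDPipe: list lemmas whose number of
# distinct word forms exceeds a threshold, the situation described in
# issue #130.
from collections import defaultdict

def suspicious_lemmas(path, threshold=255):
    forms = defaultdict(set)
    with open(path, encoding="utf-8") as f:
        for line in f:
            if not line.strip() or line.startswith("#"):
                continue                       # skip blanks and comments
            cols = line.rstrip("\n").split("\t")
            if len(cols) != 10 or not cols[0].isdigit():
                continue                       # skip malformed rows, ranges, empty nodes
            forms[cols[2]].add(cols[1])        # LEMMA -> set of FORMs
    return sorted(((lemma, len(fs)) for lemma, fs in forms.items()
                   if len(fs) > threshold),
                  key=lambda item: -item[1])
```

If this reports any lemmas, inspecting them (they are often annotation errors, e.g. many unrelated tokens lemmatized to "_" or to a punctuation mark) usually reveals the culprit.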