alpheios-project / tokenizer

Alpheios Tokenizer Service

Spacy update #41

Closed: irina060981 closed this 3 years ago

irina060981 commented 3 years ago

This addresses https://github.com/alpheios-project/tokenizer/issues/40

Language.Defaults was changed in the new spaCy version, which is why I deleted

    lex_attr_getters[NORM] = add_lookups(
        Language.Defaults.lex_attr_getters[NORM], BASE_NORMS
    )

because Language.Defaults.lex_attr_getters is empty in the new version.
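
For context, a minimal sketch (assuming spaCy 3.x is the "new version" referred to here) showing why the old registration no longer works:

    # Minimal sketch, assuming spaCy 3.x. In 3.x, Language.Defaults.lex_attr_getters
    # defaults to an empty dict, so indexing it with NORM as in the removed code
    # would raise a KeyError.
    from spacy.language import Language

    print(Language.Defaults.lex_attr_getters)  # -> {}

In spaCy 3.x the norm exception data is no longer wired up through lex_attr_getters, so dropping this block appears to be the expected migration step.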