Closed: ledsoft closed this issue 1 day ago
StanfordNLP, which we use for tokenization and other NLP tasks, has high memory requirements. We need to optimize it or replace it with a less memory-intensive alternative.
Disabled some of the annotators. The memory footprint has been significantly reduced without affecting functionality.
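The comment does not record which annotators were disabled. Assuming the pipeline is Stanford CoreNLP configured via a properties file, a minimal sketch of a reduced configuration might look like the following (the annotator names are standard CoreNLP ones; the exact set kept in this fix is an assumption):

```properties
# Hypothetical reduced CoreNLP pipeline: keep only tokenization and
# sentence splitting, and drop memory-heavy annotators such as
# parse, ner, and coref. The actual set disabled in this issue is
# not recorded in the thread.
annotators = tokenize, ssplit
```

If the pipeline runs as a server, the JVM heap can additionally be capped at launch (e.g. `java -mx1g ... edu.stanford.nlp.pipeline.StanfordCoreNLPServer`), which bounds memory independently of the annotator set.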