
Tokenization

Our tokenizer takes raw text, splits it into tokens based on their morphological aspects, and groups the tokens into sentences. It is based on the LDC tokenizer used to create the English Treebanks, but applies more robust heuristics. Here are some key features of our tokenizer.

API

TokenizerDemo shows how to use the tokenizer through the API.
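
The snippet below is a minimal sketch of that usage. The package path (edu.emory.mathcs.nlp.tokenization), the class names (Tokenizer, EnglishTokenizer, Token), and the tokenize/segmentize signatures are assumptions based on the NLP4J codebase; consult TokenizerDemo for the authoritative API.

```java
import java.util.List;

import edu.emory.mathcs.nlp.tokenization.EnglishTokenizer;
import edu.emory.mathcs.nlp.tokenization.Token;
import edu.emory.mathcs.nlp.tokenization.Tokenizer;

public class TokenizerExample
{
    public static void main(String[] args)
    {
        // The English tokenizer applies the Treebank-style heuristics described above.
        Tokenizer tokenizer = new EnglishTokenizer();
        String text = "Mr. Smith doesn't like the rain. He stays home.";

        // Split the raw text into tokens (assumed signature: tokenize(String)).
        List<Token> tokens = tokenizer.tokenize(text);
        System.out.println(tokens);

        // Group the tokens into sentences; each inner list is one sentence
        // (assumed signature: segmentize(String)).
        List<List<Token>> sentences = tokenizer.segmentize(text);
        System.out.println("Number of sentences: " + sentences.size());
    }
}
```

See TokenizerDemo in this repository for the exact class and method names if they differ from the sketch above.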