EmilyAlsentzer / clinicalBERT

Repository for Publicly Available Clinical BERT Embeddings
MIT License

Known issue with section splitting in heuristic_tokenize.py #4

Open · EmilyAlsentzer opened this issue 5 years ago

EmilyAlsentzer commented 5 years ago

There are two bugs in the `sent_tokenize_rules` function in `heuristic_tokenize.py`.

We have not fixed them in this repo because we want the code to remain reproducible exactly as it was when the work was published. However, anyone extending this work should make the following changes in `heuristic_tokenize.py`:

  1. Fix a bug on line 168, where `.` should be replaced with `\.`; i.e., the line should read `while re.search('\n\s*%d\.'%n,segment):` (see the sketch after this list).
  2. Add an else clause, `else: new_segments.append(segments[i])`, to the if statement on line 287, `if (i == N-1) or is_title(segments[i+1]):`. This fixes a bug where lists that have a title header lose their first entry (also covered in the sketch below).
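
For anyone applying these patches, here is a minimal, self-contained sketch of both fixes. It is an illustration, not the repo's actual code: the sample `segment` text, the toy `is_title` predicate, and the simplified loop body are invented for demonstration; only the patched regex and the added else branch correspond to the fixes above. Raw strings are used here in place of the repo's plain string literals; the patterns are equivalent.

```python
import re

# --- Fix 1: escape the period in the list-number pattern -----------------
# In a regex, an unescaped '.' matches any character, so the original
# pattern also matched markers like "2)" that are not numbered with a dot.
segment = "Plan:\n1. start antibiotics\n2) follow up in clinic"  # made-up text
n = 2

print(bool(re.search(r'\n\s*%d.' % n, segment)))   # True: false positive on "2)"
print(bool(re.search(r'\n\s*%d\.' % n, segment)))  # False: only a literal "2." matches

# --- Fix 2: add the else branch around line 287 --------------------------
def collect_segments(segments, is_title):
    """Toy stand-in for the loop in sent_tokenize_rules.

    Only the if/else shape mirrors the real code; the branch bodies here
    are simplified for illustration.
    """
    N = len(segments)
    new_segments = []
    for i in range(N):
        if (i == N - 1) or is_title(segments[i + 1]):
            new_segments.append(segments[i])  # pre-existing path
        else:
            # The added else branch. Without it, this case fell through and
            # segments[i] was dropped, which is how a list under a title
            # header lost its first entry.
            new_segments.append(segments[i])
    return new_segments

# With the else branch, "1. chest pain" is preserved.
print(collect_segments(
    ["HPI:", "1. chest pain", "2. shortness of breath"],
    is_title=lambda s: s.endswith(":"),
))
```

In the actual repo the two branches do different work; the point of the toy loop is only that the fall-through case needs an explicit append so no segment is silently discarded.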