When processing a large text file (>= 1 GB of text), a connection error may occur and stop the program. The following retry loop can be used to work around this error:
import time

for _ in range(5):
    try:
        # Process the data:
        annotated_text = annotator.annotate(text)
        word_segmented_text = annotator.tokenize(text)
        break
    except Exception:
        # If a connection error interrupts processing, wait and retry:
        print("Retry in 5 seconds")
        time.sleep(5)
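The same retry pattern can be factored into a small reusable helper. This is a sketch, not part of the library's API: `retry` and the `flaky` stand-in function (which simulates an annotator call that fails twice before succeeding) are illustrative names introduced here.

```python
import time

def retry(func, attempts=5, delay=0.01):
    """Call func(); on ConnectionError, wait `delay` seconds and try again,
    up to `attempts` times. Re-raises the error if all attempts fail."""
    for attempt in range(attempts):
        try:
            return func()
        except ConnectionError:
            if attempt == attempts - 1:
                raise
            print(f"Retry in {delay} seconds")
            time.sleep(delay)

# Demo with a stand-in for annotator.annotate(text):
# it fails twice with a simulated connection error, then succeeds.
calls = {"n": 0}

def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("simulated connection error")
    return "annotated"

result = retry(flaky)
print(result)  # -> annotated
```

In real code, `func` would be a closure over the annotator call (for example, `lambda: annotator.annotate(text)`), and `delay` would typically be several seconds rather than the small value used here for the demo.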