Infer information from Tweets. Useful for human-centered computing tasks, such as sentiment analysis, location prediction, authorship profiling and more!
While developing or debugging, it is essential that the web server restart quickly, which it currently does because the classifier is not loaded locally. In production, however, the initial start-up time matters far less, and loading the classifier locally improves performance by roughly 5x to 20x on its own.
We therefore need the ability to run the server in debug mode, with fast loading and restarts, while just as easily being able to run it in production mode.
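One way to sketch this debug/production split is to load the classifier eagerly at start-up in production, but lazily (on first request) in debug mode, so debug restarts stay fast. This is a minimal illustration only; the `Server` class, `load_classifier` function, and dummy classifier below are hypothetical stand-ins, not the project's actual API.

```python
import time


def load_classifier():
    # Stand-in for the real, expensive model load (hypothetical).
    # The actual classifier and its load time are assumptions here.
    time.sleep(0)  # placeholder for a slow deserialization step
    return lambda tweet: "positive"  # dummy classifier


class Server:
    """Loads the classifier eagerly in production, lazily in debug,
    so the server can restart quickly while developing."""

    def __init__(self, debug=False):
        self.debug = debug
        # Production: pay the load cost once, up front.
        # Debug: skip it so restarts are near-instant.
        self._classifier = None if debug else load_classifier()

    @property
    def classifier(self):
        # Debug mode: load on first use instead of at start-up.
        if self._classifier is None:
            self._classifier = load_classifier()
        return self._classifier

    def classify(self, tweet):
        return self.classifier(tweet)


# Debug mode: construction is instant; the model loads on first request.
dev = Server(debug=True)

# Production mode: the model is loaded up front, so every request is fast.
prod = Server(debug=False)
```

The same toggle could be driven by an environment variable or a command-line flag, so switching between the two modes is a one-line change at launch time.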