stanfordnlp / CoreNLP

CoreNLP: A Java suite of core NLP tools for tokenization, sentence segmentation, NER, parsing, coreference, sentiment analysis, etc.
http://stanfordnlp.github.io/CoreNLP/
GNU General Public License v3.0

Online demo is down #1398

Closed (alpgarcia closed this issue 9 months ago)

alpgarcia commented 11 months ago

Hi,

Not sure if this is the right place for this. Please let me know where I should post it instead if needed.

The online demo at https://corenlp.run/ seems to have been down for at least a few days. Each time I have tried it, I got a timeout.

Thanks for your help, Alberto.

AngledLuffa commented 11 months ago

I've already restarted it once today. Either someone is hammering it, the machine is toast, or there's some bug I don't know about. I suggest running it locally for the time being.

melandresen commented 11 months ago

Can you say anything about whether the demo will be back online at the same URL in the long run? I am referring to the URL in a book that will be printed in January or February, and I would like to avoid it being immediately outdated, if possible. Thank you very much!

AngledLuffa commented 11 months ago

It will. I just haven't thought about how to prevent the kind of long queries that were crashing it. You can thank whoever sent those for the demo not being available at the moment.

melandresen commented 11 months ago

Great, glad to hear that! Thank you!

mkarmona commented 11 months ago

@melandresen @alpgarcia to easily run it locally, you can use this command:

coursier launch \
  --extra-jars stanford-corenlp-4.5.5-models-english.jar \
  --extra-jars stanford-corenlp-4.5.5-models.jar \
  org.slf4j:slf4j-simple:2.0.7 \
  edu.stanford.nlp:stanford-corenlp:4.5.5 \
  --main-class edu.stanford.nlp.pipeline.StanfordCoreNLPServer \
  -- -port 9000 -timeout 15000 -annotators "tokenize, ssplit, pos, lemma, depparse, natlog"

The dependencies you need are the two model jars passed via --extra-jars (stanford-corenlp-4.5.5-models.jar and stanford-corenlp-4.5.5-models-english.jar); Coursier resolves the main stanford-corenlp artifact itself, but the model jars have to be fetched separately.

Does anyone know how to instruct Coursier to also fetch maven classifiers?
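
For reference, a minimal sketch of querying the server started by the command above from Java with StanfordCoreNLPClient, assuming it is listening on localhost:9000 with the annotators listed there and that the stanford-corenlp jar is on the client classpath (the class name and example text are illustrative):

    import edu.stanford.nlp.ling.CoreAnnotations;
    import edu.stanford.nlp.ling.CoreLabel;
    import edu.stanford.nlp.pipeline.Annotation;
    import edu.stanford.nlp.pipeline.StanfordCoreNLPClient;
    import edu.stanford.nlp.util.CoreMap;
    import java.util.Properties;

    // Illustrative client: sends text to the locally running server and prints token info.
    public class LocalServerDemo {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.setProperty("annotators", "tokenize,ssplit,pos,lemma,depparse,natlog");

            // Host, port, and number of client threads; the port matches -port 9000 above.
            StanfordCoreNLPClient client =
                    new StanfordCoreNLPClient(props, "http://localhost", 9000, 2);

            Annotation annotation = new Annotation("The online demo was down, so we ran CoreNLP locally.");
            client.annotate(annotation);

            // Print word, POS tag, and lemma for every token of every sentence.
            for (CoreMap sentence : annotation.get(CoreAnnotations.SentencesAnnotation.class)) {
                for (CoreLabel token : sentence.get(CoreAnnotations.TokensAnnotation.class)) {
                    System.out.println(token.word() + "\t" + token.tag() + "\t" + token.lemma());
                }
            }
        }
    }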

alpgarcia commented 11 months ago

Thanks @mkarmona!

I have the pending task of trying this: https://github.com/stanfordnlp/CoreNLP/issues/1356#issuecomment-1555532571

AngledLuffa commented 11 months ago

In general, if you can have your users download external resources, I don't think you need to download all of those pieces yourself. I could be wrong, though.

For now, I've put the demo back up with a much lower max query length. If it crashes again, I'll look for other causes. In the meantime, I do recommend that if people are using the demo for more than just "let's see if CoreNLP can do this thing before we install it and start using it locally", the best approach is to install it and start using it locally. We just don't have the server capacity to support everyone who might want to use it otherwise.
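
For anyone taking that advice, a minimal sketch of running the pipeline in-process rather than against a server, assuming the stanford-corenlp jar and its English models jar are on the classpath (the class name and example text are illustrative):

    import edu.stanford.nlp.pipeline.CoreDocument;
    import edu.stanford.nlp.pipeline.CoreSentence;
    import edu.stanford.nlp.pipeline.StanfordCoreNLP;
    import java.util.Properties;

    // Illustrative in-process pipeline: no server involved, everything runs in the local JVM.
    public class LocalPipelineDemo {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.setProperty("annotators", "tokenize,ssplit,pos,lemma");
            StanfordCoreNLP pipeline = new StanfordCoreNLP(props);

            CoreDocument document = new CoreDocument("Stanford University is located in California.");
            pipeline.annotate(document);

            // Per-sentence POS tags and lemmas.
            for (CoreSentence sentence : document.sentences()) {
                System.out.println(sentence.posTags());
                System.out.println(sentence.lemmas());
            }
        }
    }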