Looks like it works running with:
jruby -J-Xmx2048m app.rb
I was able to get the following error, which makes it look like the JVM options I'm setting manually, as well as the gem's defaults, are not being respected in my setup:
Error: Your application used more memory than the safety cap of 500M.
Specify -J-Xmx####m to increase it (#### = cap size in MB).
Specify -w for full OutOfMemoryError stack trace
Exporting JAVA_OPTS to the environment seems to take care of this:
web: export JAVA_OPTS="-Xmx3g"; bundle exec #{run your app}
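Outside of a Procfile, the same thing can be done in the shell before starting the app (reusing app.rb from the command above):
export JAVA_OPTS="-Xmx3g"
jruby app.rb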
Sorry for necroposting. Google shows this thread at the top for the query "stanford core nlp GC overhead limit exceeded". The example from the Stanford CoreNLP webpage is misconfigured:
java -cp "*" -Xmx2g edu.stanford.nlp.pipeline.StanfordCoreNLP -annotators tokenize,ssplit,pos,lemma,ner,parse,dcoref -file input.txt
It has the option -Xmx2g, which roughly means "use 2 GB of RAM". Increasing it to 4 GB solved the issue.
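For reference, the same command with the larger heap would be:
java -cp "*" -Xmx4g edu.stanford.nlp.pipeline.StanfordCoreNLP -annotators tokenize,ssplit,pos,lemma,ner,parse,dcoref -file input.txt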
I'm just trying to get all my config set up properly, but can't get the example code to run.
My app is a simple little Sinatra app with a single endpoint that just executes the following:
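Roughly, it boils down to something like this (a minimal sketch following the stanford-core-nlp gem's README rather than my exact code; the route name, annotator list, and sample text are placeholders):

require 'sinatra'
require 'stanford-core-nlp'

get '/annotate' do
  # Load a CoreNLP pipeline and annotate a placeholder sentence.
  pipeline = StanfordCoreNLP.load(:tokenize, :ssplit, :pos, :lemma, :parse)
  text = StanfordCoreNLP::Annotation.new('Stanford CoreNLP, called from a Sinatra endpoint.')
  pipeline.annotate(text)
  text.to_s
end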
And when the method is invoked, I get the following:
I'm using the latest version of Stanford NLP (3.5.0), downloaded into a lib directory and loaded with the following (the latest tagger version is also in lib/taggers):
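Something along these lines, using the gem's documented configuration options (the exact paths are assumptions based on my lib layout):

require 'stanford-core-nlp'

# Point the gem at the locally downloaded jars and models.
StanfordCoreNLP.use :english
StanfordCoreNLP.jar_path   = File.expand_path('lib', __dir__) + '/'
StanfordCoreNLP.model_path = File.expand_path('lib/taggers', __dir__) + '/'

# Heap size requested for the JVM the gem starts. Under JRuby the JVM is
# already running by this point, so -J-Xmx / JAVA_OPTS (as above) is what
# actually takes effect and this setting may be ignored.
StanfordCoreNLP.jvm_args = ['-Xmx2g']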
And here is the Java and JRuby setup:
I've tried playing around with the StanfordCoreNLP.jvm_args option, bumping it up to as much as 4 GB, but still have the same issue. Any ideas how to get this sorted?