dhanushanthp / jatetoolkit

Automatically exported from code.google.com/p/jatetoolkit

Out of memory problem #1

Closed GoogleCodeExporter closed 9 years ago

GoogleCodeExporter commented 9 years ago
What steps will reproduce the problem?

java -Xmx1024m -classpath \
/Users/sarnobat/trash/jatetoolkit-read-only/dist/:\
/Users/sarnobat/trash/jatetoolkit-read-only/libs/apache-log4j-1.2.15/log4j-1.2.15.jar:\
/Users/sarnobat/trash/jatetoolkit-read-only/libs/apache-opennlp-1.51/jwnl-1.3.3.jar:\
/Users/sarnobat/trash/jatetoolkit-read-only/libs/apache-opennlp-1.51/opennlp-maxent-3.0.1-incubating.jar:\
/Users/sarnobat/trash/jatetoolkit-read-only/libs/apache-opennlp-1.51/opennlp-tools-1.5.1-incubating.jar:\
/Users/sarnobat/trash/jatetoolkit-read-only/libs/dragon/dragontool.jar:\
/Users/sarnobat/trash/jatetoolkit-read-only/libs/hsqldb2.2.3/hsqldb.jar:\
/Users/sarnobat/trash/jatetoolkit-read-only/libs/hsqldb2.2.3/sqltool.jar:\
/Users/sarnobat/trash/jatetoolkit-read-only/libs/wit-commons/wit-commons.jar: \
uk.ac.shef.dcs.oak.jate.test.AlgorithmTester \
/Users/sarnobat/trash/jatetoolkit-read-only/nlp_resources/ test/example/ test/output

What is the expected output? What do you see instead?
Don't know; the run fails with the OutOfMemoryError below.

What version of the product are you using? On what operating system?
trunk

Please provide any additional information below.

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
        at java.util.Arrays.copyOf(Arrays.java:2734)
        at java.util.ArrayList.ensureCapacity(ArrayList.java:167)
        at java.util.ArrayList.add(ArrayList.java:351)
        at uk.ac.shef.dcs.oak.jate.core.npextractor.NGramExtractor.getNGram(NGramExtractor.java:123)
        at uk.ac.shef.dcs.oak.jate.core.npextractor.NGramExtractor.extract(NGramExtractor.java:67)
        at uk.ac.shef.dcs.oak.jate.core.npextractor.NGramExtractor.extract(NGramExtractor.java:49)
        at uk.ac.shef.dcs.oak.jate.core.feature.indexer.GlobalIndexBuilderMem.build(GlobalIndexBuilderMem.java:53)
        at uk.ac.shef.dcs.oak.jate.test.AlgorithmTester.main(AlgorithmTester.java:83)

Original issue reported on code.google.com by ss401...@gmail.com on 12 Jun 2012 at 4:46

GoogleCodeExporter commented 9 years ago
Hi there

This is likely due to the large number of candidate terms extracted by the n-gram 
extractor - perhaps 1G of memory isn't enough. Could you try one thing:

In AlgorithmTester, lines 70-76 are:
------------------
//Three CandidateTermExtractors are implemented:
//1. An OpenNLP noun phrase extractor that extracts noun phrases as candidate terms
//CandidateTermExtractor npextractor = new NounPhraseExtractorOpenNLP(stop, lemmatizer);
//2. A generic n-gram extractor that extracts n-grams (n defaults to 5; see the property file)
CandidateTermExtractor npextractor = new NGramExtractor(stop, lemmatizer);
//3. A word extractor that extracts single words as candidate terms.
//CandidateTermExtractor wordextractor = new WordExtractor(stop, lemmatizer);
------------------

Disable the n-gram extractor and use the noun phrase extractor instead, i.e., option 1: comment out the NGramExtractor line and uncomment the NounPhraseExtractorOpenNLP line.

If that fixes the problem, the cause is most likely insufficient heap memory for the n-gram extractor.
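To see why n-gram extraction is so much more memory-hungry than noun phrase extraction: for L tokens and a maximum n-gram length N, it materialises roughly sum(L - n + 1) for n = 1..N candidate strings, all held in memory at once. The following is a minimal, hypothetical sketch of that counting (it is not JATE's actual NGramExtractor; the class and method names are made up for illustration):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Illustration only (not JATE code): enumerates every n-gram candidate
// for n = 1..maxN. Each candidate is a String kept in memory, which is
// why n-gram extraction over a large corpus can exhaust a 1G heap.
public class NGramGrowth {

    static List<String> allNGrams(List<String> tokens, int maxN) {
        List<String> candidates = new ArrayList<>();
        for (int n = 1; n <= maxN; n++) {
            for (int i = 0; i + n <= tokens.size(); i++) {
                candidates.add(String.join(" ", tokens.subList(i, i + n)));
            }
        }
        return candidates;
    }

    public static void main(String[] args) {
        List<String> tokens = Arrays.asList("out", "of", "memory", "in", "heap", "space");
        // sum_{n=1..5} (6 - n + 1) = 6 + 5 + 4 + 3 + 2 = 20 candidates from only 6 tokens
        System.out.println(allNGrams(tokens, 5).size());
    }
}
```

If the noun phrase extractor works, the n-gram run may still succeed with a larger heap, e.g. -Xmx2048m in place of -Xmx1024m in the command above.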

Original comment by ziqizhan...@googlemail.com on 12 Jun 2012 at 5:44

GoogleCodeExporter commented 9 years ago
Issue closed

Original comment by ziqizhan...@googlemail.com on 25 Jul 2013 at 10:05