smilli / berkeleylm

Automatically exported from code.google.com/p/berkeleylm

Exception in thread "main" java.lang.ArrayIndexOutOfBoundsException #4

Closed GoogleCodeExporter closed 9 years ago

GoogleCodeExporter commented 9 years ago
Hello, I am trying to build a Chinese language model from an ARPA file. However, 
it fails as follows:
        Exception in thread "main" java.lang.ArrayIndexOutOfBoundsException: 8641
    at edu.berkeley.nlp.lm.map.ImplicitWordHashMap.setWordRanges(ImplicitWordHashMap.java:84)
    at edu.berkeley.nlp.lm.map.ImplicitWordHashMap.<init>(ImplicitWordHashMap.java:52)
    at edu.berkeley.nlp.lm.map.HashNgramMap.<init>(HashNgramMap.java:66)
    at edu.berkeley.nlp.lm.map.HashNgramMap.createImplicitWordHashNgramMap(HashNgramMap.java:49)
    at edu.berkeley.nlp.lm.io.LmReaders.createNgramMap(LmReaders.java:473)
    at edu.berkeley.nlp.lm.io.LmReaders.buildMapCommon(LmReaders.java:439)
    at edu.berkeley.nlp.lm.io.LmReaders.buildMapArpa(LmReaders.java:419)
    at edu.berkeley.nlp.lm.io.LmReaders.secondPassArrayEncoded(LmReaders.java:383)
    at edu.berkeley.nlp.lm.io.LmReaders.readArrayEncodedLmFromArpa(LmReaders.java:160)

However, when I use a smaller file it works. Is there an argument for the size that 
needs to be adjusted?
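The exception is thrown while the word-to-n-gram ranges are being set up, which suggests the reader encountered a word index beyond the declared vocabulary size. One plausible (unconfirmed) cause is a mismatch between the n-gram counts declared in the ARPA `\data\` header and the entries actually present in the file. The sketch below is a standalone diagnostic, not part of berkeleylm; the class name `ArpaSanityCheck` is hypothetical.

```java
import java.io.BufferedReader;
import java.util.LinkedHashMap;
import java.util.Map;

/**
 * Sanity-checks an ARPA file: compares the n-gram counts declared in the
 * \data\ header against the number of entries actually present in each
 * \N-grams: section. A mismatch is one possible cause of index errors
 * while building the n-gram map. (Diagnostic sketch only.)
 */
public class ArpaSanityCheck {

    /** Returns order -> {declared count, actual count}; throws on mismatch. */
    public static Map<Integer, long[]> check(BufferedReader in) throws Exception {
        Map<Integer, long[]> counts = new LinkedHashMap<>();
        int currentOrder = 0;
        String line;
        while ((line = in.readLine()) != null) {
            line = line.trim();
            if (line.startsWith("ngram ")) {
                // Header line, e.g. "ngram 1=4981"
                String[] parts = line.substring(6).split("=");
                counts.put(Integer.parseInt(parts[0].trim()),
                           new long[] { Long.parseLong(parts[1].trim()), 0 });
            } else if (line.matches("\\\\\\d+-grams:")) {
                // Section start, e.g. "\2-grams:"
                currentOrder = Integer.parseInt(line.substring(1, line.indexOf('-')));
            } else if (line.equals("\\end\\")) {
                break;
            } else if (currentOrder > 0 && !line.isEmpty()) {
                counts.get(currentOrder)[1]++; // one n-gram entry
            }
        }
        for (Map.Entry<Integer, long[]> e : counts.entrySet()) {
            long[] c = e.getValue();
            if (c[0] != c[1]) throw new IllegalStateException(
                "order " + e.getKey() + ": header declares " + c[0]
                + " entries but file contains " + c[1]);
        }
        return counts;
    }
}
```

Running this over the offending ARPA file before handing it to `LmReaders.readArrayEncodedLmFromArpa` would at least rule out a truncated or malformed file.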

Original issue reported on code.google.com by shyang...@gmail.com on 4 Oct 2011 at 3:48

GoogleCodeExporter commented 9 years ago
Would it be possible to send me the file that is causing this bug, or is it too 
big? 

Original comment by adpa...@gmail.com on 4 Oct 2011 at 4:21

GoogleCodeExporter commented 9 years ago
Apparently the user has resolved this bug on their own.

Original comment by adpa...@gmail.com on 6 Oct 2011 at 5:24