Hi, I have a problem with smoothing a large file of 3-gram counts.
I run the command:
estimate-ngram -order 3 -counts allgrams -smoothing FixModKN -wl allgrams.FixModKN.lm
and I get this error:
Saving LM to train.corpus.lm...
estimate-ngram: src/NgramModel.cpp:422: void NgramModel::SaveLM(const
std::vector<DenseVector<double>, std::allocator<DenseVector<double> > >&, const
std::vector<DenseVector<double>, std::allocator<DenseVector<double> > >&,
ZFile&) const: Assertion `(size_t)(ptr - lineBuffer.data()) <
lineBuffer.size()' failed.
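For context, the assertion that fails is a bounds check: it verifies that a write pointer has not run past the end of an internal line buffer while the LM is being written out. Below is only a minimal illustrative sketch of that pattern, not mitlm's actual SaveLM code; the buffer size and tokens are made up for the demo. Longer 3-gram lines could plausibly trip such a check where shorter 2-gram lines did not.

    #include <cassert>
    #include <cstdio>
    #include <cstring>
    #include <vector>

    int main() {
        // Hypothetical small buffer, just to demonstrate the check;
        // mitlm's real buffer and sizing logic are not shown here.
        std::vector<char> lineBuffer(32);
        char* ptr = lineBuffer.data();

        const char* tokens[] = {"the", "quick", "brown", "fox"};
        for (const char* tok : tokens) {
            std::size_t len = std::strlen(tok);
            // Same shape as the failing assertion: the offset written so far
            // must stay strictly inside the buffer.
            assert((std::size_t)(ptr - lineBuffer.data()) + len + 1 < lineBuffer.size());
            std::memcpy(ptr, tok, len);
            ptr += len;
            *ptr++ = ' ';
        }
        *(ptr - 1) = '\0';   // terminate the line (overwrite trailing space)
        std::printf("%s\n", lineBuffer.data());
        return 0;
    }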
Earlier I tried this on a 2-gram counts file of 4.7 GB and it worked fine. The 3-gram file is 20 GB.
My operating system is GNU/Linux x86_64 with 96 GB of RAM.
Original issue reported on code.google.com by Roksana....@gmail.com on 15 Nov 2012 at 4:48