Closed mohamad-hasan-sohan-ajini closed 6 years ago
The compilation issue was solved by:
#ifndef KENLM_MAX_ORDER
#define KENLM_MAX_ORDER 10
#endif
but I still get the error:
/home/sobhe/kenlm/lm/model.cc:49 in void lm::ngram::detail::{anonymous}::CheckCounts(const std::vector<long unsigned int>&) threw FormatLoadException because counts.size() > 6. This model has order 10 but KenLM was compiled to support up to 6. If your build system supports changing KENLM_MAX_ORDER, change it there and recompile. In the KenLM tarball or Moses, use e.g. bjam --max-kenlm-order=6 -a. Otherwise, edit lm/max_order.hh. Byte: 146 ERROR
ouchhhh
The KENLM_MAX_ORDER is set to 6 in lm/CMakeLists.txt. Setting it to 10 and recompiling solved the issue.
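A sketch of the corresponding rebuild, assuming a standard out-of-tree CMake build directory (paths are illustrative):

```shell
cd kenlm/build
# Re-run cmake with the higher order, then rebuild so the new value takes effect:
cmake -DKENLM_MAX_ORDER=10 ..
make -j 4
```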
The problem persists when I want to load the 10-gram model in Python. Any suggestions?
Traceback (most recent call last):
  File "chars/test.py", line 32, in <module>
    model = AzBarChars('resources/chars.klm')
  File "/home/sobhe/azbar/chars/language.py", line 10, in __init__
    self.model = kenlm.LanguageModel(klm_file)
  File "kenlm.pyx", line 117, in kenlm.Model.__init__ (python/kenlm.cpp:2656)
IOError: Cannot read model 'resources/chars.klm' (lm/model.cc:49 in void lm::ngram::detail::{anonymous}::CheckCounts(const std::vector<long unsigned int>&) threw FormatLoadException because counts.size() > 6. This model has order 10 but KenLM was compiled to support up to 6. If your build system supports changing KENLM_MAX_ORDER, change it there and recompile. In the KenLM tarball or Moses, use e.g. bjam --max-kenlm-order=6 -a. Otherwise, edit lm/max_order.hh.)
In the case of the Python module, I presume you compiled it with setup.py. Edit line 21 of setup.py and reinstall.
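For illustration only (the exact contents of setup.py vary between versions of the wrapper, so the argument list below is hypothetical), the edit amounts to raising the -DKENLM_MAX_ORDER flag that setup.py passes to the compiler:

```python
def bump_max_order(compile_args, new_order):
    """Return a copy of compile_args with any -DKENLM_MAX_ORDER=<n> flag raised to new_order."""
    return [
        "-DKENLM_MAX_ORDER=%d" % new_order if a.startswith("-DKENLM_MAX_ORDER=") else a
        for a in compile_args
    ]

# Hypothetical stand-in for the compiler argument list edited in setup.py:
args = ["-O3", "-DNDEBUG", "-DKENLM_MAX_ORDER=6", "-std=c++11"]
print(bump_max_order(args, 10))
# ['-O3', '-DNDEBUG', '-DKENLM_MAX_ORDER=10', '-std=c++11']
```

After changing the flag, reinstall the module (e.g. `pip install .` from the checkout) so the extension is rebuilt with the new order.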
Hi, sorry, I got this error today while converting an ARPA file to a binary file with order 7. After trying your methods, it still doesn't work:
/home/mark/dependency/kenlm/lm/model.cc:49 in void lm::ngram::detail::{anonymous}::CheckCounts(const std::vector<long unsigned int>&) threw FormatLoadException because counts.size() > 6. This model has order 7 but KenLM was compiled to support up to 6. If your build system supports changing KENLM_MAX_ORDER, change it there and recompile. With cmake: cmake -DKENLM_MAX_ORDER=10 .. With Moses: bjam --max-kenlm-order=10 -a Otherwise, edit lm/max_order.hh. Byte: 219 ERROR
I also tried cmake -DKENLM_MAX_ORDER=10 .. under the kenlm/build folder and got the following response:
-- Boost version: 1.65.1
-- Found the following Boost libraries:
--   program_options
--   system
--   thread
--   unit_test_framework
--   chrono
--   date_time
--   atomic
-- Configuring done
-- Generating done
-- Build files have been written to: /home/mark/dependency/kenlm/build
Then I tried the conversion from ARPA to binary again, but it still produced the same error as the first one.
Do you know how to fix it? Thank you very much!
Hi, I need to compile KenLM to train a 10-gram language model. As recommended in the README, I added the following #ifndef statement to kenlm/utils/have.hh, but I get the following error during compilation:
Any idea how to solve the error?