zopefoundation / zodbpickle

Fork of Python's pickle module to work with ZODB

ImportError: No module named pythainlp.tokenize.multi_cut #45

Closed by loretoparisi 5 years ago

loretoparisi commented 5 years ago

I get this error while running:

Traceback (most recent call last):
  File "dictionary.py", line 13, in <module>
    w2w = Word2word(codes[0], codes[1], dict_path=dict_path)
  File "/word2word/word2word.py", line 12, in __init__
    self.word2x, self.y2word, self.x2ys = download_or_load(lang1, lang2, dict_path)
  File "/word2word/word2word/utils.py", line 52, in download_or_load
    word2x, y2word, x2ys = pickle.load(open(fpath, 'rb'))
  File "/usr/local/lib/python2.7/site-packages/zodbpickle/pickle_2.py", line 1525, in load
    return Unpickler(file).load()
  File "/usr/local/lib/python2.7/site-packages/zodbpickle/pickle_2.py", line 881, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/site-packages/zodbpickle/pickle_2.py", line 1142, in load_global
    klass = self.find_class(module, name)
  File "/usr/local/lib/python2.7/site-packages/zodbpickle/pickle_2.py", line 1176, in find_class
    __import__(module)
ImportError: No module named pythainlp.tokenize.multi_cut

As suggested, I installed pythainlp with pip install pythainlp -U, but then I get:

[nltk_data] Downloading package wordnet to
[nltk_data]     /Users/loretoparisi/nltk_data...
[nltk_data]   Unzipping corpora/wordnet.zip.
[nltk_data] Downloading package omw to
[nltk_data]     /Users/loretoparisi/nltk_data...
[nltk_data]   Unzipping corpora/omw.zip.
Traceback (most recent call last):
  File "dictionary.py", line 13, in <module>
    w2w = Word2word(codes[0], codes[1], dict_path=dict_path)
  File "/word2word/word2word/word2word.py", line 12, in __init__
    self.word2x, self.y2word, self.x2ys = download_or_load(lang1, lang2, dict_path)
  File "/word2word/word2word/utils.py", line 52, in download_or_load
    word2x, y2word, x2ys = pickle.load(open(fpath, 'rb'))
  File "/usr/local/lib/python2.7/site-packages/zodbpickle/pickle_2.py", line 1525, in load
    return Unpickler(file).load()
  File "/usr/local/lib/python2.7/site-packages/zodbpickle/pickle_2.py", line 881, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/site-packages/zodbpickle/pickle_2.py", line 1142, in load_global
    klass = self.find_class(module, name)
  File "/usr/local/lib/python2.7/site-packages/zodbpickle/pickle_2.py", line 1176, in find_class
    __import__(module)
ImportError: No module named multi_cut

jamadden commented 5 years ago

Thanks for the report. This is still most likely to be an issue with the environment or the data. For example, perhaps the updated pythainlp project is now a different version that doesn't work with the pickled data (because it no longer includes the multi_cut module).

The only reason this would be an issue with zodbpickle is if the standard library pickle module can successfully unpickle this data, but zodbpickle cannot (in the same environment).
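That comparison can be reproduced without either library's data: the bytes below hand-craft a minimal protocol-0 pickle whose GLOBAL opcode names a deliberately nonexistent module (a stand-in for the real data file), showing that the standard library unpickler raises the same ImportError in this situation:

```python
import io
import pickle

# Hand-crafted protocol-0 pickle whose GLOBAL opcode names a module
# that does not exist in this environment (stand-in for the real data).
data = b'cnonexistent_module\nSomeClass\n.'

try:
    pickle.load(io.BytesIO(data))
except ImportError as exc:
    # The stdlib unpickler fails the same way, which would point at the
    # environment/data rather than at zodbpickle itself.
    print('stdlib pickle fails too:', exc)
```

If the real data file loads with the stdlib pickle but not with zodbpickle in the same environment, that would be a zodbpickle bug; otherwise the data references something the environment no longer provides.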

loretoparisi commented 5 years ago

@jamadden from the last run I can see that after installing pythainlp, I had the following logging:

[nltk_data] Downloading package wordnet to
[nltk_data]     /Users/loretoparisi/nltk_data...
[nltk_data]   Unzipping corpora/wordnet.zip.
[nltk_data] Downloading package omw to
[nltk_data]     /Users/loretoparisi/nltk_data...
[nltk_data]   Unzipping corpora/omw.zip.

and then a similar error. To be precise: without pythainlp installed I got ImportError: No module named pythainlp.tokenize.multi_cut, while with it installed I got ImportError: No module named multi_cut. Not sure what this means.

jamadden commented 5 years ago

It probably means that pythainlp.tokenize exists (now that you installed it), but that pythainlp.tokenize.multi_cut does not exist. But the pickled file refers to that missing module; hence the guess about an incompatible version.
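One way to confirm that guess without importing anything is to scan the data file for the globals it references. A minimal sketch using the stdlib pickletools module (the sample bytes and the function name are illustrative, not taken from the actual data file):

```python
import pickletools

def referenced_globals(pickled_bytes):
    """Return (module, name) pairs named by GLOBAL opcodes.

    Note: protocol-4 pickles use STACK_GLOBAL instead, which takes its
    arguments from the stack and is not handled by this simple scan.
    """
    refs = []
    for opcode, arg, _pos in pickletools.genops(pickled_bytes):
        if opcode.name == 'GLOBAL':
            # genops yields the two newline-terminated strings joined
            # by a single space, e.g. 'some.module some_name'.
            refs.append(tuple(arg.split(' ', 1)))
    return refs

# Illustrative pickle referencing the module from the traceback.
sample = b'cpythainlp.tokenize.multi_cut\nsegment\n.'
print(referenced_globals(sample))  # [('pythainlp.tokenize.multi_cut', 'segment')]
```

Running this over the real file would list every module the pickle needs, so you could check each one against what the installed pythainlp actually provides.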

At any rate, this doesn't appear to be zodbpickle-related; most likely the standard library pickle would have the same issues. I would suggest discussing this with the maintainers of that library or data file.
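For completeness: if it turns out the module was merely renamed in the newer pythainlp, the standard workaround is to subclass Unpickler and remap the old name in find_class. A sketch with a purely hypothetical mapping (old_module → math) to keep it self-contained:

```python
import io
import pickle

class AliasingUnpickler(pickle.Unpickler):
    # Hypothetical mapping from the module name stored in the pickle
    # to the module that now provides the same objects.
    ALIASES = {'old_module': 'math'}

    def find_class(self, module, name):
        module = self.ALIASES.get(module, module)
        return super().find_class(module, name)

# Protocol-0 pickle referencing the (hypothetically renamed) module.
data = b'cold_module\npi\n.'
print(AliasingUnpickler(io.BytesIO(data)).load())  # math.pi
```

The same trick works with zodbpickle's Unpickler, since it exposes the same find_class hook as the stdlib.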

loretoparisi commented 5 years ago

Got it, thank you.