cbaziotis / ntua-slp-semeval2018

Deep-learning models of NTUA-SLP team submitted in SemEval 2018 tasks 1, 2 and 3.
MIT License

Error with load_embeddings.py "can't fix" #13

Open HussainYa opened 4 years ago

HussainYa commented 4 years ago

Kindly help me solve this error, which I get when following the steps to run python -m model.pretraining.sentiment2017:

File "/utils/load_embeddings.py", line 93, in load_word_vectors
    embeddings = numpy.array(embeddings, dtype=numpy.float32)
ValueError: setting an array element with a sequence.

Thanks

HussainYa commented 4 years ago

For the embeddings I am using ntua_twitter_300.txt:

"embeddings_file": "ntua_twitter_300"

HussainYa commented 4 years ago

$ python -m model.pretraining.sentiment2017
Running on: cpu
loading word embeddings...
Didn't find embeddings cache file ntua/embeddings/ntua_twitter_300.txt
Indexing file ntua/embeddings/ntua_twitter_300.txt ...
{300, 310}
Found 804871 word vectors.
Traceback (most recent call last):
  File "opt/anaconda3/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "opt/anaconda3/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "ntua/model/pretraining/sentiment2017.py", line 41, in <module>
    monitor="val", label_transformer=transformer)
  File "ntua/utils/train.py", line 276, in define_trainer
    word2idx, idx2word, embeddings = load_embeddings(_config)
  File "ntua/utils/train.py", line 109, in load_embeddings
    return load_word_vectors(word_vectors, word_vectors_size)
  File "ntua/utils/load_embeddings.py", line 93, in load_word_vectors
    embeddings = numpy.array(embeddings, dtype='float32')
ValueError: setting an array element with a sequence.
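Note the "{300, 310}" line in the log: that is the set of vector lengths the indexer found, so the embeddings file contains rows of two different dimensionalities. numpy.array with a fixed dtype cannot build a rectangular array from ragged rows, which is exactly the ValueError in the traceback. A minimal sketch of the failure (the row values below are made up, not from the file):

```python
import numpy as np

# Two "embedding" rows of unequal length, mirroring the {300, 310}
# lengths reported by the indexer (values here are hypothetical).
rows = [[0.1] * 300, [0.2] * 310]

try:
    # Ragged input cannot be coerced to a rectangular float32 array.
    np.array(rows, dtype=np.float32)
except ValueError as err:
    print("ValueError:", err)
```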

dennis287 commented 4 years ago

Is there a fix for this? I am currently facing exactly the same issue.

Thanks a lot!

dennis287 commented 4 years ago

I found the problem myself: I was using a newer version of numpy than v1.13.1, the version listed in requirements.txt. I hope this helps!
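Pinning numpy to the version in requirements.txt (pip install numpy==1.13.1) is the fix reported above. If downgrading is not an option, another approach is to make the loader tolerant of ragged files by dropping lines whose vector length does not match the expected dimensionality. The helper below is a hypothetical sketch, not part of this repo, and assumes the usual "word followed by space-separated floats" format:

```python
import numpy as np

def load_word_vectors_skip_bad(path, dim):
    """Hypothetical workaround (not in the repo): skip lines whose vector
    length differs from `dim`, so numpy.array receives rectangular input."""
    word2idx, vectors = {}, []
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            if len(parts) != dim + 1:  # expect one word + `dim` floats
                continue               # drop malformed rows (e.g. 310 values)
            word2idx[parts[0]] = len(vectors)
            vectors.append([float(x) for x in parts[1:]])
    return word2idx, np.array(vectors, dtype=np.float32)
```

This only sidesteps the crash; any rows with the wrong length are silently discarded, so the word count will be lower than the "Found 804871 word vectors" reported by the indexer.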