Key-Value Memory Networks for Directly Reading Documents, Alexander Miller, Adam Fisch, Jesse Dodge, Amir-Hossein Karimi, Antoine Bordes, Jason Weston https://arxiv.org/abs/1606.03126
Just putting this out there in case anyone knows what's going on, or is running into something similar.
In multiple, seemingly independent places, I get "too many values to unpack" errors, e.g.
➜ key-value-memory-networks git:(master) ✗ python3 interactive.py -m model_memnn_kv.h5
Using TensorFlow backend.
/usr/local/Cellar/python/3.6.5/Frameworks/Python.framework/Versions/3.6/lib/python3.6/importlib/_bootstrap.py:219: RuntimeWarning: numpy.dtype size changed, may indicate binary incompatibility. Expected 96, got 88
return f(*args, **kwds)
Namespace(max_mem_size=100, max_query_len=16, model='model_memnn_kv.h5')
load data...
2018-07-23 23:32:18.557243: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
load pickle/mov_vocab.pickle
load pickle/mov_w2i.pickle
load pickle/mov_i2w.pickle
load pickle/mov_w2i_label.pickle
load pickle/mov_i2w_label.pickle
load pickle/mov_kv_pairs.pickle
load pickle/mov_stopwords.pickle
Question: who directed dancing with wolves
q_tokens: ['who', 'directed', 'dancing', 'with', 'wolves']
--- 1 268582
Traceback (most recent call last):
File "interactive.py", line 71, in <module>
predict(q)
File "interactive.py", line 60, in predict
data_k, data_v = load_kv_dataset([(None,q_tokens,None)], kv_pairs, stopwords)
File "/Users/samhavens/key-value-memory-networks/process_data.py", line 213, in load_kv_dataset
for i, (q, _) in enumerate(data):
ValueError: too many values to unpack (expected 2)
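For what it's worth, the error is reproducible in isolation. This is just my guess at the shapes involved, based on the traceback above: `interactive.py` passes 3-element tuples like `(None, q_tokens, None)`, while `load_kv_dataset` unpacks each item as a 2-element tuple.

```python
# Minimal reproduction of the traceback above. Assumption (from the call
# in interactive.py): data holds 3-element tuples (None, q_tokens, None),
# but the loop in load_kv_dataset unpacks each item into only 2 names.
data = [(None, ['who', 'directed', 'dancing', 'with', 'wolves'], None)]

err = None
try:
    for i, (q, _) in enumerate(data):  # expects 2-element tuples
        pass
except ValueError as e:
    err = str(e)

print(err)  # too many values to unpack (expected 2)
```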
This also happened in the notebook, in the second large code block, on the line
query_maxlen = max(map(len, (x for _, x, _ in train_data + test_data)))
For the second example, removing the first underscore seemed to work. I don't know if the data changed shape at some point, but it looks like train_data is now a list of 2-element tuples rather than 3-element ones.
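To make the workaround concrete, here's a sketch assuming each tuple is (q_tokens, something) — the sample data below is made up, only the tuple arity matters:

```python
# Hypothetical 2-element tuples (q_tokens, answer); the real contents of
# train_data/test_data in the repo may differ -- only the arity matters here.
train_data = [(['who', 'directed', 'dancing', 'with', 'wolves'], 'Kevin Costner')]
test_data = [(['what', 'year', 'was', 'blade', 'runner', 'released'], '1982')]

# Original notebook line -- fails on 2-element tuples:
#   query_maxlen = max(map(len, (x for _, x, _ in train_data + test_data)))

# With the first underscore removed, the 2-element shape unpacks cleanly:
query_maxlen = max(map(len, (x for x, _ in train_data + test_data)))
print(query_maxlen)  # 6
```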