Hello,
I have a problem running "build_hash.sh". When it gets to "build_hash.lua", the dictFile="./data/torch/dict-hash.txt" argument is not picked up: I get an error saying dict-hash.txt is nil, even though the file contains the processed data. I can't work out how to fix it; do you know how I can solve this?
Thank you so much,
Andrea
Hi Andrea,
Just to confirm: ls ./data/torch/dict-hash.txt works just fine from that same directory, yes?
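If it does, it's also worth double-checking the working directory, since ./data/... is resolved relative to wherever you launch th from. A quick sketch:

pwd                                # the directory th resolves ./data/... against
ls -l ./data/torch/dict-hash.txt   # should list the file rather than error out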
Hi,
The file is in the same directory; it was created with the first lines of build_hash.sh (th build_dict_hash.lua ./data/torch/dict-hash.txt ./data/torch/dict.txt ./data/entities_1.txt 10000), and that part works fine. But when I run the second part, it sees the file as nil. I used the "assert" function while trying to load the dictionary and it says "no such file or directory". (Sorry, I am a bit inexperienced with all this.) Thank you!
First off, let me check: from the readme, you don't need to build the hash yourself if you ran setup_processed_data.sh.
Otherwise, you've done the following already (from the readme)...
./setup_data.sh
./gen_wiki_windows.sh
./gen_multidict_questions.sh
./build_dict.sh
./build_data.sh
...and that all worked fine?
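As an aside, when re-running the whole pipeline it can help to chain the steps with set -e, so the run stops at the first failing script instead of continuing against missing files. Just a sketch, assuming it's launched from the same directory the readme runs these from:

set -e
./setup_data.sh
./gen_wiki_windows.sh
./gen_multidict_questions.sh
./build_dict.sh
./build_data.sh
./build_hash.sh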
In which case, now you're trying to run build_hash.sh, and...
th build_dict_hash.lua ./data/torch/dict-hash.txt ./data/torch/dict.txt ./data/entities_1.txt 1000000
...worked, but...
th ../../scripts/build_hash.lua params \
"./data/torch/wiki-w=0-d=3-i-m.txt.hash" \
"./data/torch/wiki-w=0-d=3-i-m.txt.vecarray" \
dictFile="./data/torch/dict-hash.txt" \
memHashFreqCutoff=10000
...is not working?
I downloaded the processed data, but I wanted to run all the steps because later I want to try the model with other data.
I think it is working perfectly now; I'm going to try training the model. I had made a mistake while copying the build_hash.lua code and didn't notice. Sorry for wasting your time, and really, thank you so much for your help!
Oh good, I'm glad it's all working!
Hello, how do I build hashed memories from a KB? I couldn't find a script for it.