-
Hello,
I can't figure out how to generate the `gru.t7` torch file.
`003_skipthoughts_porting` generates `vqa_uni_gru_word2vec.t7`.
However, this seems to be only a torch tensor without the `rnn`'s …
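To check what is actually inside the generated file, I inspect it from Python. This is only a rough sanity check and assumes the `torchfile` package (`pip install torchfile`) can read the output of `003_skipthoughts_porting`:

```python
# Rough sanity check of the generated .t7 (assumes `pip install torchfile`).
import torchfile

obj = torchfile.load('vqa_uni_gru_word2vec.t7')

# A plain tensor deserializes to a numpy array; a serialized nn/rnn module
# would come back as a torchfile TorchObject instead.
print(type(obj))
if hasattr(obj, 'shape'):
    print(obj.shape)
```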
-
I have installed Bazel, TensorFlow, NumPy, scikit-learn, the Natural Language Toolkit (NLTK), and gensim.
Then I downloaded:
skip_thoughts_uni_2017_02_02 and skip_thoughts_uni_2017_02_02.
skip-th…
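For context, this is roughly how I try to load the downloaded model afterwards, following the repository README. The paths are placeholders for wherever I unpacked the archive, and the checkpoint step number depends on the archive contents:

```python
# Minimal sketch following the tensorflow/models skip_thoughts README.
# Paths are placeholders for the unpacked archive; the checkpoint step
# number depends on the downloaded files.
from skip_thoughts import configuration
from skip_thoughts import encoder_manager

VOCAB_FILE = "skip_thoughts_uni_2017_02_02/vocab.txt"
EMBEDDING_MATRIX_FILE = "skip_thoughts_uni_2017_02_02/embeddings.npy"
CHECKPOINT_PATH = "skip_thoughts_uni_2017_02_02/model.ckpt-501424"

encoder = encoder_manager.EncoderManager()
encoder.load_model(configuration.model_config(),
                   vocabulary_file=VOCAB_FILE,
                   embedding_matrix_file=EMBEDDING_MATRIX_FILE,
                   checkpoint_path=CHECKPOINT_PATH)

encodings = encoder.encode(["a simple test sentence ."])
print(len(encodings))
```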
-
XXX@XXX:~/codes/python_codes/text-to-image$ python generate_thought_vectors.py --caption_file="Data/sample_captions.txt"
['the flower shown has yellow anther red pistil and bright red petals']
Loadi…
-
Hi, I downloaded the data_prepro.h5, data_prepro.json and seconds.json files from the Google Drive link that you shared. I also generated the data_res.h5 file by running prepro_res.lua. However…
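As a quick sanity check on the downloaded and generated files, I list their contents from Python (this assumes the files sit in the current directory; adjust the paths to wherever they were saved):

```python
# Quick sanity check on the prepro outputs (paths assume the current directory).
import json
import h5py

for name in ['data_prepro.h5', 'data_res.h5']:
    with h5py.File(name, 'r') as f:
        print(name)
        for key in f:
            item = f[key]
            print('  ', key, item.shape if hasattr(item, 'shape') else '(group)')

with open('data_prepro.json') as f:
    meta = json.load(f)
print('data_prepro.json keys:', list(meta.keys()))
```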
-
I'm trying to train a custom decoder. I am using the bookcorpus data with science fiction novels. According to the docs:
https://github.com/ryankiros/skip-thoughts/blob/master/decoding/README.md
"We…
-
#### What I did
- Downloaded the pre-trained model
- Created a file (_j.caption_) with a sample caption
- Ran: `python generate_thought_vectors.py --caption_file=j.caption`
- Got the following error, …
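To narrow the problem down, I also tried the encoding step on its own. A minimal version of what I understand the script to be doing (assuming the stock ryankiros/skip-thoughts API and that its pretrained model files are in place) is:

```python
# Minimal reproduction of the encoding step, assuming the stock
# ryankiros/skip-thoughts API with its pretrained model files downloaded.
import skipthoughts

with open('j.caption') as f:
    captions = [line.strip() for line in f if line.strip()]

model = skipthoughts.load_model()
vectors = skipthoughts.encode(model, captions)   # one 4800-d combine-skip vector per caption
print(vectors.shape)
```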
-
Hi there. Thank you so much for the amazing models. When I try to run your neural storyteller example with your pretrained models and call the function generate.load_all(), everything loads correct…
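For reference, the call sequence I am following is the one from the README (the image path is just the example shipped with the repository):

```python
# Call sequence from the neural-storyteller README; the image path is the
# example image that ships with the repository.
import generate

z = generate.load_all()
generate.story(z, './images/ex1.jpg')
```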
-
Hello Ryan,
Thanks a lot for the excellent code. I was trying to train the Bi-skip model mentioned in your paper on my dataset. However, only the uni-skip model seems to be present in the training cod…
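For what it is worth, the uni-skip path I can run looks roughly like this. This is my reading of training/README.md, so the paths are mine and the exact arguments may be off:

```python
# The uni-skip training path as I understand it from training/README.md;
# paths are mine and the exact arguments may be off.
import vocab   # run from inside the training/ directory
import train

with open('my_corpus.txt') as f:              # one sentence per line
    X = [line.strip() for line in f if line.strip()]

worddict, wordcount = vocab.build_dictionary(X)
vocab.save_dictionary(worddict, wordcount, 'my_dict.pkl')

# train.py expects the dictionary/save paths to be set in its config section
# before calling the trainer.
train.trainer(X)
```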