
Longform QA: Alternative datasets not working #2

Open fannnu opened 4 years ago

fannnu commented 4 years ago

The Long Form QA notebook uses the wiki40b dataset, which is huge to download and work with. I couldn't get any alternative dataset to work with it, because the model expects data in the wiki40b format.

What would be the correct approach to getting an arbitrary text corpus to work with this notebook?
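
For reference, something like the sketch below is what I have in mind: chunking an arbitrary corpus into wiki40b-style snippets with the `datasets` library (the successor of `nlp`). The field names (`article_title`, `section_title`, `passage_text`) are my assumption of what the retriever expects, based on the notebook's snippet layout, and should be checked against it:

```python
import datasets  # pip install datasets

def corpus_to_snippets(docs, words_per_passage=100):
    """Chunk (title, text) pairs into ~100-word passages, wiki40b-snippet style.

    Field names are assumed from the notebook's wiki40b snippet layout.
    """
    columns = {"article_title": [], "section_title": [], "passage_text": []}
    for title, text in docs:
        words = text.split()
        for start in range(0, len(words), words_per_passage):
            columns["article_title"].append(title)
            columns["section_title"].append("")  # plain corpora have no sections
            columns["passage_text"].append(" ".join(words[start:start + words_per_passage]))
    return datasets.Dataset.from_dict(columns)

snippets = corpus_to_snippets([("My Document", "some long text " * 500)])
print(snippets)  # Dataset with article_title / section_title / passage_text columns
```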

ratthachat commented 4 years ago

Same here! I have failed to process wiki40b 3 times in Colab (the runtime disconnected before processing finished) ... Now I am trying a 4th time :)

BTW, since the FAISS processing takes around 18 hours, it's not practical in Colab. Is it possible for us to access wiki40b_passages_reps_32_l-8_h-768_b-512-512.dat without computing it ourselves?
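
If such a file were shared, loading it should be just a memmap read plus an index build rather than 18 hours of encoding. A minimal sketch, assuming the `.dat` file is a float32 numpy memmap as the notebook writes it; `NUM_PASSAGES` and `EMBED_DIM` are placeholders that must match the values used when the file was created:

```python
import faiss       # pip install faiss-gpu (or faiss-cpu)
import numpy as np

NUM_PASSAGES = 17_000_000  # placeholder: number of wiki40b passages
EMBED_DIM = 128            # placeholder: embedding size used by the notebook

# Memory-map the precomputed passage representations from disk.
reps = np.memmap(
    "wiki40b_passages_reps_32_l-8_h-768_b-512-512.dat",
    dtype="float32", mode="r", shape=(NUM_PASSAGES, EMBED_DIM),
)

# Build an inner-product index over the dense passage representations.
# ascontiguousarray loads the memmap into RAM, which FAISS requires.
index = faiss.IndexFlatIP(EMBED_DIM)
index.add(np.ascontiguousarray(reps))
```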

fannnu commented 4 years ago

@yjernite Hi Yacine, could you kindly advise us on a way out? As @ratthachat said, is there any way to access the wiki40b embeddings without computing them ourselves, plus a way to plug in a text corpus of our choice?

Also, could bringing in a QA model trained on SQuAD 2.0 help minimize the instances where the model becomes factually inconsistent while generating an explanation? There has been fascinating research with knowledge graphs (KGs) and RL that might help keep factual accuracy in check.

yjernite commented 4 years ago

Hi @fannnu and @ratthachat !

Since the model is open-domain, you will need a Wikipedia-sized knowledge source, and Wiki40b is actually the smallest version I could find. If the download consistently fails, I suggest opening an issue on the https://github.com/huggingface/nlp repository.

We are working on providing pre-computed embeddings, but it's still going to take a few more days or weeks, so stay tuned!

Joint training with factoid-based QA datasets is definitely an interesting research direction to pursue; let me know if it works for you! You'll probably want to do generative QA for both (i.e. generate the answer with a decoder rather than pointing to a span in the input).
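
For anyone unfamiliar with the distinction, here is a minimal sketch of generative QA with a seq2seq model: the decoder generates the answer token by token instead of predicting start/end span positions, as an extractive SQuAD model would. The checkpoint name and input format below are illustrative only; you would fine-tune on your own QA pairs:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Illustrative checkpoint: any seq2seq model fine-tuned for QA follows this pattern.
model_name = "facebook/bart-large"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Condition the model on the question plus retrieved passages; the exact
# input format depends on how the model was fine-tuned.
prompt = "question: why is the sky blue? context: <retrieved passages here>"
inputs = tokenizer(prompt, return_tensors="pt", truncation=True)

# The decoder generates the answer text rather than selecting an input span.
answer_ids = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(answer_ids[0], skip_special_tokens=True))
```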

Also, in general, the Hugging Face Discuss forum is probably a better place for this kind of discussion, feel free to ping me there!

https://discuss.huggingface.co/

ratthachat commented 4 years ago

Thanks @yjernite ! Pre-computed embeddings sound great, and I am looking forward to them!

Yes, this research direction is very fascinating, and I hope I can make some contributions; I'll also discuss it with you all soon on the Discuss forum!

LeoVS09 commented 3 years ago

If someone, like me, found this issue hoping to find wiki40b_passages_reps_32_l-8_h-768_b-512-512.dat, then I have this embedding file in my gdrive.

It took me 30 hours to compute on my GTX 1060, and hopefully no one will need to do it again :)
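
A sketch of how you might pull the file down and load it; the Drive file ID is a placeholder for the one in my link, and the memmap shape must match the one used when the embeddings were computed:

```python
import gdown       # pip install gdown
import numpy as np

FILE_ID = "<gdrive-file-id>"  # placeholder: take it from the shared Drive link
gdown.download(
    f"https://drive.google.com/uc?id={FILE_ID}",
    "wiki40b_passages_reps_32_l-8_h-768_b-512-512.dat",
    quiet=False,
)

# Placeholders: must match the passage count and embedding dimension
# used when the file was created.
NUM_PASSAGES, EMBED_DIM = 17_000_000, 128
reps = np.memmap(
    "wiki40b_passages_reps_32_l-8_h-768_b-512-512.dat",
    dtype="float32", mode="r", shape=(NUM_PASSAGES, EMBED_DIM),
)
```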