facebookresearch / ParlAI

A framework for training and evaluating AI models on a variety of openly available dialogue datasets.
https://parl.ai
MIT License

Missing Models #493

Open jaseweston opened 6 years ago

jaseweston commented 6 years ago

This is a list of models not yet in ParlAI that would be great to have. Feel free to add more to the list also! We will remove individual items when they are done.

jsedoc commented 6 years ago

@jaseweston for HRED there's already changes made in Julian's fork of ParlAI (https://github.com/julianser/ParlAI).

theSage21 commented 6 years ago

If nobody is doing bidaf I can add it. It will take me some time though.

jaseweston commented 6 years ago

> If nobody is doing bidaf I can add it. It will take me some time though.

@theSage21 sure that would be great!

theSage21 commented 6 years ago

Should I lay out the code the same way as the drqa system?

jaseweston commented 6 years ago

@alexholdenmiller can give advice, perhaps

uralik commented 6 years ago

@theSage21, I guess right now the best thing is to use TorchAgent as a base class; check the seq2seq agent for an example.

alexholdenmiller commented 6 years ago

Yes, we definitely prefer using the TorchAgent parent class, e.g. how seq2seq, memnn, or example_seq2seq are set up. It eliminates a lot of copy-paste from the model.

alexholdenmiller commented 6 years ago

I should caveat my recommendation: if you're not using PyTorch, there will be a few inefficiencies (e.g. casting the torch tensors into another format), but it will still likely simplify the code. You're certainly welcome to roll it from scratch; the TorchAgent (parlai/core/torch_agent) just includes a lot of basic functionality like remembering the conversation history, vectorizing the text, and putting it into batches to feed into the model.
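
To make that suggestion concrete, here is a minimal sketch of what such an agent could look like on top of TorchAgent. This is not existing ParlAI code: `BidafNet`, `BidafAgent`, the hidden size, and the toy training objective are all illustrative placeholders, and exact attribute/method names may differ between ParlAI versions.

```python
import torch
import torch.nn as nn

from parlai.core.torch_agent import TorchAgent, Output


class BidafNet(nn.Module):
    """Toy placeholder network; a real agent would implement BiDAF here."""

    def __init__(self, vocab_size, hidden_size=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.out = nn.Linear(hidden_size, vocab_size)

    def forward(self, token_ids):
        # (batch, seq_len) -> (batch, seq_len, vocab_size)
        return self.out(self.embed(token_ids))


class BidafAgent(TorchAgent):
    """Skeleton agent: TorchAgent handles history, vectorization and batching."""

    def __init__(self, opt, shared=None):
        super().__init__(opt, shared)
        if shared is None:
            self.model = BidafNet(len(self.dict))
        else:
            self.model = shared['model']
        null_idx = self.dict[self.dict.null_token]
        self.criterion = nn.CrossEntropyLoss(ignore_index=null_idx)
        self.optimizer = torch.optim.Adam(self.model.parameters())

    def share(self):
        shared = super().share()
        shared['model'] = self.model
        return shared

    def train_step(self, batch):
        # batch.text_vec / batch.label_vec are padded LongTensors that
        # TorchAgent has already built from the raw observations.
        # Toy objective for the placeholder net: predict each label token from
        # the previous one (a real BiDAF agent would score answer spans).
        inputs, targets = batch.label_vec[:, :-1], batch.label_vec[:, 1:]
        scores = self.model(inputs)
        loss = self.criterion(scores.reshape(-1, scores.size(-1)), targets.reshape(-1))
        self.optimizer.zero_grad()
        loss.backward()
        self.optimizer.step()

    def eval_step(self, batch):
        scores = self.model(batch.text_vec)
        preds = scores.argmax(dim=-1)
        # Turn predicted token ids back into text for each example.
        return Output([self.dict.vec2txt(p.tolist()) for p in preds])
```

The point of the split is that the observation history, dictionary, and padding/batching all come from the parent class; the subclass only has to say how one batch is trained on and evaluated.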

ricsinaruto commented 5 years ago

Commenting here to let interested people know that I have a somewhat working integration of the VHCR model on this fork: https://github.com/Mrpatekful/ParlAI/tree/dialogwae. VHCR is a state-of-the-art dialog model, and I used the official implementation (https://github.com/ctr4si/A-Hierarchical-Latent-Structure-for-Variational-Conversation-Modeling).

The model is far from done, however; I haven't really tested it yet (the loss does at least seem to go down). I am still working on the generation function at test time, and I haven't even thought about how to integrate beam search yet. I plan to send a PR when I finish these tasks. I am happy to collaborate if anyone is up for it.

alexholdenmiller commented 5 years ago

Thanks for the updates @ricsinaruto!

I wanted to quickly note that @stephenroller landed #1260 two weeks ago, which provides a lot of the wrapping around typical generator code. This makes the seq2seq code at parlai/agents/seq2seq/seq2seq.py remarkably short in the current version, and includes functionality for doing beam search for you. You might find it quite a bit easier to rebase and subclass this new TorchGeneratorAgent (parlai/core/torch_generator_agent.py).
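
For reference, a rough skeleton of such a subclass might look like the following. `VhcrAgent` is a hypothetical name and `build_model` is deliberately left unimplemented; the real work is returning a module that follows the TorchGeneratorModel interface (encoder, decoder, output projection) so the shared generation code can drive it. Method names may vary between ParlAI versions.

```python
from parlai.core.torch_generator_agent import TorchGeneratorAgent


class VhcrAgent(TorchGeneratorAgent):
    """The parent class supplies the training loop, loss bookkeeping,
    greedy decoding and beam search; the subclass mainly supplies the model."""

    def build_model(self):
        # Return a TorchGeneratorModel-style module wired up with an encoder,
        # a decoder, and an output projection over the vocabulary; the
        # generation code in the parent class then drives it directly.
        raise NotImplementedError('plug in the VHCR encoder/decoder here')
```

Once build_model returns a compatible model, the beam search functionality mentioned above comes from the shared agent rather than from model-specific decoding code.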

ricsinaruto commented 5 years ago

Yeah, I knew about that, but thanks for bringing it to my attention. So far I have actually subclassed the seq2seq agent because it has a lot of functionality, but I will change it to this new generator agent as it would be cleaner.

alexholdenmiller commented 5 years ago

Yes in the master branch nearly all of the functionality you were using in your fork has been moved to the TorchGeneratorAgent, actually!

github-actions[bot] commented 4 years ago

This issue has not had activity in 30 days. Marking as stale.

agilebean commented 3 years ago

In my experiments with BlenderBot 1.0, the 1B was nearly as fast as the 400M model but showed much better conversational performance. The 1B was also much faster than the 3B model.

Therefore, may I ask the BlenderBot 2.0 team (@stephenroller, @alexholdenmiller, et al.): is there any chance you would consider releasing a 1B model for BlenderBot 2.0 as well soon?

I guess this might benefit many other people as well :)