facebookresearch / ParlAI

A framework for training and evaluating AI models on a variety of openly available dialogue datasets.
https://parl.ai
MIT License

How to call query generator with multi-turn data? #4837

Closed yonatanbitton closed 1 year ago

yonatanbitton commented 1 year ago

Hi. I want to invoke the query generator with multi-turn data. For example, I have the following dialogue: ["Hi, I love Joe Rogan", "Me too!", "How old is he?"]. How can I use a query generation model for inference on this dialogue? I currently use this command line: parlai interactive -mf zoo:blenderbot2/query_generator/model, which only allows feeding a single turn, not multiple turns.

An additional question: what is the Python code for model inference? For example, to run model inference in some kind of for-loop. I didn't find relevant code in the documentation or in the ParlAI tutorial; the tutorial Colab only allows interaction through Interactive.main.

Thanks

mojtaba-komeili commented 1 year ago

Starting with your second question, as it contains part of the answer to the first one as well. ParlAI lets you run inference through its command-line tools. If you want to generate metrics or examples for a model that already exists in ParlAI, all you need to do is prepare your data as a task/teacher and run it with your model (check out eval_model in ParlAI). Now, about the first question: the training data for this model comes from SearchQueryTeacher. You can see the model's text input and output by running

parlai display_data --task wizard_of_internet:SearchQueryTeacher

You will see that the conversation turns are concatenated into a single string, joined with the \n character.
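Putting the two answers together, a minimal Python sketch of multi-turn inference might look like the following. It assumes ParlAI's standard agent API (create_agent_from_model_file plus the observe/act loop) and the zoo model path from the original command; the helper names join_turns and generate_query are mine, and running the guarded section would download the zoo model on first use.

```python
def join_turns(turns):
    """Concatenate dialogue turns into the single newline-joined string
    the query generator was trained on (see SearchQueryTeacher)."""
    return "\n".join(turns)


def generate_query(agent, turns):
    """Feed a multi-turn context to a ParlAI agent and return its reply text."""
    agent.reset()  # clear any history from a previous dialogue
    # episode_done=True marks this observation as a complete episode
    agent.observe({"text": join_turns(turns), "episode_done": True})
    return agent.act()["text"]


if __name__ == "__main__":
    # Hypothetical usage; triggers a model download on first run.
    from parlai.core.agents import create_agent_from_model_file

    agent = create_agent_from_model_file("zoo:blenderbot2/query_generator/model")
    turns = ["Hi, I love Joe Rogan", "Me too!", "How old is he?"]
    print(generate_query(agent, turns))
```

To run inference in a loop over many dialogues, call generate_query once per dialogue; resetting the agent each time keeps the episodes independent.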

github-actions[bot] commented 1 year ago

This issue has not had activity in 30 days. Please feel free to reopen if you have more issues. You may apply the "never-stale" tag to prevent this from happening.