**Open** — MdYeasinSamadArnob opened this issue 2 years ago
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("voidful/bart-distractor-generation-pm")
model = AutoModelForSeq2SeqLM.from_pretrained("voidful/bart-distractor-generation-pm")

# Input format: context </s> question </s> answer
example = "When you ' re having a holiday , one of the main questions to ask is which hotel or apartment to choose . However , when it comes to France , you have another special choice : treehouses . In France , treehouses are offered to travelers as a new choice in many places . The price may be a little higher , but you do have a chance to _ your childhood memories . Alain Laurens , one of France ' s top treehouse designers , said , ' Most of the people might have the experience of building a den when they were young . And they like that feeling of freedom when they are children . ' Its fairy - tale style gives travelers a special feeling . It seems as if they are living as a forest king and enjoying the fresh air in the morning . Another kind of treehouse is the ' star cube ' . It gives travelers the chance of looking at the stars shining in the sky when they are going to sleep . Each ' star cube ' not only offers all the comfortable things that a hotel provides for travelers , but also gives them a chance to look for stars by using a telescope . The glass roof allows you to look at the stars from your bed . </s> The passage mainly tells us </s> treehouses in france."

# Pass the `example` variable, not the literal string "example".
outputs = model.generate(
    tokenizer.encode(example, return_tensors='pt'),
    num_return_sequences=3,
    num_beams=3,
)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
```
You can refer to this notebook: https://github.com/voidful/BDG/blob/main/BDG_selection.ipynb
Hey, if I give context, question, and answer as input, I get only one distractor. Is it possible to get 3 distractors?
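With beam search, setting `num_return_sequences=3` and `num_beams=3` in `model.generate` (as in the snippet above) should yield three candidate sequences, each of which can be decoded as one distractor. A minimal sketch, assuming the model expects the `context </s> question </s> answer` input format shown above; `build_bdg_input` is a hypothetical helper name, and the commented generation call assumes `tokenizer` and `model` are loaded as in the earlier snippet:

```python
def build_bdg_input(context: str, question: str, answer: str) -> str:
    """Join context, question, and answer with the </s> separators the model expects.

    Hypothetical helper; the separator layout follows the example input above.
    """
    return f"{context} </s> {question} </s> {answer}"

# Requesting three distractors (assumes `tokenizer` and `model` are already loaded):
# outputs = model.generate(
#     tokenizer.encode(build_bdg_input(ctx, q, a), return_tensors='pt'),
#     num_beams=3,             # beam width must be >= num_return_sequences
#     num_return_sequences=3,  # one decoded sequence per distractor
# )
# distractors = tokenizer.batch_decode(outputs, skip_special_tokens=True)
```

Note that the three beams may still be near-duplicates; raising `num_beams` above `num_return_sequences` or adding sampling can increase diversity.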