Closed yanshuaibupt closed 1 year ago
Thanks for your excellent work! When I load the tokenizer and model from Hugging Face, how can I use text as input to infer/generate a question?

```python
input_text = (
    "Harry Potter is a series of seven fantasy novels written by "
    "British author, [HL]J. K. Rowling[HL]."
)
# Tokenize the input; the [HL] tokens mark the answer span to generate a question about.
inp = tokenizer(input_text, return_tensors="pt")
# Generate the question and decode it back to text.
generation_output = model.generate(inputs=inp["input_ids"].to(model.device))
response = tokenizer.decode(generation_output[0], skip_special_tokens=True)
```
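For a self-contained sketch of the loading step the question asks about, the standard `transformers` Auto classes can be used. Note that the project's actual Hugging Face checkpoint name is not given in this thread, so `t5-small` below is only a placeholder stand-in:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# NOTE: "t5-small" is a placeholder for illustration only;
# substitute the project's actual Hugging Face model ID.
model_name = "t5-small"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

input_text = (
    "Harry Potter is a series of seven fantasy novels written by "
    "British author, [HL]J. K. Rowling[HL]."
)
# Tokenize, generate, and decode exactly as in the snippet above.
inp = tokenizer(input_text, return_tensors="pt")
generation_output = model.generate(inputs=inp["input_ids"].to(model.device))
response = tokenizer.decode(generation_output[0], skip_special_tokens=True)
print(response)
```

With a question-generation checkpoint (rather than the placeholder), `response` would be the generated question about the highlighted span.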