Langboat / Mengzi

Mengzi Pretrained Models
Apache License 2.0
534 stars 63 forks

What is the input format for the model to automatically generate marketing copy? #16

Closed Nipi64310 closed 3 years ago

Nipi64310 commented 3 years ago

Hello Langboat, thanks for sharing this great work. Regarding the automatically generated marketing copy in the paper:

Given the input title and keywords, the models are required to generate a corresponding descriptive passage

What is the input to the model? Is it in the form of [CLS] title [SEP] [keywords1, keywords2, keywords3, keywords4] [SEP] [kg11, kg12, kg13] [kg21, kg22, kg23]?

windysavage commented 3 years ago

I want to ask the same question~~ What is the input format for the "conditional text generation task"?

Ag2S1 commented 3 years ago

Hi @Nipi64310, the marketing copywriting demo is based on the T5 architecture; [CLS] and [SEP] are special tokens from BERT and do not exist in T5.

You need to follow Google's T5 practice of converting your task into seq2seq form.

The model we have open-sourced shares the same architecture as T5 1.1 and does not include any downstream tasks. So if you want to build a demo similar to ours, you need to prepare the following data:

  1. the title and body of the marketing copy
  2. the keywords mentioned in the main text
  3. the knowledge graph of the marketing domain

Using the above data, construct training text pairs. Text pairs can take various forms, and we are still exploring which works best. Here is an example:

Input:
"title|keyword1, keyword2, keyword3|<entityA, relationX, entityB>, <entityC, relationY, entityD>"
Output: 
"body of the text"
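
To make the format above concrete, here is a minimal sketch of assembling one such training pair in Python. The helper name, argument layout, and triple representation are my assumptions for illustration, not Mengzi's actual preprocessing code:

```python
# Hypothetical helper (not from the Mengzi repo) that builds the
# "|"-separated source string described above and pairs it with the body.

def build_t5_pair(title, keywords, triples, body):
    """Return a (source, target) text pair for T5-style seq2seq training.

    title    -- the marketing copy title (str)
    keywords -- keywords mentioned in the body (list of str)
    triples  -- knowledge-graph triples as (entity, relation, entity) tuples
    body     -- the target passage the model should generate (str)
    """
    kg = ", ".join("<%s, %s, %s>" % t for t in triples)
    source = "|".join([title, ", ".join(keywords), kg])
    return source, body

src, tgt = build_t5_pair(
    "title",
    ["keyword1", "keyword2", "keyword3"],
    [("entityA", "relationX", "entityB"), ("entityC", "relationY", "entityD")],
    "body of the text",
)
print(src)
# title|keyword1, keyword2, keyword3|<entityA, relationX, entityB>, <entityC, relationY, entityD>
```

The resulting `src`/`tgt` strings would then be tokenized and fed to the model as an ordinary seq2seq example, with no BERT-style special tokens.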
Ag2S1 commented 3 years ago

If you have more questions, feel free to reopen this issue.