microsoft / SpeechT5

Unified-Modal Speech-Text Pre-Training for Spoken Language Processing

Single Task Training #77

Closed yangjiabupt closed 7 months ago

yangjiabupt commented 7 months ago

I would like to know how you ensure prompt diversity within a single task, as well as the diversity across tasks during the multitask phase. Did you use a fixed set of prompts for each task, or did you generate batches of prompts with GPT?

XiaoshanHsj commented 7 months ago

We used a fixed number of GPT-4-generated prompts for each of the single-instruction and multi-instruction tasks.
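
A minimal sketch of what such a setup might look like. The pool contents, task names, and helper `sample_prompt` are illustrative assumptions, not taken from the SpeechT5 codebase: each task keeps a fixed pool of GPT-4-generated instruction templates, and one template is sampled per training example.

```python
import random

# Hypothetical fixed pools of GPT-4-generated instruction templates per task;
# the actual prompts and pool sizes are not specified in this thread.
PROMPT_POOLS = {
    "asr": [
        "Transcribe the following speech into text.",
        "Write down exactly what is said in this audio clip.",
    ],
    "tts": [
        "Read the following text aloud.",
        "Convert this sentence into natural-sounding speech.",
    ],
}

def sample_prompt(task: str) -> str:
    """Pick one instruction template uniformly from the task's fixed pool."""
    return random.choice(PROMPT_POOLS[task])

# During single-task training, every example draws from its task's pool;
# during multitask training, the task is chosen first and a prompt is then
# sampled the same way.
if __name__ == "__main__":
    example = {"task": "asr", "audio": "utt_001.wav"}
    print(sample_prompt(example["task"]))
```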