microsoft / SpeechT5

Unified-Modal Speech-Text Pre-Training for Spoken Language Processing
MIT License

Single Task Training #77

Closed yangjiabupt closed 2 months ago

yangjiabupt commented 2 months ago

I would like to know how you ensure the diversity of prompts within a single task, as well as the diversity across tasks during the multitask phase. Did you use a fixed set of prompts for each task, or did you generate them in batches with GPT?

XiaoshanHsj commented 2 months ago

We used a fixed set of GPT-4-generated prompts for each task, in both the single-instruction and multi-instruction settings.
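
For readers wondering how a fixed prompt pool per task might be wired into a training pipeline, here is a minimal sketch. This is not the repo's actual code: the names `PROMPT_POOL` and `sample_instruction`, and the example prompts themselves, are hypothetical illustrations of the idea described above (a fixed, pre-generated instruction set per task, sampled per example).

```python
import random

# Hypothetical prompt pools: a fixed, GPT-4-generated set of instruction
# templates per task. The entries below are illustrative placeholders.
PROMPT_POOL = {
    "asr": [
        "Transcribe the following speech into text.",
        "Write down what is said in this audio clip.",
    ],
    "tts": [
        "Read the following text aloud.",
        "Synthesize speech for this sentence.",
    ],
}

def sample_instruction(task: str, rng: random.Random) -> str:
    """Uniformly sample one instruction from the task's fixed prompt pool."""
    return rng.choice(PROMPT_POOL[task])

# During multitask training, each example keeps its task label and draws a
# prompt from that task's pool, so prompt variety is bounded by the fixed
# pool size rather than being generated on the fly.
rng = random.Random(0)
print(sample_instruction("asr", rng))
print(sample_instruction("tts", rng))
```

Since each pool is fixed ahead of time, the same pools can be reused verbatim between the single-task and multitask phases; only the task mixture changes.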