BlinkDL / RWKV-LM

RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable). So it combines the best of RNNs and transformers: great performance, fast inference, low VRAM use, fast training, "infinite" ctx_len, and free sentence embedding.
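The dual RNN/GPT claim above can be illustrated with a minimal, hedged sketch of an RWKV-style weighted-average ("WKV") step: the same output can be computed recurrently with an O(1) state per step (RNN inference mode) or as a full attention-like sum over all earlier positions (parallelizable GPT-style training). The names `w` (decay), `k`, and `v` are illustrative only; the real RWKV kernel adds a per-channel "bonus" term for the current token and numerical-stability tricks omitted here.

```python
import numpy as np

def wkv_recurrent(w, k, v):
    # RNN mode: carry two scalar accumulators instead of the whole history.
    num, den, out = 0.0, 0.0, []
    for t in range(len(k)):
        num = np.exp(-w) * num + np.exp(k[t]) * v[t]
        den = np.exp(-w) * den + np.exp(k[t])
        out.append(num / den)
    return np.array(out)

def wkv_parallel(w, k, v):
    # GPT mode: each position t attends to positions i <= t with
    # exponentially decaying weights exp(-(t - i) * w + k[i]).
    T = len(k)
    out = np.empty(T)
    for t in range(T):
        wts = np.exp(-(t - np.arange(t + 1)) * w + k[: t + 1])
        out[t] = wts @ v[: t + 1] / wts.sum()
    return out

k = np.array([0.1, -0.3, 0.5, 0.2])
v = np.array([1.0, 2.0, -1.0, 0.5])
print(np.allclose(wkv_recurrent(0.4, k, v), wkv_parallel(0.4, k, v)))  # True
```

Both modes produce identical outputs, which is why the model can be trained in parallel like a transformer yet deployed as a constant-memory RNN.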
Apache License 2.0

RWKV for Text to Speech use case #222

Closed — rishikksh20 closed this 7 months ago

rishikksh20 commented 8 months ago

Hi @BlinkDL, have you ever experimented with how RWKV performs for speech use cases such as text to speech?

BlinkDL commented 7 months ago

A community member tried this. Join our Discord at https://discord.gg/bDSBUMeFpc and check https://discord.com/channels/992359628979568762/1016606571993763861/1180965095589806080

rishikksh20 commented 7 months ago

Thanks @BlinkDL