minyoungg / vqtorch

MIT License

Request for new features: Diverse codebook size in RVQ #8

Open Jyonn opened 9 months ago

Jyonn commented 9 months ago

Thanks for your contribution in proposing this inspiring work. I would like to request support for different codebook sizes across levels in residual quantization, together with a weighted loss function. This might conflict with the `share` attribute, but I think it would be a reasonable extension. Also, since the generated codes are produced in a coarse-to-fine manner, the primary (first-level) codes should carry a larger loss weight during training.
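To illustrate the request, here is a minimal sketch of what such an extension could look like. This is purely hypothetical and does not use vqtorch's actual API: `MultiSizeRVQ`, the `codebook_sizes` list, and the `loss_decay` weighting scheme are all names invented for this example. Each residual level gets its own codebook size, and the quantization loss of earlier (coarser) levels is weighted more heavily.

```python
import torch

class MultiSizeRVQ(torch.nn.Module):
    """Hypothetical residual VQ sketch (not vqtorch's API):
    per-level codebook sizes + coarse-to-fine loss weighting."""

    def __init__(self, dim, codebook_sizes, loss_decay=0.5):
        super().__init__()
        # one independently sized codebook per residual level
        self.codebooks = torch.nn.ParameterList(
            torch.nn.Parameter(torch.randn(n, dim)) for n in codebook_sizes
        )
        # earlier levels get larger weights, e.g. 1.0, 0.5, 0.25, ...
        self.weights = [loss_decay ** i for i in range(len(codebook_sizes))]

    def forward(self, x):
        residual = x
        quantized = torch.zeros_like(x)
        loss = x.new_zeros(())
        for w, cb in zip(self.weights, self.codebooks):
            # nearest-code lookup for the current residual
            dists = torch.cdist(residual, cb)      # (batch, n_codes)
            q = cb[dists.argmin(dim=-1)]           # (batch, dim)
            quantized = quantized + q
            # weighted quantization loss: coarse levels count more
            loss = loss + w * torch.mean((q - residual.detach()) ** 2)
            residual = residual - q.detach()
        # straight-through estimator on the summed codes
        return x + (quantized - x).detach(), loss
```

For example, `MultiSizeRVQ(dim=8, codebook_sizes=[256, 64, 16])` would spend most of its capacity on the first, coarsest level, while the decaying weights keep that level's loss dominant during training.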

I hope you and your team can consider these two features. I believe other quantization variants (e.g., product quantization) would also be compatible with them.