codecaution / Awesome-Mixture-of-Experts-Papers

A curated reading list of research in Mixture-of-Experts (MoE).

Fix an arXiv link #11

Open Spico197 opened 1 year ago

Spico197 commented 1 year ago

The link to *A Review of Sparse Expert Models in Deep Learning* should be https://arxiv.org/abs/2209.01667
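
For reference, the corrected README entry might look like the sketch below. Only the paper title and URL come from this issue; the list-item and link formatting are assumptions about how the reading list is laid out:

```markdown
<!-- Hypothetical entry; bullet style and link placement assumed, URL from this issue -->
- A Review of Sparse Expert Models in Deep Learning [[paper](https://arxiv.org/abs/2209.01667)]
```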