kyegomez / LIMoE

Implementation of "the first large-scale multimodal mixture of experts models," from the paper "Multimodal Contrastive Learning with LIMoE: the Language-Image Mixture of Experts".
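To give a feel for the core idea the paper builds on, below is a minimal sketch of a token-level, top-1 sparse mixture-of-experts layer. Class names, parameters, and shapes are illustrative assumptions only and do not reflect this repository's actual API.

```python
# Minimal sketch of a sparse mixture-of-experts (MoE) layer of the kind
# LIMoE builds on. All names and shapes here are illustrative, not the
# repository's actual API.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleMoE(nn.Module):
    """Token-level top-1 routing over a set of feed-forward experts."""

    def __init__(self, dim: int, num_experts: int = 4, hidden_dim: int = 256):
        super().__init__()
        self.router = nn.Linear(dim, num_experts)  # gating network
        self.experts = nn.ModuleList(
            [
                nn.Sequential(
                    nn.Linear(dim, hidden_dim),
                    nn.GELU(),
                    nn.Linear(hidden_dim, dim),
                )
                for _ in range(num_experts)
            ]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim); each token is routed to its top-1 expert
        gates = F.softmax(self.router(x), dim=-1)   # (batch, seq_len, num_experts)
        top_gate, top_idx = gates.max(dim=-1)       # (batch, seq_len)
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            mask = top_idx == e                     # tokens assigned to expert e
            if mask.any():
                out[mask] = expert(x[mask]) * top_gate[mask].unsqueeze(-1)
        return out


if __name__ == "__main__":
    tokens = torch.randn(2, 16, 128)                # e.g. image or text tokens
    print(SimpleMoE(dim=128)(tokens).shape)         # torch.Size([2, 16, 128])
```

In LIMoE, a single such expert pool is shared across image and text tokens inside one contrastively trained model; the sketch above only shows the routing mechanism itself.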
https://discord.gg/47ENfJQjMq
MIT License