premAI-io / state-of-open-source-ai

:closed_book: Clarity in the current fast-paced mess of Open Source innovation
https://book.premai.io/state-of-open-source-ai

Mixture of Experts (MoE) #122

Closed Anindyadeep closed 10 months ago

Anindyadeep commented 10 months ago

Type

new chapter

Chapter/Page

models

Description

Although MoE has existed in general deep learning for a while, Mistral has just released their second model, which uses MoE, so it is now important to cover it in the book as well. A minimal sketch of how an MoE layer works is included below.
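
For context on what the proposed chapter would cover, here is a minimal sketch of a sparse Mixture of Experts layer in PyTorch. The hyperparameters (`num_experts`, `top_k`, hidden sizes) are illustrative placeholders, not taken from any particular released model.

```python
# Minimal sparse MoE layer sketch: a router picks the top-k experts per token
# and combines their outputs with the normalised router scores.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, d_model=512, d_ff=1024, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Each expert is a small feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )
        # The router scores every token against every expert.
        self.router = nn.Linear(d_model, num_experts)

    def forward(self, x):  # x: (batch, seq, d_model)
        logits = self.router(x)                         # (batch, seq, num_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)  # top-k experts per token
        weights = F.softmax(weights, dim=-1)            # normalise selected scores
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[..., k] == e                 # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[..., k][mask].unsqueeze(-1) * expert(x[mask])
        return out

tokens = torch.randn(2, 16, 512)
print(MoELayer()(tokens).shape)  # torch.Size([2, 16, 512])
```

Because only `top_k` of the `num_experts` feed-forward networks run per token, the layer keeps per-token compute low while the total parameter count grows with the number of experts, which is the core appeal of MoE architectures.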