SkunkworksAI / hydra-moe


Immediate To-Dos: #5

Open pharaouk opened 1 year ago

pharaouk commented 1 year ago

Immediate To-Dos:

- Improve the Multi-LoRA PEFT class extension code (@sumo already has an implementation and will push it shortly).
- Standardize gating to enable flexible switching of expert adapters from a larger database of adapters (likely via centroid/similarity measures).
- Build a UI to run MoE inference and base-model inference side by side (with streaming and a display of the experts selected during inference).
- Simplify the process of fine-tuning new experts and adding them to the MoE architecture.
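For the gating item, a minimal sketch of what centroid/similarity routing could look like: each expert adapter in the database is summarized by a centroid embedding of its training data, and at inference time the query embedding is compared against those centroids by cosine similarity to pick the top-k adapters to activate. The function name, the top-k scheme, and the use of precomputed centroids are all assumptions for illustration, not the project's actual implementation.

```python
import numpy as np

def select_experts(query_emb: np.ndarray, centroids: np.ndarray, top_k: int = 2):
    """Hypothetical gating sketch: rank expert adapters by cosine
    similarity between the query embedding and each adapter's centroid,
    and return the indices (and scores) of the top_k matches.

    query_emb: shape (d,) embedding of the incoming prompt.
    centroids: shape (n_experts, d), one centroid per adapter.
    """
    # Normalize so the dot product equals cosine similarity.
    q = query_emb / np.linalg.norm(query_emb)
    c = centroids / np.linalg.norm(centroids, axis=1, keepdims=True)
    sims = c @ q                       # shape (n_experts,)
    order = np.argsort(sims)[::-1]     # descending by similarity
    top = order[:top_k]
    return top, sims[top]
```

A router like this keeps the adapter pool open-ended: adding a new fine-tuned expert only requires computing and storing its centroid, with no retraining of a learned gate.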

lpietrobon commented 1 year ago

I'll make separate issues for each of these, so that we can address them one by one