Like the title says, is there any plan, or the ability, to support MoE models like Mixtral 8x7B?

(This issue was closed by arronKler 7 months ago.)
Hi! We definitely have the ability to support Mixtral and other MoE models (Hivemind, the library for decentralized DL used by Petals, was initially designed for mixtures of experts), but the team currently does not have enough bandwidth to implement them in Petals right away. I might have some time over the holidays to work on it, but if you (or someone else from the community) is willing to contribute that, it will probably be much faster.
+1
+1
Hello,
I'm trying to implement Mixtral 8x7B following this guide: https://github.com/bigscience-workshop/petals/wiki/Run-a-custom-model-with-Petals
I have some questions about implementing the block.py and model.py files. Could you give me some guidance?
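To show where I'm stuck, here is a rough sketch of what I imagine block.py could look like: just a thin wrapper around MixtralDecoderLayer from transformers. The class name WrappedMixtralBlock and the pass-through forward() are my own guesses from reading the guide, not the actual Petals interface, so please correct me if this is the wrong direction:

```python
# Rough sketch only -- the class name and structure are my guesses, not the Petals API.
import torch
from transformers import MixtralConfig
from transformers.models.mixtral.modeling_mixtral import MixtralDecoderLayer


class WrappedMixtralBlock(MixtralDecoderLayer):
    """One Mixtral decoder layer (attention + sparse MoE MLP), wrapped so that
    Petals could serve it as a standalone block."""

    def __init__(self, config: MixtralConfig, layer_idx: int):
        super().__init__(config, layer_idx)

    def forward(self, hidden_states: torch.Tensor, **kwargs):
        # Delegate to the underlying transformers layer. A real Petals block
        # presumably also needs to handle attention caches and rotary embeddings
        # in the format Petals expects -- this is the part I'm unsure about.
        return super().forward(hidden_states, **kwargs)


# Quick sanity check with a tiny config (not the real 8x7B sizes):
if __name__ == "__main__":
    tiny = MixtralConfig(
        hidden_size=64, intermediate_size=128, num_hidden_layers=2,
        num_attention_heads=4, num_key_value_heads=2,
        num_local_experts=4, num_experts_per_tok=2,
    )
    block = WrappedMixtralBlock(tiny, layer_idx=0)
    print(sum(p.numel() for p in block.parameters()), "parameters in one block")
```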
I would be very interested in contributing to the project.
Thank you.
Hello!
We added and merged support for Mixtral models: https://github.com/bigscience-workshop/petals/pull/553.
Just update your servers to the new version of Petals.
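For reference, client-side usage after updating should look roughly like the usual Petals example below; this is just a sketch, and the model ID is only an example (check the README and PR #553 for what is actually served):

```python
# Sketch of client-side usage after updating to a Petals version with Mixtral support.
from transformers import AutoTokenizer
from petals import AutoDistributedModelForCausalLM

model_name = "mistralai/Mixtral-8x7B-Instruct-v0.1"  # example model ID

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoDistributedModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("The Mixture-of-Experts architecture", return_tensors="pt")["input_ids"]
outputs = model.generate(inputs, max_new_tokens=16)
print(tokenizer.decode(outputs[0]))

# Servers join the swarm roughly like this (see the README for the exact flags):
#   python -m petals.cli.run_server mistralai/Mixtral-8x7B-Instruct-v0.1
```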