Closed — jontstaz closed this issue 9 months ago
Hi @jontstaz,
You can use any model based on Llama, Falcon, and Bloom and stored in the standard HF Transformers-compatible format.
This includes all WizardCoder variants, e.g., the WizardLM/WizardCoder-Python-34B-V1.0 repo.
However, to run it you need enough GPUs connected to the swarm to host all parts of the model. If you don't have enough GPUs yourself, you can join our Discord and ask whether anyone else is interested in sharing their GPUs to host this model (I think there were a few people asking about WizardCoder before).
The model will show up at https://health.petals.dev once someone starts hosting any of its parts.
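Once the model is hosted, it can be used from a client like any other Petals model. A minimal sketch (untested here, since it needs a live swarm actually serving this checkpoint) using Petals' `AutoDistributedModelForCausalLM`:

```python
# Sketch: running a swarm-hosted WizardCoder checkpoint through Petals.
# Assumes peers are already serving WizardLM/WizardCoder-Python-34B-V1.0;
# otherwise from_pretrained() will wait for servers to appear.
from transformers import AutoTokenizer
from petals import AutoDistributedModelForCausalLM

model_name = "WizardLM/WizardCoder-Python-34B-V1.0"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoDistributedModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("def fibonacci(n):", return_tensors="pt")["input_ids"]
outputs = model.generate(inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0]))
```

To contribute a GPU hosting some of the model's blocks, the standard Petals server entry point should work the same way as for other Llama-based models, e.g. `python -m petals.cli.run_server WizardLM/WizardCoder-Python-34B-V1.0`.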
Let us know if you have other questions!
Hi,
Are all models available through Petals, or only the ones listed on the website? For example, I'd love to try out WizardCoder via Petals, but I can't see it on the health.petals.dev web page.
Thanks in advance,