dan-garvey closed this 2 months ago
LGTM, though it might be good to count how many models rely on a given architecture (e.g. LLaMa-2) and include that number to quantify how many models these "representative" cases cover, instead of this relatively small list, which makes it seem like we support only 9 different HF model IDs.
e.g.
| Huggingface Model Architecture | SHARK-Turbine |
| --- | --- |
| microsoft/resnet models (15 models) | 💚 |
| stablediffusionapi SD1.5 models (> 600 checkpoints) | 💚 |
| stabilityAI SD2.1 models | 💚 |
| BERT based models (22000+ models!) | 💚 |

Something like that for the README, and then we back up a subset with CI.
That's a great point, I'll update this.
@powderluv I don't think it makes sense to add a bunch of variants of the same transformer base here, but let me know if you disagree. I was thinking we could keep a more exhaustive list elsewhere.