huggingface / model-evaluator

Evaluate Transformers from the Hub 🔥
https://huggingface.co/spaces/autoevaluate/model-evaluator
Apache License 2.0

Configure larger disks for bigger models #59

Closed mathemakitten closed 2 years ago

mathemakitten commented 2 years ago

Minimal example test:

```python
# Only models that need more than the default disk get an explicit entry.
DISK_NEEDED_FOR_LARGE_MODELS = {"opt-66b": 200}
selected_models = ["opt-66b", "opt-13b"]
# Sum the known sizes; filter(None, ...) drops models without an entry
# (dict.get returns None for those).
size_of_models_on_disk = sum(filter(None, [DISK_NEEDED_FOR_LARGE_MODELS.get(model) for model in selected_models]))
max(size_of_models_on_disk, 150)  # 200
```

mathemakitten commented 2 years ago

I'm in favor of just making the disk space large enough for all zero shot submissions, so we don't need the large model dict. Disk space shouldn't cost that much anyway unless I'm missing something.

OK! In that case I've just bumped the default up to 200 GB, which covers 145 GB for the 66B model plus headroom for the system, saving predictions, etc. It also assumes we'll route requests for models larger than 66B elsewhere for now; we can change this when we start supporting 175B+ inference. Let me know if that works.
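A minimal sketch of what dropping the large-model dict in favor of a single bumped default could look like. The names (`DEFAULT_DISK_GB`, `required_disk_gb`) are illustrative, not from the repo:

```python
# Hypothetical sketch: provision one default disk size large enough for all
# supported zero-shot submissions, instead of per-model overrides.
DEFAULT_DISK_GB = 200  # covers ~145 GB for opt-66b plus system/predictions headroom


def required_disk_gb(selected_models, overrides=None):
    """Return the disk size (GB) to provision for a batch of models.

    `overrides` maps model names to explicit sizes for anything that
    still needs more than the default; models without an entry are
    assumed to fit within DEFAULT_DISK_GB.
    """
    overrides = overrides or {}
    known = [overrides.get(m) for m in selected_models]
    needed = sum(size for size in known if size is not None)
    return max(needed, DEFAULT_DISK_GB)


print(required_disk_gb(["opt-66b", "opt-13b"]))  # 200 (the default already suffices)
print(required_disk_gb(["opt-13b"]))             # 200 (falls back to the default)
```

With the default at 200 GB, the overrides dict stays empty until something larger than 66B is supported, which matches the plan to route bigger models elsewhere for now.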