berkeley-hipie / HIPIE

[NeurIPS2023] Code release for "Hierarchical Open-vocabulary Universal Image Segmentation"
https://people.eecs.berkeley.edu/~xdwang/projects/HIPIE/
MIT License

How long will model training take? #9

Open wuxuanttt opened 11 months ago

KKallidromitis commented 11 months ago

Hi, training time can vary greatly depending on your setup (e.g., number and size of GPUs, batch size, and datasets). If you are referring to pre-training on our setup, 8 A100 80GB GPUs with a batch size of 2 per GPU, it takes around 15 days.
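For anyone budgeting a reproduction, a quick back-of-the-envelope sketch of the numbers in the reply above (8 GPUs, batch size 2 per GPU, ~15 days); the derived quantities are illustrative only, not figures from the HIPIE authors:

```python
# Illustrative arithmetic based on the setup described in the reply.
num_gpus = 8        # A100 80GB GPUs
batch_per_gpu = 2   # batch size per GPU
days = 15           # approximate pre-training wall-clock time

# Effective global batch size seen by each optimizer step
effective_batch = num_gpus * batch_per_gpu  # 16

# Total compute budget in GPU-days
gpu_days = num_gpus * days  # 120

print(f"Effective global batch size: {effective_batch}")
print(f"Compute budget: {gpu_days} GPU-days")
```

Scaling to fewer or smaller GPUs would typically mean a smaller effective batch size (possibly requiring learning-rate or gradient-accumulation adjustments) and proportionally longer wall-clock time.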