Luffy03 / Large-Scale-Medical

[CVPR 2024 Extension] Datasets with 160K volumes (42M slices), new segmentation datasets, 31M-1.2B pre-trained models, various pre-training recipes, and implementations of 50+ downstream tasks
Apache License 2.0

Data size for all downstream datasets #6

Closed cheliu-computation closed 1 month ago

cheliu-computation commented 1 month ago

First, thanks for your impressive work and huge contribution to the MedIA community!

Since you have uploaded the datasets for 50+ downstream tasks to one HF repo, may I ask how much disk space we should prepare if we want to download all of the datasets at once?

Luffy03 commented 1 month ago

Hi, thank you very much for your attention, and sorry for my late reply. The downstream datasets require about 13.5 TB of storage in total, of which the CT-RATE dataset accounts for 12 TB. So if you only want the other datasets (everything except CT-RATE), you need to prepare about 1.5 TB of storage.

Luffy03 commented 1 month ago

Hi, sorry, I forgot to count autopet. It requires ~100 GB of extra storage.
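Putting the figures from the replies above together, a quick way to sanity-check your disk before kicking off a multi-terabyte download is a short Python snippet. This is just a sketch: the size constants come from this thread, and the function name `has_enough_space` is hypothetical, not part of the repo.

```python
import shutil

# Approximate storage requirements (in TB), taken from the replies above.
# autopet adds roughly 0.1 TB (~100 GB) on top of the other downstream sets.
REQUIRED_TB = {
    "all_downstream": 13.5,            # includes CT-RATE
    "ct_rate": 12.0,
    "downstream_without_ct_rate": 1.5,
    "autopet": 0.1,
}

def has_enough_space(path, required_tb):
    """Return True if the filesystem holding `path` has at least
    `required_tb` terabytes free (using 1 TB = 10**12 bytes)."""
    free_bytes = shutil.disk_usage(path).free
    return free_bytes >= required_tb * 10**12

# Example: check the current directory before downloading everything
# except CT-RATE, plus autopet (~1.6 TB total).
needed = REQUIRED_TB["downstream_without_ct_rate"] + REQUIRED_TB["autopet"]
print(has_enough_space(".", needed))
```

Running the check up front avoids a partially downloaded 12 TB dataset failing midway on a full disk.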

cheliu-computation commented 1 month ago

Thanks a lot!