bowang-lab / scGPT

https://scgpt.readthedocs.io/en/latest/
MIT License

Questions about memory, CPU, and GPU requirements for pre-training and fine-tuning #158


pgrosu commented 6 months ago

Hi scGPT Team,

I really enjoyed reading the paper. It seems the software requires a significant amount of resources, and I was wondering if you might have information on the following:

What are the CPU, memory, and GPU requirements for pre-training and fine-tuning under the following analysis criteria?

Analysis Criteria

The number of genes can vary between 10,000 and 30,000, and the number of cells between 10,000 and 10 million.

If you have timing information on how long each might take, that would be helpful to know so I can plan my runs properly.

Thank you, Paul
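While waiting for official numbers, one way to approach the memory question is a back-of-envelope estimate of training memory from the parameter count. The sketch below assumes plain fp32 Adam training (weights + gradients + two optimizer moments, roughly 16 bytes per parameter, excluding activations); the 50M parameter count is a hypothetical placeholder, not scGPT's actual size.

```python
# Back-of-envelope GPU memory estimate for fine-tuning a transformer.
# The numbers here are illustrative assumptions, NOT scGPT's actual
# configuration -- substitute real values from the model config.

def estimate_train_mem_gb(n_params, bytes_per_param=4, state_factor=4):
    """Rough training-memory estimate, excluding activations.

    fp32 weights (4 B/param) + gradients (4 B/param) + Adam moments
    (8 B/param) => ~16 B/param, i.e. bytes_per_param * state_factor.
    """
    total_bytes = n_params * bytes_per_param * state_factor
    return total_bytes / 1024**3

# Hypothetical 50M-parameter model:
print(f"{estimate_train_mem_gb(50_000_000):.2f} GB (excl. activations)")
```

Activation memory scales with batch size and sequence length (here, the number of genes per cell), so the real peak can be several times this floor.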

dburkhardt commented 5 months ago

I'm also curious to know more about the compute resources used for model pre-training. Maybe I missed it in the manuscript, but I couldn't find information about how long pre-training took for 33M cells and what GPU(s) were used.

An example of one of the recent models used in the zero-shot tutorials would be excellent!
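Until the authors share their numbers, pre-training time can be roughed out from measured throughput. The sketch below is a minimal estimate assuming a hypothetical throughput of 5 optimizer steps/second at batch size 64; both numbers should be replaced with values profiled on your own hardware.

```python
# Rough wall-clock estimate for one pre-training epoch over the corpus.
# The throughput figure is a hypothetical assumption -- measure it with a
# short profiling run on your own GPU before trusting the estimate.

def epoch_hours(n_cells, batch_size, steps_per_sec):
    """Hours per epoch = (cells / batch) steps, divided by throughput."""
    steps = n_cells / batch_size
    return steps / steps_per_sec / 3600

# 33M cells (the scGPT pre-training corpus size), assumed batch 64
# and assumed 5 steps/s:
print(f"{epoch_hours(33_000_000, 64, 5.0):.1f} h/epoch")
```

Multiply by the number of epochs (and divide by the number of GPUs, assuming near-linear data-parallel scaling) for a total-time ballpark.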