Closed Winbuntu closed 3 years ago
I was running MARS on large datasets without memory issues, but this may depend on the GPU you use. If one of your annotated datasets is very large, you can divide the data into batches (currently each task is one batch -- line 259 in mars.py).
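To illustrate the idea, here is a minimal sketch of splitting a cell-by-gene matrix into fixed-size batches before feeding it to the model. This is not MARS's actual code path -- the function name `batch_indices`, the batch size, and the placeholder matrix are all hypothetical, just showing the slicing pattern you would apply around the spot mentioned above:

```python
import numpy as np

def batch_indices(n_cells, batch_size):
    """Yield index arrays covering all cells in fixed-size batches.

    The last batch may be smaller if n_cells is not divisible by batch_size.
    """
    for start in range(0, n_cells, batch_size):
        yield np.arange(start, min(start + batch_size, n_cells))

# Hypothetical example: a 20k-cell expression matrix processed 4k cells at a time,
# instead of pushing all 20k cells to the GPU at once.
X = np.zeros((20000, 50))  # placeholder for an expression matrix (cells x genes)
batches = [X[idx] for idx in batch_indices(X.shape[0], 4000)]
```

Each batch can then be moved to the GPU, processed, and freed before the next one, which keeps peak GPU memory proportional to the batch size rather than the full dataset.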
We switched to a Tesla V100 with 16GB of memory and the out-of-memory error is gone :-) It would be of great help if you could implement this data generator in MARS. It would certainly be helpful for large datasets.
Hi, I am running MARS on a dataset containing approximately 20k cells. I was running on a 1080Ti, which has 11GB of memory, but that is not enough for MARS to process this dataset, and I got a CUDA out-of-memory error.
Is MARS able to use memory across multiple GPUs, so we can scale beyond 20k cells? Or is there some other way to run MARS on large datasets?