I have used Cardinal on many data sets without ever hitting an error, and it has worked great every time. From normalising the spectra to unsupervised and supervised clustering, Cardinal has it all.
Recently, however, I was processing a data set whose ibd file is 157 GB in size.
These are the two cases I tried:
Case 1: I read the data in the 70-200 m/z range at 10 ppm resolution with attach.only = TRUE, then normalise it with the normalize function. This works fine. The commands are:
data <- readImzML(name = "brain_data", mass.range = c(70, 200), as = "MSImageSet", resolution = 10, units = "ppm", attach.only = TRUE)
norm_data <- normalize(data)
Case 2: I read the data in the 70-500 m/z range at 5 ppm resolution with attach.only = TRUE, then normalise it the same way. The commands are:
data <- readImzML(name = "brain_data", mass.range = c(70, 500), as = "MSImageSet", resolution = 5, units = "ppm", attach.only = TRUE)
norm_data <- normalize(data)
In the second case, Cardinal throws a "Could not allocate memory" error. I also tried normalising only a subset of pixels at a time and then combining the results, but that didn't work either. In this case the object as.matrix(iData(data)) would be around 250 GB, while my machine has 200 GB of RAM.
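For context, here is my own back-of-the-envelope estimate of why the dense matrix gets so large (this is not Cardinal's internal logic, just arithmetic). It assumes the ppm bins are log-spaced, intensities are 8-byte doubles, and a hypothetical pixel count of about 80,000, which would put case 2 near the ~250 GB I observed:

```python
import math

def n_bins(m_min, m_max, ppm):
    """Approximate number of log-spaced m/z bins covering [m_min, m_max]
    at a given ppm bin width (assumption: bins scale proportionally to m/z)."""
    return math.ceil(math.log(m_max / m_min) / math.log(1 + ppm * 1e-6))

def dense_matrix_gb(m_min, m_max, ppm, n_pixels, bytes_per_value=8):
    """Size in GB of the full dense intensity matrix (bins x pixels)."""
    return n_bins(m_min, m_max, ppm) * n_pixels * bytes_per_value / 1e9

pixels = 80_000  # hypothetical pixel count, chosen for illustration

print(dense_matrix_gb(70, 200, 10, pixels))  # case 1: ~67 GB
print(dense_matrix_gb(70, 500, 5, pixels))   # case 2: ~252 GB
```

Under these assumptions, case 1 needs roughly 67 GB while case 2 needs roughly 252 GB, which would explain why the first fits in 200 GB of RAM and the second does not.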
Please suggest a way to resolve this. Any help will be appreciated. Thanks again for this great package.