tcaceresm closed this issue 7 months ago
@tcaceresm Great question. I'm not exactly sure where the problem is coming from, but an educated guess is that it originates under the hood in ChemmineOB or Open Babel, something like a memory leak.
From the R perspective, I have an alternative solution; see this blog post on using callr to create more robust wrappers for similar batch processing tasks: https://nanx.me/blog/post/disposable-computing-with-callr/.
In brief, split the fingerprint calculations into much smaller batches, and use callr to launch a new, separate R process for each batch. This can potentially eliminate runtime issues originating in low-level code. A rough sketch of the pattern follows.
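Here is a minimal sketch of the batching pattern, assuming the molecules are available as a character vector of SMILES strings (`smiles_all`); adjust the input type and batch size to your data:

```r
library(callr)

batch_size <- 500
batches <- split(smiles_all, ceiling(seq_along(smiles_all) / batch_size))

fps <- lapply(batches, function(batch) {
  # Each batch runs in a fresh, disposable R process, so any memory held
  # by low-level code (for example, Open Babel) is released when that
  # process exits.
  callr::r(
    function(x) Rcpi::extractDrugOBFP4(x, type = "smile"),
    args = list(batch)
  )
})

# Combine the per-batch fingerprint matrices back into a single matrix.
fp_matrix <- do.call(rbind, fps)
```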
Yup, the solution was to split up the fingerprint calculations! Thanks, I'll close this issue.
Hi there, I'm trying to calculate fingerprints for ~50,000 molecules. However, I've noticed that RAM usage only ever increases, to the point of being completely exhausted. I don't understand how this is possible, given that the matrix created by the function `extractDrugOBFP4` to store the fingerprints is preallocated with the correct dimensions. On inspection, the size of the matrix stays constant (~1.6 GB) across iterations, yet the RAM used by the R session keeps growing as the loop continues. Furthermore, the processing is sequential, molecule by molecule, which should not increase RAM usage.

This is the code of the function, and the section of the function that increases RAM usage. I know this is the problematic section because when I replace the fingerprint returned by ChemmineOB with an arbitrary vector of length 512, the process stops consuming more RAM. I'm no R expert, so any help would be appreciated. Thanks, and sorry about my English.
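Simplified, the loop pattern looks something like the sketch below (not the actual Rcpi source; `calc_fp()` is a hypothetical stand-in for the per-molecule ChemmineOB fingerprint call, and here it just returns a dummy vector of length 512, which is exactly the substitution that keeps memory flat in my test):

```r
calc_fp <- function(smi) integer(512)  # hypothetical placeholder for the ChemmineOB call

smiles_all <- character(50000)         # assumed input: one SMILES string per molecule
n <- length(smiles_all)
fp <- matrix(NA_integer_, nrow = n, ncol = 512)  # preallocated, fixed-size result matrix

for (i in seq_len(n)) {
  # fp itself never grows, yet with the real ChemmineOB call the R session's
  # memory climbs here, iteration after iteration.
  fp[i, ] <- calc_fp(smiles_all[i])
}
```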