sccn / clean_rawdata

Cleaning Raw EEG data
GNU General Public License v3.0

memory usage #9

Closed: rgougelet closed this 2 years ago

rgougelet commented 4 years ago

asr_process() splits the data into chunks according to available memory; however, the moving_average() function within it ends up operating on the covariance matrix, thereby multiplying the size of each chunk by the number of channels. It also appears to then double the size of the covariance matrix as well. So, for example, if there are 32 channels, the chunks are 64x too large. This seems to lead to consistent out-of-memory errors.

Simply dividing the chunk size by 2*nchan yields chunks that are too small (7 samples, in my case). I attempted to feed each channel-by-channel vector individually into the moving_average() function, but this runs too slowly. Perhaps there is a simpler solution I'm not seeing.
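To make the arithmetic in the report concrete, here is a minimal sketch of the claimed blow-up. The function name and structure are illustrative assumptions, not taken from asr_process() itself:

```python
def chunk_inflation_factor(n_channels: int) -> int:
    """Factor by which a chunk's memory footprint grows inside moving_average(),
    per the report above: operating on the covariance multiplies the chunk
    size by the channel count, and the covariance then appears to be held
    twice, doubling it again."""
    # Illustrative model of the reported behavior, not the plugin's code.
    return 2 * n_channels

# With 32 channels, chunks end up 64x larger than the memory estimate assumed.
print(chunk_inflation_factor(32))  # 64
```

This matches the example in the report: dividing the chunk size by this factor restores the intended footprint, but (as noted above) can leave the chunks impractically small.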

arnodelorme commented 4 years ago

@chkothe what do you think about changing the code according to the recommendation above?

arnodelorme commented 2 years ago

Closing this now that we have set a fixed amount of RAM (64 MB). Let us know if this continues to be an issue.
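For readers wondering what a fixed 64 MB budget implies, here is a rough back-of-the-envelope sketch. The per-sample cost model (an n_channels x n_channels double-precision covariance block, held twice per the doubling noted in the original report) and the function name are assumptions for illustration, not the plugin's actual accounting:

```python
BYTES_PER_DOUBLE = 8

def samples_per_chunk(ram_bytes: int, n_channels: int) -> int:
    """Rough estimate of how many samples fit in a fixed RAM budget,
    assuming each sample costs two n_channels x n_channels double matrices.
    Illustrative only; not taken from asr_process()."""
    per_sample = 2 * n_channels * n_channels * BYTES_PER_DOUBLE
    return ram_bytes // per_sample

# Under these assumptions, a 64 MB budget with 32 channels allows:
print(samples_per_chunk(64 * 2**20, 32))  # 4096 samples per chunk
```

Under this model the chunk size stays well above the 7-sample chunks mentioned earlier in the thread.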