LouisEchard opened this issue 6 years ago
The data is too big and too disparate to load all at once. Write a loading function that cleans up its intermediate variables as it goes, so it only keeps the right amount of the diverse data in memory (a rough sketch is included after the snippet below).
Wait for new RAM
Use multithreading when the new RAM arrives:

```python
from multiprocessing.dummy import Pool as ThreadPool

pool = ThreadPool(4)                       # thread-backed pool with 4 workers
results = pool.map(my_function, my_array)  # apply my_function to each element in parallel
```
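As a rough illustration of the loader described above, here is a minimal sketch, assuming the data is split across several CSV files and that pandas is available; `load_trimmed`, `file_paths`, `needed_cols`, and the chunk size are placeholder names and values, not something from the original issue.

```python
import gc
from multiprocessing.dummy import Pool as ThreadPool

import pandas as pd

# Placeholder inputs: real paths and columns would come from the actual dataset.
file_paths = ["part_01.csv", "part_02.csv", "part_03.csv"]
needed_cols = ["id", "value"]

def load_trimmed(path, chunksize=100_000):
    """Read one file in manageable chunks, keeping only the needed columns
    and downcasting floats, so the raw file is never fully held in memory."""
    pieces = []
    for chunk in pd.read_csv(path, usecols=needed_cols, chunksize=chunksize):
        # Shrink float columns before keeping the chunk around.
        for col in chunk.select_dtypes("float64").columns:
            chunk[col] = pd.to_numeric(chunk[col], downcast="float")
        pieces.append(chunk)
        del chunk          # drop the local reference between iterations
    gc.collect()           # encourage Python to release the freed chunks
    return pd.concat(pieces, ignore_index=True)

# Same thread-pool pattern as above, one worker per file.
pool = ThreadPool(4)
frames = pool.map(load_trimmed, file_paths)
pool.close()
pool.join()

data = pd.concat(frames, ignore_index=True)
```

Since `multiprocessing.dummy` uses threads rather than processes, this mostly helps when reading the files is I/O-bound; if the parsing turns out to be CPU-bound, swapping in `multiprocessing.Pool` is the usual alternative.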