Open mahmoudasmar opened 4 years ago
I suggest using the new version of ZMCintegral and setting `num_chunks_in_one_dimension` to a small value at the beginning. Try it and see if it works.
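To see why starting small matters: ZMCintegral splits the integration domain into `num_chunks_in_one_dimension` pieces along each axis, so the total number of chunks grows as k**d with the number of integration variables. A quick sketch (the function name here is illustrative, not part of the ZMCintegral API):

```python
def total_chunks(k, dim):
    """Total domain chunks when each of `dim` axes is split into k pieces."""
    return k ** dim

# For a 6-dimensional integrand, the chunk count explodes quickly:
for k in (2, 4, 8):
    print(k, total_chunks(k, dim=6))   # 64, 4096, 262144
```

Each chunk costs device memory and a kernel launch, so a small k is a sensible first test before scaling up.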
Would you please upload your code here so we can see exactly what happens? If it is simply a memory problem, then it is likely an issue with the Ray package.
I have a very long expression that I want to integrate, and I am getting the following error:
```
LLVM ERROR: out of memory
Traceback (most recent call last):
  File "./Program.py", line 4382, in <module>
    result = MC.evaluate()
  File "/home/.../ZMCintegral.py", line 82, in evaluate
    MCresult = self.importance_sampling_iteration(self.initial_domain, 0)
  File "/home/.../ZMCintegral.py", line 88, in importance_sampling_iteration
    MCresult_chunks, large_std_chunk_id, MCresult_std_chunks = self.MCevaluate(domain)
  File "/home/.../ZMCIntegral.py", line 140, in MCevaluate
    MCresult.append(np.load(os.getcwd()+'/multi_temp/result'+str(i_batch)+'.npy'))
  File "/home/.../miniconda3/envs/myenv/lib/python3.7/site-packages/numpy/lib/npyio.py", line 422, in load
    fid = open(os_fspath(file), "rb")
FileNotFoundError: [Errno 2] No such file or directory: '/home/.../multi_temp/result0.npy'
```
Any suggestions on how to circumvent this problem?