Letianwu / ZMCintegral

An easy way to use multiple GPUs to calculate multi-dimensional integrals
https://arxiv.org/pdf/1902.07916v2.pdf
Apache License 2.0

Error When the function is too long #19

Open mahmoudasmar opened 4 years ago

mahmoudasmar commented 4 years ago

I have a very long expression that I want to integrate, and I am getting the following error:

LLVM ERROR: out of memory
Traceback (most recent call last):
  File "./Program.py", line 4382, in <module>
    result = MC.evaluate()
  File "/home/.../ZMCintegral.py", line 82, in evaluate
    MCresult = self.importance_sampling_iteration(self.initial_domain, 0)
  File "/home/.../ZMCintegral.py", line 88, in importance_sampling_iteration
    MCresult_chunks, large_std_chunk_id, MCresult_std_chunks = self.MCevaluate(domain)
  File "/home/.../ZMCIntegral.py", line 140, in MCevaluate
    MCresult.append(np.load(os.getcwd()+'/multi_temp/result'+str(i_batch)+'.npy'))
  File "/home/.../miniconda3/envs/myenv/lib/python3.7/site-packages/numpy/lib/npyio.py", line 422, in load
    fid = open(os_fspath(file), "rb")
FileNotFoundError: [Errno 2] No such file or directory: '/home/.../multi_temp/result0.npy'

Any suggestions on how to circumvent this problem?

Letianwu commented 4 years ago

I suggest using the new version of ZMCintegral and setting num_chunks_in_one_dimension to a small value in the beginning. Try that and see if it works.
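
For illustration, a minimal sketch of that suggestion, assuming the Ray-based API described in the ZMCintegral README: the module name MCintegral_normal, the head-node address, and the other keyword arguments are placeholders that may differ in your installed version.

import math
from numba import cuda
from ZMCintegral import MCintegral_normal  # Ray-based release; module layout assumed from the README

# Toy integrand standing in for the long expression from the issue.
# The README's examples compile the integrand as a Numba CUDA device function.
@cuda.jit(device=True)
def my_func(x):
    return math.sin(x[0] + x[1] + x[2] + x[3])

MC = MCintegral_normal.MCintegral_normal(
    my_func=my_func,
    domain=[[0, 1], [0, 10], [0, 10], [0, 1]],
    head_node_address="127.0.0.1:6379",   # placeholder address of the Ray head node
    num_trials=5,
    depth=1,
    sigma_multiplier=5,
    num_chunks_in_one_dimension=4,        # keep this small at first, per the suggestion above
)

result = MC.evaluate()
print(result)

If I understand the parameter correctly, num_chunks_in_one_dimension sets how many chunks each dimension of the domain is split into (here 4^4 chunks in total), so starting small and increasing it only if the result is too coarse keeps the initial GPU workload manageable.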

Juenjie commented 4 years ago

Would you please upload your code here so we can see exactly what happens? If it is simply a memory problem, then it is likely an issue with the Ray package.