mzjb / DeepH-pack

Deep neural networks for density functional theory Hamiltonian.
GNU Lesser General Public License v3.0

ArrayMemoryError: Unable to allocate 6.03 TiB for an array with shape (910224, 910224) and data type float64 #77

Open sgyang345 opened 4 months ago

sgyang345 commented 4 months ago

The matrix dimension is very large, so the code attempts to allocate an enormous dense array and runs out of memory. It would be better to process the data in chunks: the developers could divide the matrix into smaller blocks and process them one at a time, which avoids allocating a huge amount of memory at once.
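
For illustration, here is a minimal sketch of the chunked approach (not DeepH-pack's actual code; the toy matrix and `block_size` are made up for the example), densifying only one row block of a SciPy CSR matrix at a time instead of the whole matrix:

```python
import numpy as np
from scipy.sparse import random as sparse_random

# toy stand-in for the CSR matrix parsed from SR.csr (hypothetical size/density)
mat = sparse_random(10000, 10000, density=1e-5, format='csr', dtype=np.float64)

block_size = 1000  # rows per chunk; tune to the available memory
for start in range(0, mat.shape[0], block_size):
    stop = min(start + block_size, mat.shape[0])
    # convert only one row block to dense instead of the whole matrix
    block = mat[start:stop, :].toarray()
    # ... process `block` here; it can be freed before the next iteration ...
```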


deeph-inference --config inference.ini

User config name: ['inference.ini']



~~~~~~~ 2.get_local_coordinate

~~~~~~~ 3.get_pred_Hamiltonian

~~~~~~~ 4.rotate_back

~~~~~~~ 5.sparse_calc, command: xxx/sparse_calc.jl --input_dir xxx/get_S_process --config

####### Begin 1.parse_Overlap
Output subdirectories: OUT.ABACUS
Traceback (most recent call last):
  File "xxx/deeph-inference", line 8, in <module>
    sys.exit(main())
  File "xxx/deeph/scripts/inference.py", line 97, in main
    abacus_parse(OLP_dir, work_dir, data_name=f'OUT.{abacus_suffix}', only_S=True)
  File "xxx/deeph/preprocess/abacus_get_data.py", line 247, in abacus_parse
    overlap_dict, tmp = parse_matrix(os.path.join(input_path, "SR.csr"), 1)
  File "xxx/deeph/preprocess/abacus_get_data.py", line 215, in parse_matrix
    hamiltonian_cur = csr_matrix((np.array(line2).astype(float), np.array(line3).astype(int),
  File "xxx/site-packages/scipy/sparse/_compressed.py", line 1051, in toarray
    out = self._process_toarray_args(order, out)
  File "xxx/site-packages/scipy/sparse/_base.py", line 1298, in _process_toarray_args
    return np.zeros(self.shape, dtype=self.dtype, order=order)
numpy.core._exceptions._ArrayMemoryError: Unable to allocate 6.03 TiB for an array with shape (910224, 910224) and data type float64
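
For reference, the reported size is exactly what a dense float64 matrix of this shape requires, so the failure is expected whenever the full overlap matrix is converted to dense:

```python
# sanity check of the allocation reported in the traceback above
n = 910224
bytes_needed = n * n * 8            # float64 = 8 bytes per element
print(bytes_needed / 2**40, "TiB")  # ~6.03 TiB, matching the error message
```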