mzjb / DeepH-pack

Deep neural networks for density functional theory Hamiltonian.
GNU Lesser General Public License v3.0

A user warning during the inference step #38

Closed JTaozhang closed 1 year ago

JTaozhang commented 1 year ago

Hi there, when I am doing the inference part, all steps before the inference step finished normally, but the program gives me this warning (below). Can this warning be ignored or not? If not, how can we avoid (remove) it?

```
/share/home/zhangtao/anaconda3/envs/ZT-py39/lib/python3.9/site-packages/deeph/kernel.py:53: UserWarning: Unable to copy scripts
  warnings.warn("Unable to copy scripts")
```

Here I also list my inference settings:

```ini
[basic]
work_dir = /work/deeph-test/workdir/inference3
OLP_dir = /work/deeph-test/workdir/olp
interface = openmx
structure_file_name = POSCAR
trained_model_dir = /work/deeph-test/workdir/trained_model/2023-04-19_11-29-45
task = [1, 2, 3, 4, 5]
sparse_calc_config = /work/deeph-test/workdir/inference3/band.json
dense_calc = True
disable_cuda = False
device = cuda:0
huge_structure = True
gen_rc_idx = False
gen_rc_by_idx =
with_grad = False

[interpreter]
julia_interpreter = ***/software/julia-1.6.6/bin/julia

[graph]
radius = -1.0
create_from_DFT = True
```

The band.json settings are:

```json
{
    "calc_job": "band",
    "which_k": 0,
    "fermi_level": 0,
    "lowest_band": -10.3,
    "max_iter": 300,
    "num_band": 100,
    "k_data": [
        "46 0.3333333333333333 0.6666666666666667 0 0 0 0 K Γ",
        "28 0 0 0 0.5 0.5 0 Γ M",
        "54 0.5 0.5 0 0.6666666666666667 0.3333333333333333 0 M K'"
    ]
}
```

One more question,

The program seems to be stuck at 3.get_pred_Hamiltonian, because the output files in the working directory are no longer being updated. The last update time was 12:47, 26/04/2023; since then the files have not changed, but the program is still running now (17:47, 26/04/2023).


Much appreciation for your kind help.

Best regards,

mzjb commented 1 year ago

Hi.

For the first issue, DeepH-pack will copy the code to the $(work_dir) for backup purposes. The warning you are seeing is usually caused by the presence of files in the $(work_dir)/pred_ham_std/src directory, which prevents the copying process.
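The pattern behind this warning is roughly the following (a minimal sketch, assuming the backup uses `shutil.copytree`; the `backup_scripts` name and paths here are hypothetical, the real logic lives in `deeph/kernel.py`):

```python
import os
import shutil
import warnings

def backup_scripts(package_dir, work_dir):
    """Copy the package source into work_dir for backup purposes.

    shutil.copytree refuses to write into an existing destination, so a
    leftover src directory from a previous run makes the copy fail and
    only a warning is emitted instead of aborting the run.
    """
    destination = os.path.join(work_dir, 'src')
    try:
        shutil.copytree(package_dir, destination)
    except (shutil.Error, OSError):
        # FileExistsError (an OSError) is raised when destination exists
        warnings.warn("Unable to copy scripts")
```

Clearing the leftover $(work_dir)/pred_ham_std/src directory before rerunning should make the warning disappear.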

As for the second issue, I'm not entirely sure what's causing it, but you could try adding restore_blocks_py = False under the [basic] section in the inference ini configuration file to see if that helps.
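For clarity, the addition would sit in the config like this (a sketch based on the settings posted above; all other keys stay unchanged):

```ini
[basic]
# ... existing keys (work_dir, OLP_dir, task, ...) as before ...
restore_blocks_py = False
```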

JTaozhang commented 1 year ago

Hi,

So the first issue is because the files already existed, and it does not affect the operation of the program, right?

As for the second problem, I will test it soon. I think it may be caused by the large number of k-points set in the KPATH.
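If the k-point count is indeed the bottleneck, a quick test is to shrink the leading integer in each k_data entry of band.json (assuming, per the format shown above, that it is the number of k-points sampled along each segment), for example:

```json
"k_data": [
    "12 0.3333333333333333 0.6666666666666667 0 0 0 0 K Γ",
    "7 0 0 0 0.5 0.5 0 Γ M",
    "14 0.5 0.5 0 0.6666666666666667 0.3333333333333333 0 M K'"
]
```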

Many thanks for your help.

Best regards,
Tao

Sent from my iPhone


mzjb commented 1 year ago

I did not notice earlier that you were already performing the fifth step, the band structure calculation, which means your issue is not related to restore_blocks_py.

Regarding the number of atoms in your material, if it exceeds 100, you should use sparse diagonalization instead of dense diagonalization. To do so, you should set dense_calc to False.
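Concretely, in the [basic] section of the inference config posted above, that means (sparse_calc_config already points at band.json, which the sparse solver then reads its settings from):

```ini
[basic]
# switch from dense to sparse diagonalization for large structures
dense_calc = False
sparse_calc_config = /work/deeph-test/workdir/inference3/band.json
```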

I suggest that you avoid replying to GitHub issues via email, as this will display additional irrelevant content on the GitHub web page.