Open Nuturetree opened 3 weeks ago
Thanks for your reply. Following your suggestion, we changed the chromosome names in the TAD file and the cool file, but the error is not resolved.
Hi @Nuturetree ,
Thank you for your question.
We have fixed the issue and updated diffDomain. Feel free to try running it with the .cool file.
As for my previous response, that was my oversight; thank you for your understanding. Please note that you do not need to make any modifications to the first column of the TAD.bed file.
Thank you for your support of diffDomain! If you have any further questions or need assistance, please don’t hesitate to reach out.
Thank you for your reply @liuyc27. I'd like to confirm whether the normalization step for cool files (e.g., KR) is still retained after the software update. The older version of diffDomain used to generate a cool_KR file, but this file seems to be absent in the updated version. Looking forward to your reply; I will continue to promote the value of diffDomain.
Hi @Nuturetree, the latest diffDomain still retains the normalization steps for cool files (e.g., KR). The generated files are saved in the same path as the input cool files, with the new file name being the original file name plus the suffix `{reso/1000}k_{hicnorm}.cool`. For example, if the input file is `A_KR.cool`, the generated file will be named `A_KR_20k_KR.cool`. Please note that if the input file is `A_KR.cool` and `A_KR_20k_KR.cool` already exists in that path, diffDomain will directly use `A_KR_20k_KR.cool` as input without generating a new one.
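The naming convention described above can be sketched as a small helper. Note that `normalized_cool_name` is a hypothetical illustration, not part of the diffDomain API; it only mirrors the suffix rule stated in the reply:

```python
import os

def normalized_cool_name(path, reso, hicnorm):
    """Hypothetical helper: build the expected name of the normalized cool file,
    i.e. the original name plus the suffix _{reso/1000}k_{hicnorm}.cool."""
    stem, ext = os.path.splitext(path)
    return f"{stem}_{reso // 1000}k_{hicnorm}{ext}"

# Input A_KR.cool at 20 kb resolution with KR normalization:
print(normalized_cool_name("A_KR.cool", 20000, "KR"))  # A_KR_20k_KR.cool
```

If a file with this name already exists next to the input, diffDomain reuses it instead of regenerating it, so deleting a stale normalized file forces a fresh normalization run.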
Hi authors, diffDomain is a good tool. When I run it, I get an error that I cannot resolve. Please help me, thanks.
The input script:

```shell
cool1=A_KR.cool
cool2=B_KR.cool
tad=TAD.bed
result=result.tsv
python ~/miniconda3/envs/diffdomain/lib/python3.7/site-packages/diffdomain_py3/diffdomains.py dvsd multiple ${cool1} ${cool2} ${tad} --reso 20000 --ofile ${result} --ncore 1
```
The input file test_dataset.zip
The err:
```
multiprocessing.pool.RemoteTraceback:
"""
Traceback (most recent call last):
  File "/public/home/yqzhang/miniconda3/envs/diffdomain/lib/python3.7/multiprocessing/pool.py", line 121, in worker
    result = (True, func(*args, **kwds))
  File "/public/home/yqzhang/miniconda3/envs/diffdomain/lib/python3.7/site-packages/diffdomain_py3/diffdomains.py", line 59, in comp2domins_by_twtest_parallel
    fhic0=opts[''], fhic1=opts[''], min_nbin=int(opts['--min_nbin']), f=opts['--f'])
  File "/public/home/yqzhang/miniconda3/envs/diffdomain/lib/python3.7/site-packages/diffdomain_py3/utils.py", line 379, in comp2domins_by_twtest
    Diffmatnorm = normDiffbyMeanSD(D=Diffmat)
  File "/public/home/yqzhang/miniconda3/envs/diffdomain/lib/python3.7/site-packages/diffdomain_py3/utils.py", line 260, in normDiffbyMeanSD
    b[k] = np.max(val1)
  File "<__array_function__ internals>", line 6, in amax
  File "/public/home/yqzhang/miniconda3/envs/diffdomain/lib/python3.7/site-packages/numpy/core/fromnumeric.py", line 2755, in amax
    keepdims=keepdims, initial=initial, where=where)
  File "/public/home/yqzhang/miniconda3/envs/diffdomain/lib/python3.7/site-packages/numpy/core/fromnumeric.py", line 86, in _wrapreduction
    return ufunc.reduce(obj, axis, dtype, out, **passkwargs)
ValueError: zero-size array to reduction operation maximum which has no identity
"""

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/public/home/yqzhang/miniconda3/envs/diffdomain/lib/python3.7/site-packages/diffdomain_py3/diffdomains.py", line 76, in <module>
    result.append(i.get())
  File "/public/home/yqzhang/miniconda3/envs/diffdomain/lib/python3.7/multiprocessing/pool.py", line 657, in get
    raise self._value
ValueError: zero-size array to reduction operation maximum which has no identity
```
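The final `ValueError` is raised by NumPy itself: `np.max` on an empty array has no identity value to return. In this traceback it fires inside `normDiffbyMeanSD` when the matrix slice for a TAD is empty, which is consistent with the chromosome-name mismatch discussed above (that cause is an inference from this thread, not a confirmed diagnosis). A minimal reproduction with a defensive guard:

```python
import numpy as np

# An empty slice, as when a TAD region matches no bins in the cool file.
val1 = np.array([])

try:
    np.max(val1)
except ValueError as e:
    print(e)  # zero-size array to reduction operation maximum which has no identity

# Guarding the reduction avoids the crash (the real fix is making the
# chromosome names in TAD.bed match those in the .cool file).
if val1.size > 0:
    print(np.max(val1))
```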