Closed: laraib85 closed this issue 7 years ago
Can you show me your input Hi-C data (with the Linux head command)?
Sure, here is one of them (the columns are tab-delimited):
6027 6283 2.0955317252
16674 16713 1.28669960457
21722 22631 5.54419196153
535 3421 14.9422995601
11675 11884 5.03226812195
23163 24634 8.58252500088
20091 20461 4.78734712728
19762 23175 20.3328842376
1132 1407 2.02352411334
17299 23011 25.4211912422
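For reference, such a file can be read with a simple structured load like the sketch below (purely illustrative; the file name and the use of numpy are assumptions on my part, not how hitad itself reads the data):

```python
import numpy as np

# Each line holds two integer bin labels and a float interaction frequency,
# separated by tabs. "chr1.txt" is only an example file name.
contacts = np.loadtxt("chr1.txt", delimiter="\t",
                      dtype=[("bin1", int), ("bin2", int), ("IF", float)])
print(contacts[:3])  # first three records
```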
And did you get the same error when you followed the tutorial (http://xiaotaowang.github.io/TADLib/hitad.html) with the example data on your system?
And which versions of Python and SciPy are you using?
It runs fine on the example data but crashes while parsing my own input data. My SciPy version is 0.18.1.
That error seems to be related to this issue (http://bugs.python.org/issue11564), i.e. cPickle has trouble dumping large objects. However, I don't think it explains your case, at least not for low-resolution inputs. Would you mind emailing me your input data at 500 kb resolution (wangxiaotao868@163.com)?
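For the record, and not that it seems to apply here: on Python 3 the usual workaround for that pickling limit is to dump with pickle protocol 4, which handles objects larger than 4 GB. A purely illustrative sketch (the array is just a placeholder, not tied to TADLib internals):

```python
import pickle
import numpy as np

# Illustrative only: the 32-bit length fields used by older pickle protocols
# (the ones cPickle relies on) cannot serialize objects beyond 4 GB, which is
# the limitation discussed in http://bugs.python.org/issue11564.
big_object = np.zeros((1000, 1000))  # placeholder, not a real Hi-C matrix
with open("dump.pkl", "wb") as handle:
    pickle.dump(big_object, handle, protocol=4)
```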
Hey, the head of your "chr1.txt" is shown below:
120.54 125.66 2.0955317252
333.48 334.26 1.28669960457
434.44 452.62 5.54419196153
10.7 68.42 14.9422995601
233.5 237.68 5.03226812195
463.26 492.68 8.58252500088
401.82 409.22 4.78734712728
395.24 463.5 20.3328842376
22.64 28.14 2.02352411334
I think the float type of the first two columns explains the weird error: in the working files those columns are integers, but in this one they contain floats.
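A quick way to catch that kind of file (a minimal sketch; the three-column tab-delimited layout is taken from the data in this thread, and the file name is hypothetical):

```python
# Flag lines whose first two columns are not plain integers.
# "chr1.txt" is a hypothetical file name; adjust the path as needed.
with open("chr1.txt") as handle:
    for number, line in enumerate(handle, start=1):
        bin1, bin2, frequency = line.rstrip("\n").split("\t")
        if not (bin1.isdigit() and bin2.isdigit()):
            print("line %d: non-integer bin labels: %s %s" % (number, bin1, bin2))
```

If the first two columns were genomic start coordinates rather than bin labels, integer division by the bin size (position // resolution) would give the expected integer indices.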
Oh yeah. I mixed up the resolution of some files and ended up doing that! I just tried the tool on the 5k data to cross-check. Works well. Thanks!
Hi,
I tried running the tool on processed Hi-C data from the Rao et al. paper, normalized with the KR method. However, each time I get the following error:
I tried multiple resolutions as well, ranging from 5 kb to 500 kb, but to no avail.