lmrodriguezr / nonpareil

Estimate metagenomic coverage and sequence diversity
http://enve-omics.ce.gatech.edu/nonpareil/

error in Nonpareil.curve #32

Closed: IM-han closed this issue 6 years ago

IM-han commented 6 years ago

Dear author, I have a problem with Nonpareil.curve(). When I run Nonpareil.curve("134.npo"), it shows the error "Error in if (x$y.p50[twenty.pc] == 0)".
The file "134.npo" is as follows; I cannot understand what is wrong with it.

# @impl: Nonpareil
# @ksize: 842085680
# @version: 3.20
# @maxL: 292
# @L: 244.584
# @R: 27811543
# @overlap: 50.00
# @divide: 0.70
0   0.00000 0.00000 0.00000 0.00000 0.00000
1   0.00000 0.00000 0.00000 0.00000 0.00000
2   0.00000 0.00000 0.00000 0.00000 0.00000
3   0.00000 0.00000 0.00000 0.00000 0.00000
4   0.00000 0.00000 0.00000 0.00000 0.00000
6   0.00000 0.00000 0.00000 0.00000 0.00000
9   0.00000 0.00000 0.00000 0.00000 0.00000
12  0.00000 0.00000 0.00000 0.00000 0.00000
18  0.00000 0.00000 0.00000 0.00000 0.00000
25  0.00000 0.00000 0.00000 0.00000 0.00000
36  0.00000 0.00000 0.00000 0.00000 0.00000
52  0.00000 0.00000 0.00000 0.00000 0.00000
74  0.00000 0.00000 0.00000 0.00000 0.00000
105 0.00000 0.00000 0.00000 0.00000 0.00000
151 0.00000 0.00000 0.00000 0.00000 0.00000
215 0.00000 0.00000 0.00000 0.00000 0.00000
307 0.00000 0.00000 0.00000 0.00000 0.00000
439 0.00000 0.00000 0.00000 0.00000 0.00000
627 0.00000 0.00000 0.00000 0.00000 0.00000
896 0.00000 0.00000 0.00000 0.00000 0.00000
1279    0.00000 0.00000 0.00000 0.00000 0.00000
1828    0.00000 0.00000 0.00000 0.00000 0.00000
2611    0.00000 0.00000 0.00000 0.00000 0.00000
3730    0.00000 0.00000 0.00000 0.00000 0.00000
5328    0.00000 0.00000 0.00000 0.00000 0.00000
7612    0.00000 0.00000 0.00000 0.00000 0.00000
10874   0.00000 0.00000 0.00000 0.00000 0.00000
15534   0.00000 0.00000 0.00000 0.00000 0.00000
22191   0.00000 0.00000 0.00000 0.00000 0.00000
31702   0.00049 0.01562 0.00000 0.00000 0.00000
45289   0.00179 0.03641 0.00000 0.00000 0.00000
64698   0.00049 0.01562 0.00000 0.00000 0.00000
92426   0.00068 0.01320 0.00000 0.00000 0.00000
132037  0.00371 0.02864 0.00000 0.00000 0.00000
188624  0.00358 0.02764 0.00000 0.00000 0.00000
269463  0.00342 0.01987 0.00000 0.00000 0.00000
384948  0.00835 0.02641 0.00000 0.00000 0.00000
549925  0.00910 0.02306 0.00000 0.00000 0.00000
785607  0.01305 0.02149 0.00000 0.00000 0.03030
1122296 0.01927 0.02201 0.00000 0.02083 0.02941
1603280 0.02646 0.02128 0.01538 0.02000 0.03774
2290400 0.03576 0.02038 0.02273 0.03297 0.04878
3272000 0.04735 0.01983 0.03279 0.04651 0.06087
4674286 0.06177 0.01718 0.04938 0.06135 0.07317
6677551 0.08123 0.01694 0.06944 0.08036 0.09266
9539359 0.10346 0.01524 0.09270 0.10826 0.11401
13627656    0.12778 0.01304 0.11895 0.12815 0.13608
19468080    0.15561 0.01099 0.14820 0.15549 0.16228
27811543    0.18379 0.00675 0.17907 0.18109 0.18813
lmrodriguezr commented 6 years ago

Hello @IM-han

It appears your problem is insufficient data, but it is a little unexpected given that you seem to have 28 million reads with ~300bp each. Could you please post the command you used?

Try using:

Nonpareil.curve("134.npo", enforce.consistency=FALSE)

Or, if you have an older R Nonpareil package, use:

Nonpareil.curve("134.npo", data.consistency=FALSE)
IM-han commented 6 years ago

I did as you said, but it still shows an error. The command was Nonpareil.curve("134/134.npo", enforce.consistency=FALSE), and the error was "Error in if (x$y.p50[twenty.pc] == 0) { : argument is of length zero". The command Nonpareil.curve("134/134.npo", data.consistency=FALSE) gives the same error, "Error in if (x$y.p50[twenty.pc] == 0) { : argument is of length zero". You said my data was insufficient; does that mean my reads are too large?
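
For reference, this R error is raised whenever the condition inside if() ends up with length zero, for example when an index vector matched nothing. A minimal standalone illustration with toy values (not taken from the package source):

p50 <- c(0, 0, 0)        # toy stand-in for x$y.p50
twenty.pc <- integer(0)  # an index that matched nothing
if (p50[twenty.pc] == 0) print("never reached")
# Error in if (p50[twenty.pc] == 0) ... : argument is of length zero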

lmrodriguezr commented 6 years ago

@IM-han it appears you executed a version of nonpareil that was only available briefly and included a bug (see also #18). Please remove the line # @ksize: 842085680 from your 134.npo file and run it again.

For future runs, please update your nonpareil version.
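
For reference, a minimal sketch of that fix in R using only base functions; the output file name 134.fixed.npo is just an example so the original .npo stays intact:

npo <- readLines("134/134.npo")
# Drop the offending header line and re-run the curve on the cleaned file.
writeLines(npo[!grepl("^# @ksize:", npo)], "134/134.fixed.npo")
Nonpareil.curve("134/134.fixed.npo")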

IM-han commented 6 years ago

Thank you very much, I get it now! The problem was exactly what you said.