biocore / DEICODE

Robust Aitchison PCA from sparse count data

Maximum recursion depth exceeded #30

Closed · justinshaffer closed 5 years ago

justinshaffer commented 5 years ago

Hello,

I am receiving a plugin error from DEICODE when using it in QIIME 2:

"maximum recursion depth exceeded"

This comes from running the following command:

```
qiime deicode rpca \
  --i-table ~/danone/data/deblur_agp_danone_merged_150nt_sub3_sepp_only_noChl_noMit_fecal.qza \
  --p-min-feature-count 10 \
  --p-min-sample-count 500 \
  --o-biplot ~/danone/data/deblur_agp_danone_merged_150nt_sub3_sepp_only_noChl_noMit_fecal_rpca.qza \
  --o-distance-matrix ~/danone/data/deblur_agp_danone_merged_150nt_sub3_sepp_only_noChl_noMit_fecal_rpca_dist.qza
```

I'm using QIIME 2 2018.11.

There are about 5000 samples in the data.

Thanks in advance for any insight.

cameronmartino commented 5 years ago

@justinshaffer Thank you for reporting this! Could you post the full traceback so I can see where exactly the recursion limit is being violated? Thanks again!

justinshaffer commented 5 years ago

@cameronmartino Sure thing. Thanks for taking a look!

```
Traceback (most recent call last):
  File "/home/jpshaffer/software/miniconda3/envs/qiime2-2018.11/lib/python3.5/site-packages/q2cli/commands.py", line 274, in __call__
    results = action(**arguments)
  File "", line 2, in rpca
  File "/home/jpshaffer/software/miniconda3/envs/qiime2-2018.11/lib/python3.5/site-packages/qiime2/sdk/action.py", line 231, in bound_callable
    output_types, provenance)
  File "/home/jpshaffer/software/miniconda3/envs/qiime2-2018.11/lib/python3.5/site-packages/qiime2/sdk/action.py", line 362, in _callable_executor_
    output_views = self._callable(**view_args)
  File "/home/jpshaffer/software/miniconda3/envs/qiime2-2018.11/lib/python3.5/site-packages/deicode/q2/_method.py", line 20, in rpca
    table = table.to_dataframe().T.drop_duplicates()
  File "/home/jpshaffer/software/miniconda3/envs/qiime2-2018.11/lib/python3.5/site-packages/pandas/core/frame.py", line 3535, in drop_duplicates
    duplicated = self.duplicated(subset, keep=keep)
  File "/home/jpshaffer/software/miniconda3/envs/qiime2-2018.11/lib/python3.5/site-packages/pandas/core/frame.py", line 3584, in duplicated
    ids = get_group_index(labels, shape, sort=False, xnull=False)
  File "/home/jpshaffer/software/miniconda3/envs/qiime2-2018.11/lib/python3.5/site-packages/pandas/core/sorting.py", line 95, in get_group_index
    return loop(list(labels), list(shape))
  File "/home/jpshaffer/software/miniconda3/envs/qiime2-2018.11/lib/python3.5/site-packages/pandas/core/sorting.py", line 86, in loop
    return loop(labels, shape)
  ...(above three lines repeat many times)...
  File "/home/jpshaffer/software/miniconda3/envs/qiime2-2018.11/lib/python3.5/site-packages/pandas/core/sorting.py", line 86, in loop
    return loop(labels, shape)
  File "/home/jpshaffer/software/miniconda3/envs/qiime2-2018.11/lib/python3.5/site-packages/pandas/core/sorting.py", line 60, in loop
    stride = np.prod(shape[1:nlev], dtype='i8')
  File "/home/jpshaffer/software/miniconda3/envs/qiime2-2018.11/lib/python3.5/site-packages/numpy/core/fromnumeric.py", line 2498, in prod
    out=out, **kwargs)
RecursionError: maximum recursion depth exceeded

Plugin error from deicode:

  maximum recursion depth exceeded

See above for debug info.

QIIME is caching your current deployment for improved performance. This may
take a few moments and should only happen once per deployment.
Usage: qiime emperor biplot [OPTIONS]
Try "qiime emperor biplot --help" for help.
```
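For reference, the failing call in the traceback is DEICODE's `table.to_dataframe().T.drop_duplicates()`. Below is a minimal sketch of the same failure mode outside QIIME 2 (illustrative only; the data here are synthetic and not from the thread):

```python
# Minimal sketch (assumption: pandas < 0.23.2). On affected versions,
# drop_duplicates() on a frame with thousands of columns recursed inside
# pandas.core.sorting.get_group_index, with depth growing with column count.
import numpy as np
import pandas as pd

# Thousands of columns, comparable in scale to the transposed ~5000-sample
# feature table that deicode passes to drop_duplicates().
table = pd.DataFrame(np.random.randint(0, 10, size=(200, 5000)))

# On affected pandas versions this raised RecursionError ("maximum recursion
# depth exceeded"); on pandas >= 0.23.2 it completes normally.
deduped = table.drop_duplicates()
print(deduped.shape)
```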

cameronmartino commented 5 years ago

@justinshaffer This traces to a known bug in pandas (see this pull request in pandas). Try updating to pandas >= 0.23.2 and check whether that fixes the problem. Additionally, I have removed the code causing this error in pull request #29, so that should also solve the problem.
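As a quick check (a sketch, not part of the original exchange), you can print the pandas version inside the active QIIME 2 environment:

```python
# Print the pandas version in the active QIIME 2 environment; versions
# below 0.23.2 hit the recursion bug discussed above.
import pandas

print(pandas.__version__)
```

Since QIIME 2 releases pin their dependency versions, upgrading to a newer QIIME 2 build is often a cleaner way to pick up a fixed pandas than upgrading pandas in place inside a pinned environment.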

justinshaffer commented 5 years ago

@cameronmartino Thanks! The old pandas version was an artifact of my using an older QIIME 2 build (2018.11) without updated dependencies. After switching to the 2019.1 build, everything seems to work.

cameronmartino commented 5 years ago

@justinshaffer Great! I am going to close this issue then. If you see the issue again, feel free to reopen it.