nickjcroucher / gubbins

Rapid phylogenetic analysis of large samples of recombinant bacterial whole genome sequences using Gubbins
http://nickjcroucher.github.io/gubbins/
GNU General Public License v2.0

Error after 5 iterations #365

Closed: mudymudy closed this issue 1 year ago

mudymudy commented 1 year ago

Hi!

After the 5th iteration finished, I got the error below. Any help or guidance would be appreciated, because I don't know how to interpret the error message. I used this command: run_gubbins.py -p gubbins --threads 30 --filter-percentage 67 clean.full.aln

Thank you in advance!

Running Gubbins to detect recombinations...
gubbins -r -v clean.full.aln.gaps.vcf -a 100 -b 10000 -f /home/usr/data/alignment/snippy_3x/tmpnsd87v8l/clean.full.aln -t clean.full.iteration_5.tre -m 3 /home/usr/data/alignment/snippy_3x/tmpnsd87v8l/clean.full.iteration_5.internal.joint.aln
...done. Run time: 372509.90 s

Checking for convergence...
Traceback (most recent call last):
  File "/home/usr/anaconda3/bin/run_gubbins.py", line 33, in <module>
    sys.exit(load_entry_point('gubbins==3.0.0', 'console_scripts', 'run_gubbins.py')())
  File "/home/usr/anaconda3/lib/python3.9/site-packages/gubbins/run_gubbins.py", line 133, in main
    gubbins.common.parse_and_run(parser.parse_args(), parser.description)
  File "/home/usr/anaconda3/lib/python3.9/site-packages/gubbins/common.py", line 349, in parse_and_run
    remove_internal_node_labels_from_tree(current_tree_name_with_internal_nodes, current_tree_name)
  File "/home/usr/anaconda3/lib/python3.9/site-packages/gubbins/common.py", line 822, in remove_internal_node_labels_from_tree
    tree = dendropy.Tree.get_from_path(input_filename, 'newick', preserve_underscores=True)
  File "/home/usr/anaconda3/lib/python3.9/site-packages/dendropy/datamodel/basemodel.py", line 217, in get_from_path
    return cls._parse_and_create_from_stream(stream=fsrc,
  File "/home/usr/anaconda3/lib/python3.9/site-packages/dendropy/datamodel/treemodel.py", line 2659, in _parse_and_create_from_stream
    tree_lists = reader.read_tree_lists(
  File "/home/usr/anaconda3/lib/python3.9/site-packages/dendropy/dataio/ioservice.py", line 370, in read_tree_lists
    product = self._read(stream=stream,
  File "/home/usr/anaconda3/lib/python3.9/site-packages/dendropy/dataio/newickreader.py", line 324, in _read
    for tree in self.tree_iter(stream=stream,
  File "/home/usr/anaconda3/lib/python3.9/site-packages/dendropy/dataio/newickreader.py", line 301, in tree_iter
    tree = self._parse_tree_statement(
  File "/home/usr/anaconda3/lib/python3.9/site-packages/dendropy/dataio/newickreader.py", line 363, in _parse_tree_statement
    current_token = nexus_tokenizer.require_next_token()
  File "/home/usr/anaconda3/lib/python3.9/site-packages/dendropy/dataio/tokenizer.py", line 160, in require_next_token
    raise exc
dendropy.dataio.tokenizer.Tokenizer.UnexpectedEndOfStreamError: Error parsing data source 'clean.full.iteration_5.tre.internal' on line 1 at column 0: Unexpected end of stream
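The last frame is dendropy hitting the end of clean.full.iteration_5.tre.internal at line 1, column 0, which suggests the tree file was never actually written. A minimal sketch for checking that file directly, assuming it sits in the working directory and reusing the same dendropy call shown in the traceback:

import os
import dendropy

tree_path = "clean.full.iteration_5.tre.internal"  # file named in the error message

size = os.path.getsize(tree_path)
print(f"{tree_path}: {size} bytes")

if size == 0:
    print("File is empty - the tree was never written (e.g. the disk filled up mid-run).")
else:
    # Same call Gubbins makes internally; it raises UnexpectedEndOfStreamError
    # if the Newick string is truncated.
    tree = dendropy.Tree.get_from_path(tree_path, "newick", preserve_underscores=True)
    print(f"Parsed OK: {len(tree.leaf_nodes())} taxa")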

nickjcroucher commented 1 year ago

This seems odd; it looks like an I/O error. Is there any chance that you ran out of storage space? Can you view the clean.full.iteration_5.tre.internal tree in something like FigTree?
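If storage is the suspect, a quick way to check free space on the output volume is the sketch below (not part of Gubbins; the directory is taken from the log above, so adjust it to your own run):

import shutil

# Directory that holds the Gubbins working files (from the log above)
usage = shutil.disk_usage("/home/usr/data/alignment/snippy_3x")
print(f"free: {usage.free / 1e9:.1f} GB of {usage.total / 1e9:.1f} GB")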

mudymudy commented 1 year ago

Sorry for the late reply! Yes, the problem was that I ran out of space. Thank you!