wdecoster / NanoPlot

Plotting scripts for long read sequencing data
http://nanoplot.bioinf.be
MIT License
401 stars 48 forks

Problem running NanoPlot #336

Closed — gosalone closed this 10 months ago

gosalone commented 10 months ago

Hi @wdecoster

I installed NanoPlot via conda. The tool installed fine; I checked the help and version options and everything was OK. But when I ran the tool I got the error below:

libgcc_s.so.1 must be installed for pthread_cancel to work
Abandon (core dumped)   [French locale; equivalent to "Aborted (core dumped)"]

libgcc is already installed on our server:

ls /lib/libgcc*
/lib/libgcc_s-4.8.5-20150702.so.1  /lib/libgcc_s.so.1

The command I used:

NanoPlot --fastq_rich barcode.fastq.gz --title barcode --N50 --no_static --outdir Rapport_QC_Nanopore

Note that NanoFilt, which I also installed, works without any error; NanoPlot and NanoStat, on the other hand, both hit this libgcc problem.

wdecoster commented 10 months ago

I have never seen anything like that. Googling showed that the same error message is reported for various other tools, so I don't think this is something specific to NanoPlot.

One suggestion that I saw: try conda install compilers.
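[Editor's note: two workarounds commonly reported for this `libgcc_s.so.1` error in conda environments, neither verified on the reporter's system, are to install a recent libgcc into the environment itself or to make the environment's own lib directory visible to the dynamic linker:]

```shell
# Hypothetical workarounds (not from this thread):
# 1. ship a recent libgcc inside the active conda environment
conda install -c conda-forge libgcc-ng

# 2. or make the environment's lib directory the first place
#    the dynamic linker looks for shared libraries
export LD_LIBRARY_PATH="$CONDA_PREFIX/lib:$LD_LIBRARY_PATH"
```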

Cheers. Wouter

gosalone commented 10 months ago

I ran the compilers command, but I still get the same error.

Can NanoPlot be installed via Singularity?

wdecoster commented 10 months ago

I have never done so, but I think it should be possible.

https://biocontainer-doc.readthedocs.io/en/latest/source/nanoplot/nanoplot.html

Or with docker...
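[Editor's note: a sketch of what the Docker route might look like; the image tag is one published on the Biocontainers quay.io registry, and the mounted paths are placeholders for your own data:]

```shell
# Mount the current directory into the container and run NanoPlot from it
docker run --rm -v "$PWD":/data \
    quay.io/biocontainers/nanoplot:1.41.6--pyhdfd78af_0 \
    NanoPlot --fastq /data/barcode.fastq.gz --outdir /data/Rapport_QC_Nanopore
```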

gosalone commented 10 months ago

I installed NanoPlot via a Singularity image (SIF):

./singularity pull --name NanoPlot.sif docker://quay.io/biocontainers/nanoplot:1.41.6--pyhdfd78af_0

./singularity exec NanoPlot.sif NanoPlot --version
1.41.6

NanoPlot installed fine, but when launching it I see a strange error:

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/bin/NanoPlot", line 10, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.10/site-packages/nanoplot/NanoPlot.py", line 61, in main
    datadf = get_input(
  File "/usr/local/lib/python3.10/site-packages/nanoget/nanoget.py", line 110, in get_input
    dfs=[out for out in executor.map(extraction_function, files)],
  File "/usr/local/lib/python3.10/site-packages/nanoget/nanoget.py", line 110, in <listcomp>
    dfs=[out for out in executor.map(extraction_function, files)],
  File "/usr/local/lib/python3.10/concurrent/futures/process.py", line 575, in _chain_from_iterable_of_lists
    for element in iterable:
  File "/usr/local/lib/python3.10/concurrent/futures/_base.py", line 621, in result_iterator
    yield _result_or_cancel(fs.pop())
  File "/usr/local/lib/python3.10/concurrent/futures/_base.py", line 319, in _result_or_cancel
    return fut.result(timeout)
  File "/usr/local/lib/python3.10/concurrent/futures/_base.py", line 458, in result
    return self.__get_result()
  File "/usr/local/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
    raise self._exception
TypeError: Cannot use .astype to convert from timezone-aware dtype to timezone-naive dtype. Use obj.tz_localize(None) or obj.tz_convert('UTC').tz_localize(None) instead.
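[Editor's note: the TypeError at the bottom of the traceback is a pandas restriction: a timezone-aware datetime column can no longer be cast to a timezone-naive dtype with `.astype`. A minimal standalone sketch of the situation and the conversion pandas itself suggests (not NanoPlot's actual code):]

```python
import pandas as pd

# A timezone-aware datetime series, as produced when parsing
# timestamps that carry a UTC offset.
aware = pd.Series(pd.to_datetime(["2023-05-01T12:34:56+00:00",
                                  "2023-05-01T12:35:10+00:00"]))
assert aware.dt.tz is not None  # tz-aware (UTC)

# aware.astype("datetime64[ns]") would raise the TypeError seen above;
# the supported conversion drops the timezone explicitly:
naive = aware.dt.tz_localize(None)
assert naive.dt.tz is None
```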

gosalone commented 10 months ago

Great, it works without any errors via Singularity when I use --fastq instead of --fastq_rich. Do you have any idea why, and will there be a difference in the results?

wdecoster commented 10 months ago

Hrm, that timezone problem should already be fixed (https://github.com/wdecoster/NanoPlot/issues/329). Weird. If you use --fastq_rich you will get more plots, as it also parses the time and channel information from the fastq description line. Do you have a sequencing_summary file? Those should also work.
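[Editor's note: for context, a "rich" ONT fastq description line carries key=value fields, which is where --fastq_rich gets the time and channel information. A minimal sketch of extracting those two fields from such a line; the read id and values below are invented for illustration:]

```python
import re
from datetime import datetime, timezone

# Hypothetical ONT-style rich fastq header (values made up).
header = "@read1 runid=abc123 read=7 ch=112 start_time=2023-05-01T12:34:56Z"

# Channel number.
channel = int(re.search(r"\bch=(\d+)", header).group(1))

# Start time; replace the trailing Z so fromisoformat
# accepts it on Python versions before 3.11.
raw = re.search(r"\bstart_time=(\S+)", header).group(1)
start = datetime.fromisoformat(raw.replace("Z", "+00:00"))

assert channel == 112
assert start == datetime(2023, 5, 1, 12, 34, 56, tzinfo=timezone.utc)
```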

gosalone commented 10 months ago

I installed it via Singularity (SIF built from the Docker image); maybe it is not fixed there? But it's weird, because it's the latest version I have.

I don't have the sequencing_summary file and don't currently need the time and channel information, so I think I will use this version for now, until the container gets an update.

Thank you @wdecoster