Closed Giuseppe1995 closed 2 years ago
Hi,
Sorry to learn you are experiencing difficulties with HiCUP. Have any BAM/SAM files been generated? Do they contain any alignments?
I have a few suggestions to try to solve the problem.
1) Try allocating more RAM to the job. (20 GB is excessive, but does that work?)
2) If not, are you able to map the FASTQ files to the reference genome using Bowtie2 only? This tells you whether Bowtie2 is working on your system.
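As a hedged sketch of suggestion 2 (the index prefix, file names, and thread count are placeholders, not taken from this thread):

```shell
# Check that Bowtie2 itself runs before involving HiCUP.
if command -v bowtie2 >/dev/null 2>&1; then
    STATUS="bowtie2 found: $(bowtie2 --version 2>/dev/null | head -n 1)"
    # Then try mapping one truncated file on its own, e.g. (placeholders):
    # bowtie2 -p 8 -x "$BWT_IDX" -U test_dataset1.trunc.fastq -S test1.sam
else
    STATUS="bowtie2 not found on PATH"
fi
echo "$STATUS"
```

If this step fails, the problem is with the Bowtie2 installation rather than with HiCUP.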
3) Could you try installing the software without conda? I do not maintain the conda install myself and cannot debug HiCUP installed via conda. For more details on installation, please see: https://www.bioinformatics.babraham.ac.uk/projects/hicup/read_the_docs/html/index.html#hicup-quick-start-guide
HiCUP is very easy to install. Basically you just need to download the Perl scripts and run them.
What system are you running this on?
Thanks.
Oh, the problem was with bowtie2! I ran `bowtie2 --help`
and got this error:
/srv/ngsdata/dalteriog/Tools/miniconda3/bin/bowtie2-align-s: error while loading shared libraries: libtbb.so.2: cannot open shared object file: No such file or directory (ERR): Description of arguments failed! Exiting now ...
which I resolved thanks to this issue. This was quite odd, because I had already run the whole hicup pipeline, but that was on another machine on the same server. Thank you so much!
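For anyone hitting the same missing-library error, it can be diagnosed with `ldd`; a minimal sketch (falling back to `ls` for illustration when `bowtie2-align-s` is absent; the `tbb` pin in the comment is a commonly reported conda fix, not verified here):

```shell
# List unresolved shared-library dependencies of a binary; any line
# containing "not found" names a missing library such as libtbb.so.2.
BIN=$(command -v bowtie2-align-s || command -v ls)
MISSING=$(ldd "$BIN" | grep -c "not found" || true)
echo "binary: $BIN, unresolved libraries: $MISSING"
# If libtbb.so.2 is the missing one, a commonly reported fix on
# conda setups is: conda install tbb=2020.2
```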
Hi, I am trying to run the hicup pipeline on some test data. I correctly truncated the reads via `hicup_truncater`, generating the files test_dataset1.trunc.fastq and test_dataset2.trunc.fastq (I am assuming these are the forward and reverse read sets, respectively). However, I'm getting an error when I run `hicup_mapper`:
hicup_mapper --bowtie2 $BOWTIE2 --index $BWT_IDX --format Illumina_1.5 --threads $NT test_dataset1.trunc.fastq test_dataset2.trunc.fastq
The variables are:
$BOWTIE2 --> path to the bowtie2 executable
$BWT_IDX --> path to the bowtie2 index (without the trailing .X.bt2)
$NT --> 8
I cannot develop in Perl, but I tried to take a look at line 552 of the script:
cat /srv/ngsdata/dalteriog/Tools/miniconda3/bin/hicup_mapper | awk 'NR==552{print}'
open( FORWARD, $fileForward ) or die "Can't read \'$fileForward\' : $!";
I am assuming that the problem is in the pairing of the produced mapping files, probably because they are not produced at all. Is there something I am forgetting?
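One quick sanity check before re-running `hicup_mapper` is whether $BWT_IDX actually points at a Bowtie2 index (the prefix below is a hypothetical placeholder):

```shell
# Count the Bowtie2 index files for the given prefix; a standard index
# has six *.bt2 files, so a count of 0 means the prefix is wrong.
BWT_IDX="genome_index"   # placeholder for the real index prefix
COUNT=$(ls "${BWT_IDX}"*.bt2 2>/dev/null | wc -l)
echo "found $COUNT .bt2 index files for prefix ${BWT_IDX}"
```

A wrong prefix would make the mapper die before any SAM/BAM pairing happens.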