Closed: Umair1441 closed this issue 1 year ago.
Can you give a bit more background - how are you running it, in a conda environment? And what exactly is the command you are running?
Hi, yes, I am using conda 23.5.2. When I run the command "slamdunk all -r ref.fa -b actb.bed -o output -rl 100 -mbq 27 -5 0 reads.fq" it creates the output directory with the following folders: 1. count, 2. filter, 3. map, 4. snp, and creates different files in them.
Can you guide me on what the next step is? Thank you.
Do the files ref.fa, actb.bed, and reads.fq exist in the directory where you are running the command?
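A quick way to check (just a shell sketch, assuming a standard Linux shell; the filenames are the ones from your command) is to run the following from the directory where you invoke slamdunk - any missing file will be reported as "No such file or directory":
ls -lh ref.fa actb.bed reads.fq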
yes
Are there non-empty files created in the output folders?
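For example (again just a shell sketch, not a slamdunk command), this lists every non-empty file under the output directory together with its size; an empty result would mean the folders were created but nothing was actually written into them:
find output -type f -size +0c -exec ls -lh {} +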
No, those are not non-empty files. When I run it on my original data with this command "slamdunk all -r hg19/hg19.fa.gz -b hg19/Hg.bed -o output -rl 100 -mbq 27 -5 0 NEA1_1_EKDL230012705-1A_HJKV3DSX7_L3_1.fq.gz"
it gives the following error and only creates the map folder in the output.
Creating output directory: output
Creating output directory: output/map
Running slamDunk map for 1 files (1 threads)
Traceback (most recent call last):
File "/home/umair/miniconda3/envs/slam2/bin/slamdunk", line 10, in
Could you help me, please?
Can you run this command and see if it runs through?
ngm -r hg19/hg19.fa.gz -q NEA1_1_EKDL230012705-1A_HJKV3DSX7_L3_1.fq.gz -t 1 --no-progress --slam-seq 2 --max-polya 4 -l --rg-id 0 --rg-sm NEA1_1_EKDL230012705-1A_HJKV3DSX7_L3_1.fq:pulse:0 -o output/map/NEA1_1_EKDL230012705-1A_HJKV3DSX7_L3_1.fq_slamdunk_mapped.sam
My suspicion is that you need to unzip the fasta file
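For example (assuming gzip is available; the -k flag keeps the original .gz and needs a reasonably recent gzip, otherwise plain gunzip also works but removes the archive):
gunzip -k hg19/hg19.fa.gz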
Yes, I have now unzipped my fasta file and it gives the same error. When I run "ngm -r hg19/hg19.fa -q NEA1_1_EKDL230012705-1A_HJKV3DSX7_L3_1.fq -t 1 --no-progress --slam-seq 2 --max-polya 4 -l --rg-id 0 --rg-sm NEA1_1_EKDL230012705-1A_HJKV3DSX7_L3_1:pulse:0 -o output/map/NEA1_1_EKDL230012705-1A_HJKV3DSX7_L3_1_slamdunk_mapped.sam"
it gives the following output. My computer has 4 GB of RAM.
[MAIN] NextGenMap 0.5.5
[MAIN] Startup : x64 (build Jul 15 2018 19:15:59)
[MAIN] Starting time: 2023-08-18.11:00:09
[CONFIG] Parameter: --affine 0 --argos_min_score 0 --bin_size 2 --block_multiplier 2 --broken_pairs 0 --bs_cutoff 6 --bs_mapping 0 --cpu_threads 1 --dualstrand 1 --fast 0 --fast_pairing 0 --force_rlength_check 0 --format 1 --gap_extend_penalty 5 --gap_read_penalty 20 --gap_ref_penalty 20 --hard_clip 0 --keep_tags 0 --kmer 13 --kmer_min 0 --kmer_skip 2 --local 1 --match_bonus 10 --match_bonus_tc 2 --match_bonus_tt 10 --max_cmrs 2147483647 --max_equal 1 --max_insert_size 1000 --max_polya 4 --max_read_length 0 --min_identity 0.650000 --min_insert_size 0 --min_mq 0 --min_residues 0.500000 --min_score 0.000000 --mismatch_penalty 15 --mode 0 --no_progress 1 --no_unal 0 --ocl_threads 1 --output output/map/NEA1_1_EKDL230012705-1A_HJKV3DSX7_L3_1_slamdunk_mapped.sam --overwrite 1 --pair_score_cutoff 0.900000 --paired 0 --parse_all 1 --pe_delimiter / --qry NEA1_1_EKDL230012705-1A_HJKV3DSX7_L3_1.fq --qry_count -1 --qry_start 0 --ref hg19/hg19.fa --ref_mode -1 --rg_id 0 --rg_sm NEA1_1_EKDL230012705-1A_HJKV3DSX7_L3_1:pulse:0 --sensitive 0 --silent_clip 0 --skip_mate_check 0 --skip_save 0 --slam_seq 2 --step_count 4 --strata 0 --topn 1 --trim5 0 --update_check 0 --very_fast 0 --very_sensitive 0
[NGM] Opening for output (SAM): output/map/NEA1_1_EKDL230012705-1A_HJKV3DSX7_L3_1_slamdunk_mapped.sam
[SEQPROV] Reading encoded reference from hg19/hg19.fa-enc.2.ngm
[SEQPROV] Reading 3137 Mbp from disk took 5.81s
[PREPROCESS] Building reference table
[PREPROCESS] Allocated 1 hashtable units (tableLocMax=2^32.000000, genomeSize=2^31.546856)
[PREPROCESS] Building RefTable #0 (kmer length: 13, reference skip: 2)
[PREPROCESS] Number of k-mers: 67108865
[PREPROCESS] Counting kmers took 111.01s
[PREPROCESS] Average number of positions per prefix: 17.554293
[PREPROCESS] Index size: 335544325 byte (67108865 x 5)
[PREPROCESS] Generating index took 4.12s
Killed
I see - that looks like you have too little memory. What's your machine's memory capacity?
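The "Killed" at the end of the ngm output usually means the kernel's out-of-memory killer terminated the process. If you want to confirm that on your side (assuming a Linux machine; dmesg may require sudo), you could run:
free -h
sudo dmesg | grep -i -E 'out of memory|killed process'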
My machine has 4 GB of RAM and an Intel(R) Core(TM) i3-3110M CPU @ 2.40GHz.
Ok that is a problem - for the human genome you need more memory
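If more memory is not available right away, one possible sanity check (my suggestion only, not an official slamdunk workflow, and assuming samtools is installed and the sequence really is named chr1 in your fasta) is to map against a single chromosome, which needs far less RAM:
samtools faidx hg19/hg19.fa chr1 > chr1.fa
slamdunk all -r chr1.fa -b hg19/Hg.bed -o output -rl 100 -mbq 27 -5 0 NEA1_1_EKDL230012705-1A_HJKV3DSX7_L3_1.fq
Reads from other chromosomes will not map, and the bed file would ideally be restricted to chr1 entries, so this only verifies that the pipeline runs; it is not a substitute for the real analysis.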
ok
Hi, I am trying to run slamdunk and it gives this error: RuntimeError: One or more input files don't exist: ['ref.fa', 'files']
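As the discussion above suggests, this RuntimeError means the paths given on the command line cannot be found from the directory where slamdunk is run: either start it from the directory containing the files or pass full paths, for example (hypothetical paths):
slamdunk all -r /full/path/to/ref.fa -b /full/path/to/actb.bed -o output -rl 100 -mbq 27 -5 0 /full/path/to/reads.fq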