snijderlab / stitch

Template-based assembly of proteomics short reads for de novo antibody sequencing and repertoire profiling

attempting to run with fasta file of peptides #154

Closed avilella closed 2 years ago

avilella commented 2 years ago

I've created a batchfile like the one in monoclonal.txt, but changed the top part to this:

-Run Info---------------
Version    : 1.0
Runname    : MASS003

- Here the input can be defined, this will be used in the TemplateMatching and Recombine steps
Input ->
    Fasta ->
        Path     : ..\datasets\P1186_inv_clgGLB.fasta
        Name     : 01
    <-
<-

Here is what my fasta file looks like; it has about 150 entries:

>1
VDKPVPK
>2
DTLLIAR
>3
DTLLIAR
>4
DTLLIAR
>5
DTLLIAR
>6
DTLLIAR
>7
ALPSPIER
>8
ALPSPIER
>9
ALPSPIER
>10
ALPSPIER
>11
ALPSPIER
>12
ALPSPIER
>13
TKVDKPVPK
>14
TKVDKPVPK
>15
TKVDKPVPK
>16
VPRPPDCPK
>17
VPRPPDCPK
>18
VPRPPDCPK

When I run stitch in Windows, I get the following error:

C:\Users\Albert Vilella\Downloads\stitch-v1.0.0-windows>.\stitch.exe batchfiles\MASS003.txt
[progress bar output:   0%  30 ms ...  25%  297 ms ...  25%  1.0 s]
ERROR: One or more errors occurred. (capacity was less than the current size. (Parameter 'value'))
STACKTRACE:    at System.Threading.Tasks.TaskReplicator.Run[TState](ReplicatableUserAction`1 , ParallelOptions , Boolean )
   at System.Threading.Tasks.Parallel.ForWorker[TLocal](Int32 , Int32 , ParallelOptions , Action`1 , Action`2 , Func`4 , Func`1 , Action`1 )
--- End of stack trace from previous location ---
   at System.Threading.Tasks.Parallel.ThrowSingleCancellationExceptionOrOtherException(ICollection , CancellationToken , Exception )
   at System.Threading.Tasks.Parallel.ForWorker[TLocal](Int32 , Int32 , ParallelOptions , Action`1 , Action`2 , Func`4 , Func`1 , Action`1 )
   at System.Threading.Tasks.Parallel.ForEachWorker[TSource,TLocal](IList`1 , ParallelOptions , Action`1 , Action`2 , Action`3 , Func`4 , Func`5 , Func`1 , Action`1 )
   at System.Threading.Tasks.Parallel.ForEachWorker[TSource,TLocal](IEnumerable`1 , ParallelOptions , Action`1 , Action`2 , Action`3 , Func`4 , Func`5 , Func`1 , Action`1 )
   at System.Threading.Tasks.Parallel.ForEach[TSource](IEnumerable`1 , ParallelOptions , Action`1 )
   at AssemblyNameSpace.RunParameters.SingleRun.RunTemplateMatching()
   at AssemblyNameSpace.RunParameters.SingleRun.Calculate()
   at AssemblyNameSpace.ToRunWithCommandLine.RunBatchFile(String filename)
   at AssemblyNameSpace.ToRunWithCommandLine.Main()
SOURCE: System.Threading.Tasks.Parallel
TARGET: Void Run[TState](ReplicatableUserAction`1, System.Threading.Tasks.ParallelOptions, Boolean)

C:\Users\Albert Vilella\Downloads\stitch-v1.0.0-windows>
douweschulte commented 2 years ago

Thanks so much for the error report. I have seen this error message before; somewhere deep in the multithreaded code there is a small bug. I will try to find and fix it soon, but in the meantime it tends to work if you retry a couple of times. Alternatively, you could lower the maximum number of cores used to prevent multithreading issues. This can be done with the MaxCores key in the configuration file, which can be placed on the line below the Runname.
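For reference, a minimal sketch of what the top of such a batchfile could look like with the core count capped, assuming the MaxCores key follows the same `Key : Value` syntax as the other Run Info entries and accepts a plain integer:

-Run Info---------------
Version    : 1.0
Runname    : MASS003
MaxCores   : 1

With MaxCores set to 1 the template matching should run on a single thread, which should sidestep the multithreading issue at the cost of a longer runtime.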

douweschulte commented 2 years ago

I fixed the exact error you reported, so the program can no longer crash for the same reason. If it still crashes, feel free to send another bug report. You can find the newest version of the program with this fix applied here: https://github.com/snijderlab/stitch/actions/runs/2114100635. The fix will also be included in the next release.

Thanks again for the bug report!