I am building a phospho spectral library and DIA-NN crashes every time I run it. Here are the parameters I am using:
Missed cleavages: 1
Max variable modifications: 3, with only phospho and cysteine (C) carbamidomethylation checked.
Precursor charge: 2-3
Peptide length: 7-30
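For reference, here is a rough command-line equivalent of the GUI settings above, in case that helps reproduce it. This is only a sketch: the flag names follow the DIA-NN command-line documentation, the FASTA path is a placeholder, and I am assuming carbamidomethylation is set as a variable mod to match the GUI checkboxes.

```shell
# Hypothetical CLI sketch of the GUI settings above (placeholder FASTA path).
# UniMod:21 = phospho (S/T/Y), UniMod:4 = carbamidomethyl (C).
# --predictor --gen-spec-lib --fasta-search enable in-silico library prediction.
diann.exe --fasta proteome.fasta --fasta-search --predictor --gen-spec-lib \
  --missed-cleavages 1 \
  --var-mods 3 \
  --var-mod UniMod:21,79.966331,STY \
  --var-mod UniMod:4,57.021464,C \
  --min-pr-charge 2 --max-pr-charge 3 \
  --min-pep-len 7 --max-pep-len 30
```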
I am also getting this error:
Libtorch error:
The following operation failed in the TorchScript interpreter.
Traceback of TorchScript, serialized code (most recent call last):
  File "code/torch.py", line 51, in forward
    hidden = torch.zeros([_9, _10, hidden_size], dtype=None, layout=None, device=torch.device(device0))
    rnn = self.rnn
    output, hidden0, = (rnn).forward__0(conv1, hidden, )
    prepare = self.prepare
    prepare0 = torch.rrelu((prepare).forward(output, ))
  File "code/__torch__/torch/nn/modules/rnn.py", line 50, in forward__0
    _flat_weights = self._flat_weights
    training = self.training
    _3, _4 = torch.gru(input, hx0, _flat_weights, True, 2, 0.29999999999999999, training, True, True)
             ~~~~~~~~~ <--- HERE
    _5 = (_3, (self).permute_hidden(_4, None, ))
    return _5

Traceback of TorchScript, original code (most recent call last):
  File "<ipython-input-16-0dd9b29dfda4>", line 32, in forward
    hidden = torch.zeros(self.directions * self.layers, input.size(0), self.hidden_size, device = self.device)
    output, hidden = self.rnn(conv, hidden)
                     ~~~~~~~~ <--- HERE
    prepare = torch.rrelu(self.prepare(output))
    out = self.out(prepare)
  File "C:\ProgramData\Anaconda3\lib\site-packages\torch\nn\modules\rnn.py", line 849, in forward__0
    self.check_forward_args(input, hx, batch_sizes)
    if batch_sizes is None:
        result = _VF.gru(input, hx, self._flat_weights, self.bias, self.num_layers,
                 ~~~~~~~ <--- HERE
                         self.dropout, self.training, self.bidirectional, self.batch_first)
    else: