PengNi / ccsmeth

Detecting DNA methylation from PacBio CCS reads
BSD 3-Clause Clear License

Modify the number of the layers #36

Open RahelehSalehi opened 1 year ago

RahelehSalehi commented 1 year ago

Hi, thanks for your code again. I changed the number of layers from 3 to 1 for my data, and when I try to test the retrained model, I get this error:

```
RuntimeError: Error(s) in loading state_dict for ModelAttRNN:
    Missing key(s) in state_dict: "rnn.weight_ih_l1", "rnn.weight_hh_l1", "rnn.bias_ih_l1", "rnn.bias_hh_l1", "rnn.weight_ih_l1_reverse", "rnn.weight_hh_l1_reverse", "rnn.bias_ih_l1_reverse", "rnn.bias_hh_l1_reverse".
    Unexpected key(s) in state_dict: "fc.weight", "fc.bias".
```

It seems that the model doesn't have any fc layer. Could you please help me figure out how to solve this problem? Thanks a lot.

PengNi commented 1 year ago

Hi, thanks for trying ccsmeth. What are your commands to train and test the model? If you set --layer_rnn to 1 in both commands, there shouldn't be errors.
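For illustration, here is a minimal PyTorch sketch (not ccsmeth's actual code) of why the layer count used at training time must match the one used at test time: the keys in an RNN's `state_dict` are named per layer (`weight_ih_l0`, `weight_ih_l1`, ...), so a checkpoint saved from a 1-layer model is missing the `l1`/`l2` keys a 3-layer model expects, which triggers exactly this kind of `RuntimeError`.

```python
import torch.nn as nn

# Checkpoint from a 1-layer bidirectional GRU (analogous to training
# with --layer_rnn 1).
trained = nn.GRU(input_size=4, hidden_size=8, num_layers=1, bidirectional=True)
checkpoint = trained.state_dict()

# Loading it into a 3-layer model fails: the checkpoint has no
# weight_ih_l1 / weight_ih_l2 (and _reverse) keys.
model3 = nn.GRU(input_size=4, hidden_size=8, num_layers=3, bidirectional=True)
try:
    model3.load_state_dict(checkpoint)
except RuntimeError as e:
    print(e)  # "Missing key(s) in state_dict: ..."

# Rebuilding the model with the same num_layers loads cleanly.
model1 = nn.GRU(input_size=4, hidden_size=8, num_layers=1, bidirectional=True)
model1.load_state_dict(checkpoint)
```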

fka21 commented 5 months ago

Any updates on this issue? I seem to have hit the same hurdle when running `call_mods` with default settings.

The error message:

```
2024-04-26 17:22:10 - INFO - extract_features process-1707028 starts
Process Process-6:
Traceback (most recent call last):
  File "/home/ferenc.kagan/miniconda3/envs/ccsmeth/lib/python3.10/site-packages/ccsmeth/call_modifications.py", line 347, in _call_mods_q
    model.load_state_dict(model_dict)
  File "/home/ferenc.kagan/miniconda3/envs/ccsmeth/lib/python3.10/site-packages/torch/nn/modules/module.py", line 2152, in load_state_dict
    raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for ModelAttRNN:
    size mismatch for embed.weight: copying a param with shape torch.Size([16, 4]) from checkpoint, the shape in current model is torch.Size([5, 8]).
    size mismatch for rnn.weight_ih_l0: copying a param with shape torch.Size([768, 7]) from checkpoint, the shape in current model is torch.Size([768, 11]).
    size mismatch for rnn.weight_ih_l0_reverse: copying a param with shape torch.Size([768, 7]) from checkpoint, the shape in current model is torch.Size([768, 11]).

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/ferenc.kagan/miniconda3/envs/ccsmeth/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/home/ferenc.kagan/miniconda3/envs/ccsmeth/lib/python3.10/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/home/ferenc.kagan/miniconda3/envs/ccsmeth/lib/python3.10/site-packages/ccsmeth/call_modifications.py", line 356, in _call_mods_q
    model.load_state_dict(para_dict_new)
  File "/home/ferenc.kagan/miniconda3/envs/ccsmeth/lib/python3.10/site-packages/torch/nn/modules/module.py", line 2152, in load_state_dict
    raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for ModelAttRNN:
    Missing key(s) in state_dict: "embed.weight", "rnn.weight_ih_l0", "rnn.weight_hh_l0", "rnn.bias_ih_l0", "rnn.bias_hh_l0", "rnn.weight_ih_l0_reverse", "rnn.weight_hh_l0_reverse", "rnn.bias_ih_l0_reverse", "rnn.bias_hh_l0_reverse", "rnn.weight_ih_l1", "rnn.weight_hh_l1", "rnn.bias_ih_l1", "rnn.bias_hh_l1", "rnn.weight_ih_l1_reverse", "rnn.weight_hh_l1_reverse", "rnn.bias_ih_l1_reverse", "rnn.bias_hh_l1_reverse", "rnn.weight_ih_l2", "rnn.weight_hh_l2", "rnn.bias_ih_l2", "rnn.bias_hh_l2", "rnn.weight_ih_l2_reverse", "rnn.weight_hh_l2_reverse", "rnn.bias_ih_l2_reverse", "rnn.bias_hh_l2_reverse", "fc1.weight", "fc1.bias", "_att3.Wa.weight", "_att3.Ua.weight", "_att3.va.weight".
    Unexpected key(s) in state_dict: "eight", "ght_ih_l0", "ght_hh_l0", "s_ih_l0", "s_hh_l0", "ght_ih_l0_reverse", "ght_hh_l0_reverse", "s_ih_l0_reverse", "s_hh_l0_reverse", "ght_ih_l1", "ght_hh_l1", "s_ih_l1", "s_hh_l1", "ght_ih_l1_reverse", "ght_hh_l1_reverse", "s_ih_l1_reverse", "s_hh_l1_reverse", "ght_ih_l2", "ght_hh_l2", "s_ih_l2", "s_hh_l2", "ght_ih_l2_reverse", "ght_hh_l2_reverse", "s_ih_l2_reverse", "s_hh_l2_reverse", "ght", "s", "a.weight".
```
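One observation on the second traceback: the unexpected keys ("eight", "ght_ih_l0", "s_ih_l0", ...) look like the original parameter names with their first seven characters cut off ("embed.weight" → "eight", "rnn.weight_ih_l0" → "ght_ih_l0"). That is the pattern produced when a fixed-length `"module."` DataParallel prefix is stripped from keys that never had it. The shape mismatches in the first traceback separately suggest the checkpoint and the current model were built with different feature/embedding options. As a hypothetical sketch (not ccsmeth's actual code), a prefix strip that only fires when the prefix is really present avoids mangling keys:

```python
def strip_module_prefix(state_dict):
    """Remove a DataParallel "module." prefix from state_dict keys,
    leaving keys without the prefix untouched."""
    prefix = "module."
    return {
        (k[len(prefix):] if k.startswith(prefix) else k): v
        for k, v in state_dict.items()
    }

# Keys without the prefix pass through unchanged; prefixed keys are cleaned.
checkpoint = {"rnn.weight_ih_l0": 1, "module.fc1.weight": 2}
print(strip_module_prefix(checkpoint))
```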

Any help would be greatly appreciated.