PHI6kai opened this issue 1 year ago
Your batch size is too small. If you are using a CPU, set the batch size to 32; that should solve it.
> Your batch size is too small. If you are using a CPU, set the batch size to 32; that should solve it.

Hello, how can a model saved to workdir/models after training be used in demo_MLSD_flask.py? When I run it, I get the following error:

```
Traceback (most recent call last):
  File "demo_MLSD_flask.py", line 296, in <module>
    init_worker(args)
  File "demo_MLSD_flask.py", line 255, in init_worker
    model = model_graph(args)
  File "demo_MLSD_flask.py", line 86, in __init__
    self.model = self.load(args.model_dir, args.model_type)
  File "demo_MLSD_flask.py", line 105, in load
    torch_model.load_state_dict(torch.load(model_path, map_location=device), strict=True)
  File "C:\Users\ai\AppData\Roaming\Python\Python38\site-packages\torch\nn\modules\module.py", line 1671, in load_state_dict
    raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for MobileV2_MLSD_Tiny:
	Unexpected key(s) in state_dict: "block17.weight", "block17.bias".
	size mismatch for backbone.features.0.0.weight: copying a param with shape torch.Size([32, 3, 3, 3]) from checkpoint, the shape in current model is torch.Size([32, 4, 3, 3]).
```
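For what it's worth, that traceback means the checkpoint on disk and the model class being instantiated disagree: the checkpoint contains keys the model doesn't have ("block17.*"), and the first conv layer was trained with 3 input channels while the current model expects 4. A minimal sketch of how to diagnose and filter such a mismatch before loading (`TinyNet` here is a hypothetical stand-in, not the actual MobileV2_MLSD_Tiny):

```python
import torch
import torch.nn as nn

class TinyNet(nn.Module):
    """Hypothetical stand-in model; only the first conv's input channels vary."""
    def __init__(self, in_channels=3):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, 32, 3)

# Simulate a checkpoint trained with 3 input channels, plus a stale extra key
ckpt = TinyNet(in_channels=3).state_dict()
ckpt["block_extra.weight"] = torch.zeros(1)

# The current model definition expects 4 input channels
model = TinyNet(in_channels=4)
model_sd = model.state_dict()

# Keys present in the checkpoint but absent from the model ("unexpected key(s)")
unexpected = set(ckpt) - set(model_sd)
# Keys present in both but with incompatible shapes ("size mismatch")
mismatched = {k for k in ckpt.keys() & model_sd.keys()
              if ckpt[k].shape != model_sd[k].shape}
print("unexpected keys:", unexpected)
print("shape mismatches:", mismatched)

# strict=False tolerates missing/unexpected keys, but shape mismatches
# still raise, so filter those tensors out explicitly:
filtered = {k: v for k, v in ckpt.items()
            if k in model_sd and v.shape == model_sd[k].shape}
model.load_state_dict(filtered, strict=False)
```

Note that dropping mismatched tensors only makes the load succeed; the skipped layers keep their random initialization. The real fix is to make the model config used at inference time match the one used for training (same model_type, same input channels).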
Bro, I am facing the same issue, please help me resolve it.
I would like to ask whether my dataset was prepared correctly or something else went wrong, because my dataset is built according to the wireframe format. ![image](https://github.com/lhwcv/mlsd_pytorch/assets/50110789/cfe3e811-695d-4c4f-9e2a-f28318591586)