Closed: EricLee0224 closed this issue 2 years ago.
Hi @EricLee0224, if you didn't change the structure of the MVTec dataset, you should change your path to CutPaste-Li/mvtec/bottle/train.
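For reference, assuming the standard MVTec AD layout (each category folder such as bottle contains train/good, test, and ground_truth subfolders), the invocation with the corrected path would then look like this:

$ python train.py --dataset_path CutPaste-Li/mvtec/bottle/train --num_class 3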
Hello! Thank you for your code contribution! I followed the setup instructions to run train.py, but I encountered the following problem:
$ python train.py --dataset_path CutPaste-Li/mvtec/bottle --num_class 3
Missing logger folder: tb_logs/exp1
Traceback (most recent call last):
  File "train.py", line 103, in <module>
    model = CutPaste(hparams = args)
  File "train.py", line 19, in __init__
    self.model = CutPasteNet(encoder = hparams.encoder, pretrained = hparams.pretrained, dims = hparams.dims, num_class = hparams.num_class)
  File "/home/lwz/CutPaste-Li/model.py", line 49, in __init__
    super().__init__(encoder, pretrained, dims, num_class)
  File "/home/lwz/CutPaste-Li/model.py", line 23, in __init__
    self.out = nn.Linear(dims[-1], num_class)
  File "/home/lwz/.conda/envs/cutpasteli/lib/python3.8/site-packages/torch/nn/modules/linear.py", line 85, in __init__
    self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
TypeError: empty() received an invalid combination of arguments - got (tuple, dtype=NoneType, device=NoneType), but expected one of:
 - (tuple of ints size, *, tuple of names names, torch.memory_format memory_format, torch.dtype dtype, torch.layout layout, torch.device device, bool pin_memory, bool requires_grad)
 - (tuple of ints size, *, torch.memory_format memory_format, Tensor out, torch.dtype dtype, torch.layout layout, torch.device device, bool pin_memory, bool requires_grad)
I am really confused about this. Could you please give me some advice? Is this a problem with the installed package versions?
Have you solved the problem yet? Can you give me some advice?
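For what it's worth, this particular TypeError is raised by torch.empty when the size tuple it receives contains something that is not a plain Python int, for example a value that argparse left as a string. A minimal sketch (hypothetical sizes, not taken from this repo) that reproduces the same message:

import torch.nn as nn

# With plain ints the layer is constructed normally:
nn.Linear(512, 3)

# If a size such as num_class arrives as the string "3" instead of the int 3,
# nn.Linear forwards it to torch.empty((out_features, in_features), ...) and raises
# TypeError: empty() received an invalid combination of arguments - got (tuple, ...)
nn.Linear(512, "3")

So, independent of the dataset path, it may be worth checking that dims and num_class reach nn.Linear as integers.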
(The original report and traceback, quoted again in Chinese.)
I have the same problem as you. Have you solved it?