Alibaba-MIIL / ML_Decoder

Official PyTorch implementation of "ML-Decoder: Scalable and Versatile Classification Head" (2021)
MIT License

How to run inference on CPU? #42

Closed Bru-Souza closed 2 years ago

Bru-Souza commented 2 years ago

Dear authors, is it possible to run inference with the model without a GPU? How can I do it?

mrT23 commented 2 years ago

Transform the model first using the InplacABN_to_ABN and fuse_bn2d_bn1d_abn functions in: https://github.com/Alibaba-MIIL/Solving_ImageNet/blob/7eb8e3b17f595eb42b7e80f70ef799b9af0f43a8/kd/kd_utils.py#L21

Then it will also be suitable for CPU.
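
For reference, a minimal sketch of that conversion, assuming the two helpers are copied from the linked kd/kd_utils.py and that each takes a model and returns the transformed model (adjust the import to wherever you place that file):

import torch
# Assumption: helpers copied from the linked Solving_ImageNet kd/kd_utils.py
from kd.kd_utils import InplacABN_to_ABN, fuse_bn2d_bn1d_abn

model = InplacABN_to_ABN(model)    # replace InPlace-ABN layers with plain ABN
model = fuse_bn2d_bn1d_abn(model)  # fuse the BN/ABN layers
model = model.cpu().eval()         # move to CPU for inference

dummy_batch = torch.rand(1, 3, 448, 448)  # placeholder input; use your preprocessed image tensor
with torch.no_grad():
    probs = torch.sigmoid(model(dummy_batch))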

victoic commented 2 years ago

Hello! I'm encountering the same issue with inference on cpu.

model_test = InplacABN_to_ABN(model_test)     # replace InPlace-ABN layers with plain ABN
model_test = fuse_bn_recursively(model_test)  # fuse BN layers into the preceding convs
model_test = model_test.cpu().eval()          # move to CPU and switch to eval mode

After transforming the model I encounter the following error, as if the model remains on CUDA. Even more strangely, it seems to happen at an F.pad call inside the anti-aliasing class.


RuntimeError                              Traceback (most recent call last)
<ipython-input-...> in <module>()
     42 print("Will show!")
     43 show_img = True
---> 44 pred = infer(args, model_test, classes_list, gt, show_img)
     45 conf_mat[gt][pred] += 1
     46 #print(pred, gt, pred == gt)

13 frames

<ipython-input-...> in infer(args, model, classes_list, gt, show_img)
     32 tensor_img = torch.from_numpy(np_img).permute(2, 0, 1).float() / 255.0  # HWC to CHW
     33 tensor_batch = torch.unsqueeze(tensor_img, 0).cpu()  # float16 inference
---> 34 output = torch.squeeze(torch.sigmoid(model(tensor_batch)))
     35 np_output = output.cpu().detach().numpy()
     36

[here and below, the intervening torch/nn/modules/module.py (_call_impl) and torch/nn/modules/container.py (Sequential.forward) frames are omitted]

/content/ML_Decoder/src_files/models/tresnet/tresnet.py in forward(self, x)
    202
    203     def forward(self, x):
--> 204         x = self.body(x)
    205         self.embeddings = self.global_pool(x)
    206         logits = self.head(self.embeddings)

/content/ML_Decoder/src_files/models/tresnet/tresnet.py in forward(self, x)
    118
    119         out = self.conv1(x)
--> 120         out = self.conv2(out)
    121         if self.se is not None: out = self.se(out)
    122

/content/ML_Decoder/src_files/models/tresnet/layers/anti_aliasing.py in forward(self, x)
     16
     17     def forward(self, x):
---> 18         return self.op(x)
     19
     20

/content/ML_Decoder/src_files/models/tresnet/layers/anti_aliasing.py in __call__(self, input)
     38         self.filt.to(input.device)
     39         self.filt = self.filt.float()
---> 40         input_pad = F.pad(input, (1, 1, 1, 1), 'reflect')
     41         return F.conv2d(input_pad, self.filt, stride=2, padding=0, groups=input.shape[1])
     42

RuntimeError: Input type (torch.FloatTensor) and weight type (torch.cuda.FloatTensor) should be the same or input should be a MKLDNN tensor and weight is a dense tensor
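
For what it's worth, the traceback itself hints at why .cpu() is not enough: line 38 of anti_aliasing.py calls self.filt.to(input.device) without assigning the result, so the anti-aliasing filter stays on the GPU even after the model is moved. A speculative workaround sketch, not an official fix: the op and filt attribute names are taken from the traceback above, and it assumes the filter is a plain tensor attribute rather than a registered buffer.

import torch

def move_aa_filters_to_cpu(model: torch.nn.Module) -> torch.nn.Module:
    # .cpu() moves registered parameters and buffers, but not plain tensor
    # attributes such as the anti-aliasing filter reached as self.op.filt above.
    model = model.cpu().eval()
    for module in model.modules():
        op = getattr(module, "op", None)    # anti-aliasing helper object, if any
        filt = getattr(op, "filt", None)
        if torch.is_tensor(filt) and filt.is_cuda:
            op.filt = filt.cpu()            # move the stray filter by hand
    return model

model_test = move_aa_filters_to_cpu(model_test)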