walkwithfastai / walkwithfastai.github.io

Host for https://walkwithfastai.com

Model can't predict the data because of device mismatch #45

Closed sakibsadmanshajib closed 2 years ago

sakibsadmanshajib commented 3 years ago

In the notebook https://github.com/walkwithfastai/walkwithfastai.github.io/blob/ffcc0b98c2b4d62777e042cb722a5f47c4f40702/nbs/03_tab.ae.ipynb, under the section "Getting the compressed representations", the following code fails with a device-mismatch error:

outs = []
for batch in dl:
    with torch.no_grad():
        learn.model.eval()
        learn.model.cuda()
        out = learn.model(*batch[:2], encode=True).cpu().numpy()
        outs.append(out)
outs = np.concatenate(outs)
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-29-42e3a106393f> in <module>()
      4         learn.model.eval()
      5         learn.model.cuda()
----> 6         out = learn.model(*batch[:2], encode=True).cpu().numpy()
      7         outs.append(out)
      8 outs = np.concatenate(outs)

6 frames
/usr/local/lib/python3.7/dist-packages/torch/nn/modules/module.py in _call_impl(self, *input, **kwargs)
   1049         if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
   1050                 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1051             return forward_call(*input, **kwargs)
   1052         # Do not call functions when jit is used
   1053         full_backward_hooks, non_full_backward_hooks = [], []

<ipython-input-23-b5be2cb63886> in forward(self, x_cat, x_cont, encode)
     28             x_cat = self.noise(x_cat)
     29             x_cont = self.noise(x_cont)
---> 30         encoded = super().forward(x_cat, x_cont)
     31         if encode: return encoded # return the representation
     32         decoded_trunk = self.decoder(encoded)

/usr/local/lib/python3.7/dist-packages/fastai/tabular/model.py in forward(self, x_cat, x_cont)
     47     def forward(self, x_cat, x_cont=None):
     48         if self.n_emb != 0:
---> 49             x = [e(x_cat[:,i]) for i,e in enumerate(self.embeds)]
     50             x = torch.cat(x, 1)
     51             x = self.emb_drop(x)

/usr/local/lib/python3.7/dist-packages/fastai/tabular/model.py in <listcomp>(.0)
     47     def forward(self, x_cat, x_cont=None):
     48         if self.n_emb != 0:
---> 49             x = [e(x_cat[:,i]) for i,e in enumerate(self.embeds)]
     50             x = torch.cat(x, 1)
     51             x = self.emb_drop(x)

/usr/local/lib/python3.7/dist-packages/torch/nn/modules/module.py in _call_impl(self, *input, **kwargs)
   1049         if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
   1050                 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1051             return forward_call(*input, **kwargs)
   1052         # Do not call functions when jit is used
   1053         full_backward_hooks, non_full_backward_hooks = [], []

/usr/local/lib/python3.7/dist-packages/torch/nn/modules/sparse.py in forward(self, input)
    158         return F.embedding(
    159             input, self.weight, self.padding_idx, self.max_norm,
--> 160             self.norm_type, self.scale_grad_by_freq, self.sparse)
    161 
    162     def extra_repr(self) -> str:

/usr/local/lib/python3.7/dist-packages/torch/nn/functional.py in embedding(input, weight, padding_idx, max_norm, norm_type, scale_grad_by_freq, sparse)
   2041         # remove once script supports set_grad_enabled
   2042         _no_grad_embedding_renorm_(weight, input, max_norm, norm_type)
-> 2043     return torch.embedding(weight, input, padding_idx, scale_grad_by_freq, sparse)
   2044 
   2045 

RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cpu! (when checking arugment for argument index in method wrapper_index_select)
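The traceback shows the embedding weights on `cuda:0` while the index tensor is still on the CPU: the loop moves `learn.model` to the GPU but never moves the batches coming out of `dl`. A minimal sketch of the corrected loop follows; a toy `nn.Embedding` and hand-built CPU batches stand in for the notebook's `learn.model` and `dl` so the sketch runs on its own.

```python
import numpy as np
import torch
import torch.nn as nn

# Pick whatever device the model will live on.
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

model = nn.Embedding(10, 4).to(device)               # stand-in for learn.model
dl = [(torch.tensor([1, 2, 3]),) for _ in range(2)]  # stand-in CPU batches

model.eval()
outs = []
with torch.no_grad():
    for batch in dl:
        # Move every input tensor to the model's device before the forward
        # pass; this is the step the notebook's loop is missing.
        batch = [b.to(device) for b in batch]
        outs.append(model(*batch).cpu().numpy())
outs = np.concatenate(outs)
print(outs.shape)  # (6, 4)
```

In the notebook itself the equivalent change would be `batch = [b.to(device) for b in batch[:2]]` before calling `learn.model(*batch, encode=True)`; alternatively, keeping the model on `learn.dls.device` and building `dl` from `learn.dls` avoids the mismatch entirely.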
sakibsadmanshajib commented 3 years ago

@muellerzr I'd really appreciate the help here.