Closed: binga closed this issue 4 years ago
predict is not intended to be used on a DataLoader, but on one row of a dataframe. Did you mean to use learn.get_preds(dl=test_dl)?
Ouch. My bad.
Yes, I'd like to get the predictions for all the test rows, and when I switch to get_preds, I encounter the following error on this gist - https://gist.github.com/binga/2fabdc620a3a97b9d218b366df41b94c
You are not building your DataLoaders with the fastai methods; this is why it fails. At the very least, use TabularDataLoaders instead of DataLoaders, which should help, but I'm not 100% confident it will work since you are not using TabularPandas.dataloaders.
Oh yes. When I wrap my dataloaders with TabularDataLoaders, it works!!
I was trying to have different batch sizes in train and test, and hence I tried the following:
trn_dl = TabDataLoader(to.train, bs=256, shuffle=True, drop_last=True)
val_dl = TabDataLoader(to.valid, bs=512)
dls = DataLoaders(trn_dl, val_dl)
Changing this to
trn_dl = TabDataLoader(to.train, bs=256, shuffle=True, drop_last=True)
val_dl = TabDataLoader(to.valid, bs=512)
dls = TabularDataLoaders(trn_dl, val_dl)
fixes the problem.
This works perfectly fine in CPU mode. However, when I add device='cuda' in TabularPandas, the model fails when I call learn.summary() or learn.lr_find() using your suggestion.
I can paste the gist in a few min.
Edit: Updated gist is here - https://gist.github.com/binga/2fabdc620a3a97b9d218b366df41b94c
It works if I do learn.model.to('cuda'). Looks like the user has to explicitly put the model on the GPU. Am I missing anything here?
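Setting fastai aside, the workaround follows the general PyTorch rule that a model's parameters must live on the same device as the batches fed to it. A minimal sketch of that pattern (net here is a stand-in linear layer, not the tabular model from the gist):

```python
# Sketch of the device-placement pattern behind learn.model.to('cuda'):
# move the model's parameters to the target device, then create (or move)
# batches on that same device before the forward pass.
import torch
import torch.nn as nn

# Fall back to CPU when no GPU is available, so the sketch runs anywhere.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

net = nn.Linear(4, 2)
net.to(device)  # equivalent in spirit to learn.model.to('cuda')

x = torch.randn(8, 4, device=device)  # batch created on the same device
out = net(x)  # works because parameters and input share a device
```

If the model stays on the CPU while the batches are on the GPU (or vice versa), PyTorch raises a device-mismatch RuntimeError, which is why summary and lr_find fail until the model is moved.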
Please use the forum for debugging your code; we keep the issues for bugs only, and the wider community is there, not here.
I am trying the Tabular learner, and during the predict phase, even if I input the same training data, the learn.predict function throws an error. The error trace gist is here: https://gist.github.com/binga/2fabdc620a3a97b9d218b366df41b94c.
Am I missing any params or configuration settings?
P.S: This is on a Win10 machine.