Closed: amine759 closed this 4 months ago
So the problem is not your `X` but your `y`, as you can see here: `y_true = to_tensor(y_true, device=self.device)`. I assume that your `y` contains string labels.
Without digging too deep, I think the problem is an assumption in your custom transformer. You think you can transform the `y` in there too, but transformers are only passed `X`. If possible, I would recommend label-encoding your `y` before passing it to the pipeline. If this doesn't work, you could subclass `Pipeline` and make it label-encode your `y`, as the pipeline sees both `X` and `y`.
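For illustration, a minimal sketch of the second suggestion, assuming a plain sklearn `Pipeline`: a subclass whose `fit` label-encodes `y` before delegating to the parent. The class name `LabelEncodingPipeline` and the `label_encoder_` attribute are illustrative, not part of sklearn or skorch.

```python
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import LabelEncoder


class LabelEncodingPipeline(Pipeline):
    """Pipeline that turns string labels into integers before fitting."""

    def fit(self, X, y=None, **fit_params):
        if y is not None:
            # Encode string labels (e.g. "spam"/"ham") to 0..n_classes-1.
            self.label_encoder_ = LabelEncoder()
            y = self.label_encoder_.fit_transform(y)
        return super().fit(X, y, **fit_params)
```

The simpler alternative mentioned above is to apply a `LabelEncoder` to `y` yourself before calling `pipeline.fit(X, y)`.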
@BenjaminBossan Hi, I actually had to subclass `Pipeline`. Thanks for the help!
I'm trying to wrap a PyTorch model that expects fasttext embeddings. I want the wrapped model to compute the embeddings internally, for a future purpose. Here is my code:
I get the following error. Although I embed my X data, some internal code works on the raw X data rather than on the embeddings returned by the `transform` method I have overridden. What am I missing here? Any help would be highly appreciated.
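For context, here is a rough, hypothetical sketch of the kind of setup described above (the original code and error are not reproduced here): a custom transformer that maps raw text to fasttext embeddings, feeding a skorch-wrapped PyTorch module inside a pipeline. All names (`FastTextEmbedder`, `TextClassifier`, `ft_model`) are illustrative, not the poster's actual code.

```python
import numpy as np
import torch
from sklearn.base import BaseEstimator, TransformerMixin
from sklearn.pipeline import Pipeline
from skorch import NeuralNetClassifier


class FastTextEmbedder(BaseEstimator, TransformerMixin):
    """Turns raw texts into fasttext sentence vectors (model passed in)."""

    def __init__(self, ft_model):
        self.ft_model = ft_model

    def fit(self, X, y=None):
        # Transformers only ever receive X here; y is never transformed.
        return self

    def transform(self, X):
        return np.stack(
            [self.ft_model.get_sentence_vector(text) for text in X]
        ).astype(np.float32)


class TextClassifier(torch.nn.Module):
    def __init__(self, embed_dim=300, n_classes=2):
        super().__init__()
        self.fc = torch.nn.Linear(embed_dim, n_classes)

    def forward(self, X):
        return torch.softmax(self.fc(X), dim=-1)


# pipe = Pipeline([
#     ("embed", FastTextEmbedder(ft_model)),         # ft_model: a loaded fasttext model
#     ("net", NeuralNetClassifier(TextClassifier)),
# ])
# pipe.fit(texts, labels)  # fails if labels are strings; label-encode them first
```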