agoel00 closed this issue 5 years ago.
Did you manage to solve this?
I was able to bypass this issue by downgrading the PyTorch version in the Google Colab environment. Apparently it doesn't work with PyTorch 1.1, which Colab currently provides.
Downgrading to 1.0.1 seems to solve the problem.
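For reference, the downgrade in a Colab cell might look like the following (a minimal sketch; the exact version spec pip resolves can vary, and you may need to restart the runtime afterwards):

```
!pip install torch==1.0.1
```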
I'm trying to get 2304-dimensional fixed-length vector embeddings for a set of tweets in my dataset.
I'm using Google Colaboratory (Python 3), and I get this error with this command:

```
!python3 examples/text_emojize.py --text f"This is the shit!"
```

as well as when using this:

```python
encoding = model(tokenized)
```
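Assuming this thread concerns the torchMoji repo (the usual home of `examples/text_emojize.py` and of 2304-dimensional encodings), that last line is roughly what the repo's `examples/encode_texts.py` does. A minimal sketch of the full pipeline, with `sentences` as a placeholder for your tweets and `maxlen` taken from the repo's example, would be:

```python
import json

from torchmoji.sentence_tokenizer import SentenceTokenizer
from torchmoji.model_def import torchmoji_feature_encoding
from torchmoji.global_variables import PRETRAINED_PATH, VOCAB_PATH

sentences = ["I love mom's cooking", "This is the shit!"]  # placeholder tweets
maxlen = 30  # max tokens per sentence, as in the repo's example

# Tokenize with the pretrained vocabulary shipped with the repo
with open(VOCAB_PATH, "r") as f:
    vocabulary = json.load(f)
st = SentenceTokenizer(vocabulary, maxlen)
tokenized, _, _ = st.tokenize_sentences(sentences)

# Load the model in feature-encoding mode and compute the embeddings
model = torchmoji_feature_encoding(PRETRAINED_PATH)
encoding = model(tokenized)
print(encoding.shape)  # expected: (len(sentences), 2304)
```

If the PyTorch-version incompatibility above is the root cause, this snippet would presumably fail the same way on PyTorch 1.1 and work after downgrading to 1.0.1.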